Artificial Intelligence Comes to Fisher

April 3, 2024

Image generated by AI with the prompt, "What does AI in higher education look like?"

By Katie Sabourin, assistant vice president for digital learning

The field of artificial intelligence (AI) has been around for decades, with some of the early work dating back to the 1950s. Largely the stuff of sci-fi movies and TV shows throughout the late 20th century, AI now is embedded in our daily lives in things like Amazon and Netflix recommendations, face ID on cell phones, and banking alerts based on prior purchases, just to name a few examples.

This field broadly uses computer science and data analysis to look for patterns in large data sets and make predictions about future actions. A subset of the field is generative AI, which applies the same strategies to enormous collections of text, largely scraped from the web, to create what is known as a large language model. The large language model is then used to predict the most likely next word or phrase, which allows it to generate its own response to a question or instruction from the user, called a prompt. This specific type of AI took the world by storm in November 2022 when OpenAI, an AI research and development company, released ChatGPT, built on its GPT-3.5 model.
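
To make the "predict the next word" idea more concrete, the sketch below builds a toy next-word predictor in Python. It simply counts which word tends to follow which in a tiny invented corpus, a drastic simplification of the neural networks and web-scale training data behind real large language models; the corpus and wording here are purely illustrative.

    from collections import Counter, defaultdict

    # Tiny invented corpus standing in for the web-scale text used to train
    # real large language models (illustrative only).
    corpus = (
        "students use generative ai tools to study . "
        "students use practice quizzes to study . "
        "faculty use generative ai tools to teach ."
    ).split()

    # Count how often each word follows each other word (a simple bigram model).
    next_word_counts = defaultdict(Counter)
    for current, following in zip(corpus, corpus[1:]):
        next_word_counts[current][following] += 1

    def predict_next(word):
        """Return the word most often seen after `word` in the corpus."""
        counts = next_word_counts[word]
        return counts.most_common(1)[0][0] if counts else "."

    # Generate a short "response" by repeatedly predicting the next word.
    word, generated = "students", ["students"]
    for _ in range(6):
        word = predict_next(word)
        generated.append(word)
    print(" ".join(generated))  # -> students use generative ai tools to study

A real model works with probabilities over tens of thousands of word fragments and billions of learned parameters, but the core loop is the same: given everything written so far, pick a likely next piece and repeat.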

Since then, the capabilities of this technology have advanced rapidly, and hundreds of companies have created their own generative AI tools and language models. Among them is Google, which released Bard and, more recently, Duet AI and Gemini. OpenAI has received significant funding from Microsoft, which has built generative AI into its own products, most notably Bing Chat and Microsoft Copilot.

The ability of generative AI tools to compose text, often long prose, that sounds as if it were written by a real person has created both remarkable opportunities and serious ethical and privacy concerns for many industries. Generative AI tools are not 100 percent accurate in the text they produce; when a tool confidently generates false or fabricated information, the result is known in the AI world as a hallucination. Generative AI can also produce biased results, reflecting the biases inherent in the large language models, and the data behind them, that serve as its source. In addition, information entered into these tools may be used to further train the models and can become public, which may compromise data privacy and violate copyright agreements. A significant challenge for education is the possibility that students may submit text produced by a generative AI tool as if it were their own original writing, a whole new type of plagiarism that may compromise learning.

Despite these important concerns, the potential of generative AI tools to transform our daily lives is undeniable. These technologies can complete many tasks far faster than any individual could, allowing people to focus on higher-level work and novel problem-solving. In many industries, generative AI will not replace workers but will instead become a tool that augments their efforts, helping them get more done in the same amount of time.

The same is true in higher education. One of the most promising uses of generative AI tools in the classroom is as a personal tutoring assistant. Students may use these tools to talk through a reading assignment, better understand the vocabulary and concepts it includes, or hear a topic explained in a different way that resonates with them.

Another possibility is generating a practice quiz to help prepare for an exam. These types of uses enhance and augment the student's learning experience above and beyond traditional course materials, support services, and instructor interactions. An added benefit is that generative AI is available whenever and wherever students are when these questions arise.
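
For readers curious what such a request looks like outside of a chat window, here is a minimal sketch of sending a practice-quiz prompt to a generative AI service programmatically. It assumes the openai Python package is installed and an API key is stored in the OPENAI_API_KEY environment variable; the model name and prompt wording are illustrative examples, not University recommendations.

    from openai import OpenAI

    # Assumes OPENAI_API_KEY is set in the environment; the model and prompt
    # below are illustrative examples only.
    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {
                "role": "user",
                "content": (
                    "Create a five-question multiple-choice practice quiz on "
                    "photosynthesis, and include an answer key at the end."
                ),
            }
        ],
    )

    print(response.choices[0].message.content)

The same prompt typed into a chat interface produces an equivalent result; the point is simply that the quiz is generated on demand, whenever and wherever the student needs it.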

It is essential that educational institutions prepare students with the skills to use these emerging technologies safely, ethically, and strategically, so that graduates are able to compete in a world where these tools are ubiquitous.

What Are We Doing at Fisher with Generative AI?

Conversations about and exploration of generative AI tools at Fisher began early in the Spring 2023 semester. The School of Arts and Sciences and the DePeters Family Center for Innovation and Teaching Excellence each held informational workshops and exploratory sessions that brought the campus community together to discuss this newly emerging technology. As discussions evolved, it became clear that this technology would have a significant impact on higher education and would require more focused conversations to prepare for the 2023-24 academic year.

That summer, the DePeters Family Center staff put out a call to interested faculty and staff to join a working group to engage in a deeper conversation about generative AI. More than 60 respondents met with the goal of crafting recommendations for an institutional response to help guide faculty, staff, and students in the coming year.

The summer working group focused its energy on two main policies on campus: a revision to the academic integrity policy and the creation of language for faculty to use on their course syllabi. After reviewing the current academic integrity policy, participants noted that only minor adjustments were needed, as the policy already covered acts of plagiarism. The creation of syllabus language was a more substantial undertaking. The group developed a broad statement about generative AI and the responsibilities all students must abide by if they use these tools in their coursework. Each faculty member could then decide whether the use of generative AI tools in their course would be prohibited, permitted, or required, and faculty were encouraged to provide specific guidance on the use of generative AI unique to their course. This language was added to Fall 2023 course syllabi across campus.

Creation of the Generative AI Toolkit

It was the belief of the working group that banning the use of such tools would be a futile effort with potentially negative repercussions. The group concluded that failing to experiment with these tools, fully understand their capabilities, and set guidelines for their ethical use would be a disservice to students and the greater campus community, and would not allow Fisher to stay competitive in a quickly changing world. The working group therefore spent a great deal of time and energy crafting the Generative AI Toolkit, which neither endorses nor discourages the use of these technologies on campus, but instead provides guidance and best practices for those who choose to use them.

Image generated by AI with the prompt, "Create an image that represents an AI Toolkit for faculty and students."

The goal of the toolkit is to serve as a trusted resource for the campus community: a place to learn what these technologies are all about, the terms commonly used in the field, how to stay safe when using these tools, and how to write effective prompts to get the most out of them. A dedicated page was set up for each audience (faculty, staff, and students), providing specific guidance on the ways each may use these technologies in their roles. The toolkit also includes an AI Micro-Blog, where DePeters Family Center staff post recent news and updates related to the development of generative AI technologies, intended as a quick reference to help readers keep up to date in an area of technology that is evolving so quickly.

Ongoing Work

While the revision and creation of campus policies, as well as the support provided by the Generative AI Toolkit, have served our campus community well thus far, more work is needed to realize the full potential of generative AI tools at Fisher.

Image generated by AI with the prompt, "What does AI in higher education look like?"

Enter the Generative AI Advisory Board, a group of more than 25 faculty, staff, and students from across the University. The advisory board's mission is to engage stakeholders in discussions on the use of generative AI tools on campus; make recommendations on policies affected by generative AI; evaluate the rapidly changing landscape of generative AI technologies and recommend tools that support Fisher's mission and strategic plan; and provide professional development opportunities to the campus community on the ethical use of these platforms.

Bill Gates has said, “The development of AI is as fundamental as the creation of the microprocessor, the personal computer, the Internet, and the mobile phone. It will change the way people work, learn, travel, get health care, and communicate with each other.” The impact of generative AI on St. John Fisher University is still unfolding, but it will undoubtedly have far-reaching implications across our faculty, staff, and students over the months and years to come. 

About the Author

Katie Sabourin is assistant vice president for digital learning at Fisher. She provides vision and leadership for digital learning and academic innovation that strategically differentiates Fisher from competitors both locally and nationally. She oversees the DePeters Family Center for Innovation and Teaching Excellence, which provides the Fisher community with collaborative leadership for the design and delivery of innovative teaching and learning activities across campus. The Center supports the University in three main areas: the effective use of educational technologies across all forms of teaching and learning; pedagogical consultation on innovative and inclusive teaching practices; and support for the design and delivery of learning in enhanced formats, including online and hybrid courses, and for alternative audiences both on and off campus.