The Future of AI on College Campuses
By: Jordan Mayers Mandley
Artificial intelligence has been around since the 1950s and has quietly worked its way into everyday life, but for decades it generated little public buzz. Almost a year ago, that changed. On November 30, 2022, OpenAI launched ChatGPT; the same company had released DALL-E 2, an AI art generator, in April 2022. ChatGPT can generate essays, proofread papers, summarize books, write scripts, and more, while DALL-E 2 produces realistic artwork from textual descriptions. These new AI tools sparked nationwide attention in the US and set off a series of discussions about AI, where it is taking us, and the effects it may have on our society. Naturally, there has been debate over how colleges should handle AI moving forward. That conversation is still developing, but some colleges are already putting measures in place. Here are three ways that colleges are managing the rise of AI.
Some colleges have decided to implement AI into their course curricula. According to “The Future of Higher Education – The Rise of AI and ChatGPT on Your Campus,” “University of Rochester: Jonathan Herington, an assistant professor in the Department of Philosophy, used ChatGPT as part of an assignment this semester. He asked students to co-write an essay with the chatbot on a question that would challenge the technology’s capabilities, such as citations from obscure texts or knowledge of readings published after 2020.” Similarly, Columbia University offers a virtual AI boot camp that helps students build the skills to use AI effectively.
Other colleges are using policies to regulate AI. The American University of Armenia added a note about AI to its cheating policy. According to “How colleges and universities are responding to AI now,” the policy reads: “Cheating. Cheating includes but is not limited to: 6.4.2.1. using or referring to notes, books, devices, or other sources of information, including advanced Artificial Intelligence (AI) tools, such as ChatGPT, in completing an Academic Evaluation or Assignment, when such use has not been expressly allowed by the faculty member who is conducting the examination.” Hofstra University in Long Island says something similar in its policy: “Use of generative artificial intelligence tools (e.g. Chat GPT) must be consistent with the instructor’s stated course policy. Unless indicated otherwise in the instructions for a specific assignment, the use of Chat GPT or similar artificial intelligence tools for work submitted in this course constitutes the receiving of ‘unauthorized assistance for academic work,’ and is a violation of the Hofstra University Honor Code.” Colleges are trying to maintain academic integrity as the technology advances around them.
Still other colleges are helping their faculty learn how to use AI effectively in the classroom. Colorado State University has developed a website for this purpose. According to “The Future of Higher Education – The Rise of AI and ChatGPT on Your Campus,” “The Institute for Learning and Teaching (TILT) has developed a new website called Artificial Intelligence and Academic Integrity. Academic Integrity Program Director Joseph Brown and TILT staff developed it to provide faculty with short-term strategies for ChatGPT, as they continue to monitor the availability of technology-based solutions already in development.” Auburn University has also taken the initiative to educate its faculty on the inner workings of AI. According to “What Will Determine AI’s Impact on College Teaching? 5 Signs to Watch,” “About 600 faculty members at Auburn University signed up for a self-paced course created by its teaching center that covered the basics of teaching with AI, and that included a discussion of course redesign and how to partner with students.”
Colleges are still developing their responses to AI. According to UNESCO, “A new UNESCO global survey of over 450 schools and universities found that fewer than 10% have developed institutional policies and/or formal guidance concerning the use of generative AI applications.” Many discussions and meetings remain before every university has its own plan for AI. Think about AI on your own campus. How do you want AI to be used here at Stony Brook? Do you think AI needs to be regulated, embraced, or both in higher education?