Drexel University Professors Reflect on Potential of AI

Though technology is often prophesied to be the death of learning, some Drexel professors are largely optimistic about the ways artificial intelligence can change education.
This image was created by AI using the DALL·E service, with the prompt "Drexel University Mario the Dragon doing homework on a computer."

With the public debut of ChatGPT, one of the world’s most advanced machine learning models, artificial intelligence (AI) has been a hot topic over the last several months, and experts in the field have been keeping a close eye on its growth. Drexel University has its own experts in artificial intelligence, including professors who work in law, music and sociology.

Vice Provost for Undergraduate Curriculum and Education Steven Weber, PhD, is leading a working group, composed mostly of faculty members, that is developing guidelines for how faculty should approach AI in the classroom and what everyone in the University, especially faculty, should be aware of. The goal is for that group to give a presentation to the Office of the Provost by late March or early April.

“The charges of this group are three things,” Weber said. “One is awareness, meaning you should know that ChatGPT exists and what it is and isn’t. Second is pedagogy, meaning insight as to how faculty may leverage it in their courses to advance student learning and awareness that some students will turn to it for help on assignments. Third is opportunity and innovation: this technology represents an incredible opportunity for Drexel to differentiate itself through both our educational and research mission.”

Whatever the working group comes up with won’t be the final word from Drexel about AI and its impact on the educational and research world of the University, but it’s an important first step in getting an idea of what AI can bring. Several of Drexel’s experts, not all of whom are in the working group, shared their thoughts on how ChatGPT and its ilk may change the face of education and how educators may respond to it.

How do you see the proliferation of AI as a continuation of integrating technology into education?

Steven Weber, PhD, Vice Provost for Undergraduate Curriculum and Education

There is an unfortunate historical context with the persistent and regular declaration of the imminent death of learning due to technology. Too often, a technological innovation is accompanied by claims that this is going to kill the learning process. Invariably, what we find is that the technology allows for creative teachers to leverage the technology for learning enhancement. When personal calculators became available, it was predicted that their use would impede a student’s ability to learn math. When the personal computer came out, there was a fear that the use of computers in the classroom would be an obstacle to learning.

We can coexist with technology and in fact leverage technology to do greater things than we would be able to do in the absence of that technology. It just requires some thoughtful pivoting by our faculty to alter the way in which we present our material to our students, how we assess their knowledge of that material, and how we leverage these technological frameworks for them to use in those assignments.

What are potential advantages and drawbacks of the growth in AI?

Youngmoo Kim, PhD, professor of electrical and computer engineering in the College of Engineering and Director of the Expressive and Creative Interactive Technologies (ExCITe) Center

The level of what the machine can do keeps rising. It’s our job to know where that bar is and then make sure our students are above that. That’s the whole point of higher education: that we’re training expertise and the ability to deal with subtleties or the nuances of a particular field or technology.

One of my fears with this AI gold rush is that you get a lot of people who see it through a technological lens or who are in it for the money. They lack that broader perspective of thinking about how something might be used or misused.

The biggest concern I have about ChatGPT is not actually its use or application, but the fact that it’s run by a company that, despite the name OpenAI, is the exact opposite of "open." This is not a university lab; this is a ginormous company with billions of dollars and when they do publish their procedures, it’s with resources no mere mortal has access to. My biggest fear is that it’s the large companies that are now the gatekeepers and they’re all in this game with more resources than any academic or even consortium of academics can create. I’m not of the mindset that companies are bad, but it’s still not great to have this thing only available if you work for one of those companies.

How could AI change education?

Weber: At Drexel, we are trying to teach our students calculus, but we also expect they will use a calculator because that is why the tool is there. By analogy, modern AI tools such as ChatGPT may be thought of as a tool to accelerate learning. If a 10-week class was able to get from A to B, then we might hope that the thoughtful use of modern AI tools such as ChatGPT will allow that class to get from A to C.

AI can help instructors keep the focus of the learning on the objectives that are most relevant to the discipline, rather than having students repeatedly spend time on things they have already learned. Students still need to learn how to do those things, but they shouldn’t have to keep putting effort into those more basic tasks. In that way, instructors can hopefully include more advanced topics in their courses.

How could the proliferation of AI challenge educators?

Becka Rich, assistant dean of the law library and technology services, supervisor of law school IT, assistant teaching professor of artificial intelligence and law

There are so many generative AI tools out there that it’s difficult, as a faculty member, to keep track of every tool that a student could use. Our usual tools for detecting academic misconduct are trying to catch up, but no one is sure exactly which direction generative AI is going to evolve in. There are many AI-detection tools out there, but none of them are currently at the point where I feel confident in their abilities. AI’s rapid evolution also presents significant challenges to course content and development, as well as to Drexel’s cybersecurity and legal teams.

While I think the fear of misconduct is completely reasonable and understandable, I personally have decided that I’d rather put my time into making my courses more relevant, engaging and interactive than into trying to win an arms race with AI. Many of the suggestions given to faculty to prevent AI-based cheating have the potential to be particularly harmful to our students with disabilities, because they make it hard to give students some of the most common classroom accommodations. I encourage my colleagues to take equity into account as they consider pedagogy changes.

Lastly, Drexel is known for experiential learning that prepares students for the workplace and, in many professions, using some form of AI effectively is a necessary element of success. There are even job openings for AI prompt engineers! In both of my fields, AI has become a fundamental part of conducting research. Many lawyers are using AI tools in practice to assist with contract drafting and brief writing. I think we’d be doing our students a disservice if we avoid teaching them about essential tools because we’re worried they might engage in academic misconduct.

How should AI be used for and by students?

Hualou Liang, PhD, professor of biomedical engineering in the School of Biomedical Engineering, Science and Health Systems, whose research focuses on machine learning and neuroengineering

AI has the potential to revolutionize the way students learn, and there are many ways in which it can be used by and for students to enhance their educational experiences. This can be done by, for example, (1) creating customized learning experiences that meet individual students’ needs; (2) creating interactive learning experiences that help students engage with and understand complex concepts; (3) providing instant feedback and grading that allow students to identify areas for improvement and learn at their own pace; and (4) supporting research, like our recent work that uses the very technologies behind ChatGPT for early diagnosis of dementia.

Source: Agbavor, F., & Liang, H. (2022). Predicting dementia from spontaneous speech using large language models. PLOS Digital Health.