How AI Is Changing Higher Education For The Better…Or Worse
In 2002, the Steven Spielberg film “Minority Report,” starring Tom Cruise, debuted. Its premise: in the year 2054, a PreCrime police unit is empowered to apprehend criminals BEFORE they commit their crimes, relying on computers linked to psychics whose precognition predicts the future. It was a scary example of artificial intelligence interacting with humanity and picking winners and losers. Now, coming soon to a campus near you: artificial intelligence, sans psychics, will start picking winners and losers in the higher education areas of recruitment, admissions and grading.
Surely there are benefits to the use of artificial intelligence technology, and we’ve already seen some of them for decades. We’ve all had to carefully fill in little circles and squares so that a machine could read and grade our tests. Enrolling in classes online is standard. But should guardrails be installed on the fast track to AI dominance in higher education? And is there a way to balance the preferences of the new generation of digital-native students without setting up arguably arbitrary processes that reward some students and faculty and penalize others in ways that are dramatic and life-changing?
The Netflix Model
A survey conducted by Sonic Foundry, Inc., a creator of management solutions, and University Business magazine (UB) concluded that these same digital natives still want a traditional college experience but want to combine it with the technologies of a fast-paced world, including social media connections, streaming and instant access to relevant information. What the survey results suggest is using a “Netflix model” to personalize learning and improve student outcomes. And how does this Netflix model personalize the student experience? Sixty-six percent of education leaders surveyed are in favor of leveraging student data, from what students are viewing to the courses they take, the financial aid they receive and the extracurricular activities they participate in, to personalize learning. Forty-four percent would consider using AI to steer students toward certain videos and information based on their interests.
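To make the idea concrete, here is a minimal, hypothetical sketch in Python of the kind of interest-matching the survey respondents describe. The catalog, tags and scoring below are invented for illustration; they are not drawn from Sonic Foundry’s products or from the survey itself.

```python
# A toy sketch of the "Netflix model" described above: score campus videos
# against a student's viewing history and surface the closest matches.
# All names and data here are hypothetical, not from any vendor's product.
from collections import Counter

# Hypothetical catalog: each item is tagged with topics.
CATALOG = {
    "Intro to Data Visualization":  {"statistics", "design"},
    "Organic Chemistry Review":     {"chemistry", "pre-med"},
    "Financial Aid Workshop":       {"financial-aid", "advising"},
    "Campus Jazz Ensemble Recital": {"music", "arts"},
}

def recommend(viewing_history: list[set[str]], top_n: int = 2) -> list[str]:
    """Rank catalog items by overlap with topics the student already watches."""
    interest = Counter(tag for tags in viewing_history for tag in tags)
    scored = {
        title: sum(interest[tag] for tag in tags)
        for title, tags in CATALOG.items()
    }
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

# A student who mostly watches statistics and advising content gets steered
# toward more of the same -- the "filter bubble" risk discussed later on.
history = [{"statistics"}, {"statistics", "design"}, {"advising"}]
print(recommend(history))
```

The same mechanism that keeps recommendations relevant is also what narrows them, a tension this article returns to below.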
“Using AI to enhance education and personalize information flow has enormous potential. Schools can learn a lot from the Netflix model of learning. The more you use Netflix, the smarter it gets about personal preferences, making informed decisions about what you should watch. The future of learning will consider student preferences like how and when they want to learn and on what device,” said Rob Lipps, executive vice president, Sonic Foundry. “We’re seeing faculty and students watch videos more than 35 million times per year. Paired with artificial intelligence applications like IBM Watson speech-to-text, institutions have more accessible videos and deeper insights than ever before about student data.”
If that seems a bit Orwellian, consider that Big Brother has already arrived at your supermarket, big retailers and pharmacies. That little courtesy card on your keychain that gives you points and discounts also builds a record of your preferences, buying habits, health status and hobbies. The more you buy, the more they know about you.
“The possibilities for how technology can personalize learning are endless,” said Kurt Eisele-Dyrli, research editor for UB. “This survey showed that higher education leaders see a lot of potential specifically in the combination of AI and academic video to create personalized education on a new level. It’s going to be fascinating to see how colleges and universities will use these tools in the future.”
Benign Vs. The Slippery Slope
The application of AI to enhance learning outcomes seems mostly benign, but there are a few caveats to observe. Directing students to things they already relate to or like can stunt their desire and ability to embrace new ideas or points of view. This is especially troubling in a society that is increasingly tribal, one that reinforces beliefs already held without question or reservation. There is also the temptation for cash-strapped institutions to commercialize student information in ways that conflict with privacy norms. Finally, will opening the door to AI on campus become a slippery slope to less benign practices?
In December of 2018, Learning House, a Wiley online program manager, released Artificial Intelligence in Higher Education, a research brief summarizing AI developments in higher education. The paper explains how universities might use AI at their institutions in the future and includes recommendations on how to prepare for potential changes that could transform standard processes.
The brief assesses current, future and in-development opportunities for AI in four areas:
• student acquisition
• learning and instruction
• student affairs
• institutional efficiency
Across all four sectors, the authors expect that AI will have a positive influence on higher education by improving outcomes and helping institutions scale quality education for their students. But are there serious implications associated with these changes?
In a 2018 article for EdSurge entitled “Can AI Help Students—and Colleges—Determine the Best Fit?” author Tina Nazerian reports that AI is being developed to identify and single out the pool of students who have the best chance of success at a particular institution in terms of grades and completion rates. Aside from reducing a complex decision to a clinical choice, the problem is that it could also push students to choose one school over another for arbitrary reasons that don’t take intangible variables into account.
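As a rough illustration of what such “best fit” scoring might look like, consider the hypothetical sketch below; the features, weights and applicants are invented for this article and do not reflect any vendor’s actual model.

```python
# A hypothetical sketch of the "best fit" scoring EdSurge describes: rank
# applicants by an estimate of their completion odds. The features, weights
# and applicants below are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    hs_gpa: float         # high school GPA on a 4.0 scale
    credits_attempted: int
    campus_visits: int    # one crude proxy for engagement

# Invented weights standing in for whatever a trained model would learn
# from historical grade and completion data.
WEIGHTS = {"hs_gpa": 0.5, "credits_attempted": 0.01, "campus_visits": 0.1}

def completion_score(a: Applicant) -> float:
    """Crude linear score; a real system would train on past outcomes."""
    return (WEIGHTS["hs_gpa"] * a.hs_gpa
            + WEIGHTS["credits_attempted"] * a.credits_attempted
            + WEIGHTS["campus_visits"] * a.campus_visits)

applicants = [
    Applicant("A", hs_gpa=3.8, credits_attempted=12, campus_visits=2),
    Applicant("B", hs_gpa=3.1, credits_attempted=15, campus_visits=0),
]
# Note what the score cannot see: the "intangible variables" the article
# warns about, such as motivation, family obligations or a late growth spurt.
for a in sorted(applicants, key=completion_score, reverse=True):
    print(a.name, round(completion_score(a), 2))
```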
Nazerian also noted that Curtis Patrick, a senior architect at Ellucian, a company that builds software for higher education institutions, believes that students choose higher education institutions for many reasons, including as a result of social media activity. In this case, he says, AI could be used to create a dossier of sorts. She quotes him as concluding, “Maybe, maybe, maybe machine learning could come back and say look, we’ve identified these things and we think this school is a better fit for these reasons.”
AI And Admissions
It’s not a huge leap to get to a place where AI is deciding who should be doctors or lawyers or who shouldn’t pursue professional careers at all. And since AI makes suggestions that mirror students’ current circumstances and preferences, it downplays, and flies in the face of, the proactive efforts institutions routinely make to bring diversity and variety to campus life. And that potential diversity gap goes both ways. Kasey Urquidez, dean of undergraduate admissions at the University of Arizona, explained to EdSurge that it’s possible AI could be used to identify the kinds of students best suited for admission to a particular school, but schools aren’t using AI for admission selections. However, she thinks AI will surely be an integral part of the admission process in the future.
The delay is due, in part, to the regulations and policies each school has in place as part of its admission process. If a school is committed to accepting applicants in proportion to the demographics of its area, for example, that complicates the role of AI in choosing the “best fit” for that school. Paul Hays, a consultant who helps students with the college application process, summed it up for EdSurge: “Colleges are for the most part looking for the best class, not necessarily the best person.”
AI Faculty
As the Learning House survey points out, AI can help instructors grade and supply struggling students with the resources they need to succeed. In the future, this could free up faculty members to oversee large classes while still engaging with students on a deeper level. But could there be a flipside to this? “Freeing up” professors could be code for more AI teaching assistants and downsizing faculty. Furthermore, many experts agree that taking the human factor out of grading is not yet practical.
But what does seem quite practical is the use of AI to create virtual teaching assistants for technology departments and professors. In “Pushing the Boundaries of Learning With AI” for Inside Digital Learning, author Lindsay McKenzie interviewed Ashok Goel, professor of computer and cognitive science at the Georgia Institute of Technology. Professor Goel has considerable experience working with virtual teaching assistants; his AI-powered assistant, Jill Watson, was built on IBM’s Watson platform. However, Goel told McKenzie that his team wasn’t pursuing AI grading. That hasn’t stopped companies from trying to develop and use grading technologies. McKenzie also interviewed Isaac Chuang, professor of physics and senior associate dean of digital learning at the Massachusetts Institute of Technology, who sees improvement in this technology but doesn’t think it’s ready for prime time yet. It’s a matter of not being able to process the intangibles: AI can’t yet help professors understand the thought patterns in students’ essays well enough to fairly evaluate how thoroughly they comprehend and have mastered the subject matter.
On the other hand, can AI deliver personalized degree planning and intervene with struggling students? It is possible, as the Learning House survey reveals, that “In the future, AI could anticipate students’ academic needs based on predictive data and past performance, and then proactively supply appropriate resources, such as additional tutoring or advising.” As attractive as this seems, relying solely on AI to make career choices ignores the imperceptible elements of the student’s environment such as peer pressure, family obligations and social media influences, which could be more important to the student than a cold analysis of facts and figures.
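Here is a minimal, hypothetical sketch of what that kind of proactive nudge could look like in practice; the thresholds, gradebook and resource names are invented for illustration, not taken from the Learning House brief.

```python
# A toy early-warning sketch: flag students whose recent performance dips
# and suggest a resource. Records, thresholds and actions are hypothetical.
from statistics import mean

# Hypothetical gradebook: recent assignment scores per student (0-100).
GRADEBOOK = {
    "student_001": [88, 91, 85, 90],
    "student_002": [78, 64, 59, 52],   # trending down
    "student_003": [95, 93, 40, 38],   # sharp drop mid-term
}

def suggest_intervention(scores: list[int]) -> str | None:
    """Return a suggested resource when the average or trend looks worrying."""
    if mean(scores) < 65:
        return "schedule tutoring"
    if len(scores) >= 2 and scores[-1] < scores[0] - 30:
        return "refer to academic advising"
    return None  # no automated action; leave it to the instructor

for student, scores in GRADEBOOK.items():
    action = suggest_intervention(scores)
    if action:
        print(student, "->", action)
```

Even in this tiny example, the rules know nothing about peer pressure, family obligations or anything else happening outside the gradebook.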
AI, Education And Career Paths
In terms of institutional efficiency, it is true, as the Learning House survey suggests, that, “AI can pull together information from multiple campus systems and use the data to guide administrative decisions such as course offerings. In the future, AI could help institutions understand local employers’ hiring needs and create curricula that prepares students to fill them.” On the other hand, will this drive toward institutional efficiency direct students towards careers that are deemed best for the student rather than what the student has a passion to pursue? And will AI determine the institutions’ priorities? Will STEM careers overshadow music and art studies? Can AI ever truly discern the importance of a communication degree when compared to a technology degree?
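For illustration only, here is a hypothetical sketch of how such a data pull might inform course offerings; the programs, seat counts and job-posting figures are invented, not drawn from any campus system or labor-market feed.

```python
# A hypothetical sketch of the institutional-efficiency idea quoted above:
# compare course capacity from a campus system with local job-posting counts
# to suggest where to add sections. All figures below are invented.
COURSE_SEATS = {"Data Analytics": 60, "Nursing": 120, "Graphic Design": 40}
LOCAL_JOB_POSTINGS = {"Data Analytics": 340, "Nursing": 210, "Graphic Design": 35}

def expansion_candidates(min_ratio: float = 2.0) -> list[str]:
    """Programs where local demand far outstrips current seats."""
    return [
        program for program, seats in COURSE_SEATS.items()
        if LOCAL_JOB_POSTINGS.get(program, 0) / seats >= min_ratio
    ]

# Data Analytics (340 postings vs. 60 seats) gets flagged; Graphic Design does
# not -- which is exactly the arts-versus-STEM tension the article raises.
print(expansion_candidates())
```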
In the book “Robot-Proof,” Northeastern University President Joseph Aoun proposes a way to educate the next generation of college students to invent, to create, and to discover—to fill needs in society that even the most sophisticated artificial intelligence agent cannot. On his website, Aoun offers this argument: “Driverless cars are hitting the road, powered by artificial intelligence. Robots can climb stairs, open doors, win Jeopardy, analyze stocks, work in factories, find parking spaces, advise oncologists. In the past, automation was considered a threat to low-skilled labor. Now, many high-skilled functions, including interpreting medical images, doing legal research, and analyzing data, are within the skill sets of machines. How can higher education prepare students for their professional lives when professions themselves are disappearing?”
On his website, Aoun lays out the framework for a new discipline, humanics, “which builds on our innate strengths and prepares students to compete in a labor market in which smart machines work alongside human professionals. The only certainty about the future is change. Higher education based on the new literacies of humanics can equip students for living and working through change.” A “robot-proof” education, Aoun argues, is not concerned solely with topping up students’ minds with high-octane facts. Rather, it calibrates them with a creative mindset and the mental elasticity to invent, discover, or create something valuable to society—a scientific proof, a hip-hop recording, a web comic, a cure for cancer.
The point is that as we develop and innovate our artificial intelligence, we should just as vigorously develop and innovate basic human intelligence. •