Using AI to Identify & Help Struggling Students

By analyzing interactions in web-based learning management systems, college advisors can connect with struggling students and help them stay on the path to graduation and future success.

By Marty Graham, Contributor

College instructors have long used web-based learning management systems to virtually deliver reading material, exercises, and assignments to students. But they’ve only just begun to look at these delivery systems—and the data trails users leave behind—for clues about how students are really doing.

The results are powerful.

By leveraging artificial intelligence (AI) to analyze and find patterns in these massive data sets, college officials can identify students who are struggling and likely to drop out. For instance, the data can reveal a student who had been visiting an online lesson regularly and then suddenly stopped. With quick intervention by advisors, schools can then track and measure those students’ improvements and successes.
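To make the idea concrete, here is a minimal sketch of one such signal, written in Python against a hypothetical click log (the field names, window sizes, and the 25 percent threshold are illustrative assumptions, not any university’s actual model): a student whose recent activity has collapsed relative to their own baseline.

```python
from collections import defaultdict
from datetime import timedelta

def flag_disengaged_students(click_log, now, baseline_days=28, recent_days=7,
                             drop_ratio=0.25):
    """Flag students whose recent LMS activity collapsed against their own baseline.

    click_log: iterable of (student_id, timestamp) pairs, one per page request.
    Returns the IDs of students whose average daily clicks over the last
    `recent_days` fell below `drop_ratio` times their baseline daily average.
    """
    recent_start = now - timedelta(days=recent_days)
    baseline_start = recent_start - timedelta(days=baseline_days)

    baseline_counts = defaultdict(int)
    recent_counts = defaultdict(int)
    for student_id, ts in click_log:
        if recent_start <= ts <= now:
            recent_counts[student_id] += 1
        elif baseline_start <= ts < recent_start:
            baseline_counts[student_id] += 1

    flagged = []
    for student_id, base_total in baseline_counts.items():
        base_rate = base_total / baseline_days
        recent_rate = recent_counts[student_id] / recent_days
        # The signal is a previously active student who has gone quiet,
        # not a student who was never active in the first place.
        if recent_rate < drop_ratio * base_rate:
            flagged.append(student_id)
    return flagged
```

An early-alert system might run a check like this nightly and route anyone it flags into an advisor’s outreach queue.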


At the University of Nevada Las Vegas (UNLV), Professor Matt Bernacki (who’s now at the University of North Carolina) pioneered such efforts as part of his postgraduate work in educational psychology.

“I set out to study learning by using the clicks, highlights, and annotations students make when they read,” Bernacki says. “Then I dug into the learning management system to confirm a suspicion: If students have to access their learning materials from university servers, then those servers have to receive those clicks in order to return the material the student requests.”


Once Bernacki began examining those clicks, he realized he had uncovered an untapped resource for understanding and improving learning.

The results from these experiments are dramatic. UNLV’s initial program was so successful that it won a National Science Foundation grant in 2018 for nearly $1 million, funds that would be used to figure out how to keep students in science, technology, engineering, and math (STEM) programs. (According to the university, 40 percent of declared STEM majors abandon the major once they face the challenging coursework.)

Meanwhile, Arizona State University used learning system data to identify and help struggling freshmen in 2016. The results: a 15 percent increase in the number of first-year students who stayed for a second year, along with notable gains in retaining low-income and first-generation students.

When experts cite examples of sustained success, they often point to Georgia State University, where administrators have intervened to connect students with advisors 500,000 times since the school began examining navigation data in 2012, according to senior vice president for student success Timothy Renick. “Our graduation rates are up 62 percent and we are graduating 3,000 more students annually than we were before,” he says.

Ousting the Trial by Fire

Students have long been expected to fend for themselves, and systems were set up with that mindset, says UNLV Provost Carl Reiber. “That trial-by-fire mentality was all wrong. We’re here to teach students, not weed them out of their futures. It’s an approach that’s been proven harmful to first-generation students and underrepresented minorities, in particular.”

“That trial-by-fire mentality was all wrong. We’re here to teach students, not weed them out of their futures.”

—Carl Reiber, provost, University of Nevada Las Vegas

Web-based learning management systems first appeared in the 1990s, and by 2003 they were the norm rather than the exception. But the systems focused primarily on delivering lessons to students, not observing their interactions, and the features used by college staff were mainly administrative. Today, the systems’ big data can identify obvious problems, such as a student failing a course at midterm, as well as more nuanced issues.

“What’s interesting about our use of big data is its ability to uncover less obvious early signs of trouble,” Renick says. “For instance, even a passing grade of ‘C’ can be an indicator of significant risk when the grade is in a course critical to the student’s field of study or in the first course a student takes in his or her major.”

“We have also identified ‘toxic combinations,’ two courses that, if the student takes them in the same semester, produce much higher failure rates [based on past data] than if the courses are taken in different semesters,” Renick adds. For example, students may do just fine in physics and organic chemistry classes, but not in the same semester, he notes.
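Both signals Renick describes can be expressed as simple rules over historical enrollment records. The sketch below is a hedged illustration in Python, assuming a hypothetical record format with student, course, term, major, grade, and pass/fail fields; the thresholds are made up for demonstration, not drawn from Georgia State’s system. The first function flags a passing ‘C’ in a course critical to the student’s major; the second surfaces candidate ‘toxic combinations’ by comparing same-semester failure rates of course pairs against their different-semester baselines.

```python
from collections import defaultdict
from itertools import combinations

def flag_critical_c_grades(records, critical_courses):
    """records: dicts with 'student', 'course', 'major', and 'grade' keys.
    critical_courses: maps each major to the set of courses critical to it.
    A 'C' is a passing grade, but in a critical course it is treated as a risk flag."""
    return [r for r in records
            if r["grade"] == "C"
            and r["course"] in critical_courses.get(r["major"], set())]

def toxic_combinations(records, min_students=30, ratio=1.5):
    """Surface course pairs whose same-semester failure rate far exceeds the
    rate when the same two courses are taken in different semesters.
    records: dicts with 'student', 'course', 'term', and 'failed' (bool) keys."""
    history = defaultdict(dict)           # student -> {course: (term, failed)}
    for r in records:
        history[r["student"]][r["course"]] = (r["term"], r["failed"])

    same = defaultdict(lambda: [0, 0])    # pair -> [students, students failing either]
    diff = defaultdict(lambda: [0, 0])
    for courses in history.values():
        for a, b in combinations(sorted(courses), 2):
            term_a, failed_a = courses[a]
            term_b, failed_b = courses[b]
            bucket = same if term_a == term_b else diff
            bucket[(a, b)][0] += 1
            bucket[(a, b)][1] += int(failed_a or failed_b)

    flagged = []
    for pair, (n_same, f_same) in same.items():
        n_diff, f_diff = diff.get(pair, (0, 0))
        # Require enough history on both sides before trusting the comparison.
        if n_same >= min_students and n_diff >= min_students and f_diff > 0:
            if f_same / n_same > ratio * (f_diff / n_diff):
                flagged.append(pair)
    return flagged
```

A registrar could use output like this at enrollment time, warning a student before a risky pairing lands on the same semester’s schedule.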

Repurposing Students’ Data Is Still Novel

“We’re shifting from the purpose of the data being administrative to data being used to help the students,” says James Wiley, principal analyst for the National Research Center for College and University Admissions.

“We’re shifting from the purpose of the data being administrative to data being used to help the students.”

—James Wiley, principal analyst, National Research Center for College and University Admissions

Wiley has encouraged vendors to invert the platforms’ goals by rethinking the basis for the algorithms they’re designing.

“The question for designing the algorithms is: What action do you want to take? What end result do you want to achieve? Then think backwards,” he says. “Then you think about getting the right data, contextualizing it, displaying it, learning from it, and gaining some wisdom.”

But for all the magic of the technology, students’ success still comes down to connecting them with advisors who can help—and that requires finesse, says Columbia University communications manager Elizabeth “Lisa” Ganga. “It’s tricky; we have to avoid sending the wrong kind of message, one that the student might understand as ‘I’m doing poorly and might as well drop out.’” The human touch matters, and may be the most important part, she stresses.

“You can have the algorithms flag the students and then you need to have the advisors ready to assist these students.”

—Elizabeth “Lisa” Ganga, communications manager, Columbia University

“Part of making these interventions work involves retraining advisors to move faster and have more resources to offer right away,” Ganga continues. “You can have the algorithms flag the students and then you need to have the advisors ready to assist these students.”

Like most experiments, mining student data has a learning curve, and progress along it depends on measuring impact.

“Measuring outcomes will give you a sense of humility,” Wiley says. “It’s a hand-holding journey and you have to be prepared to iterate along this journey. Data will change. Patterns will change. Tools will change.”

The need for students to succeed, however, will remain constant.