Predictive data helps more students of color reach graduation

Editor’s note: This story first appeared on palabra, the digital news site by the National Association of Hispanic Journalists.

By Aitana Vargas | Edited by Jessica Kutz

Debanhi Romero, a first-generation Latina student at Georgia State University (GSU), found it difficult to keep up with her classes during her first year of college.

Her parents, who were born in Mexico, never had the chance to go to college. But in the fall of 2022, with the help of an academic scholarship, she began pursuing a bachelor’s degree with a concentration in studio arts.

She was juggling a part-time job, and with a 2.61 GPA after her first semester, she was on the cusp of losing her scholarship. “It took me a minute to get used to college,” she admits.

But unlike at other schools, where she might have fallen through the cracks, a predictive analytics program called GPS Advising flagged that Romero, now 20, needed extra support from the school. She had to attend mandatory meetings with advisors who made sure she wouldn’t lose her HOPE Scholarship, which required her to maintain a 3.0 GPA.

Advisors provided Romero with a list of resources that could help her, including the university’s Writing Studio, a place on campus where students can meet with tutors to improve their writing and composition. Though she was born in Atlanta, Romero grew up in Mexico, and English is not her first language. Ever since she started college, she’s felt the language barrier has been a major obstacle to her academic progress.

“I think I’m gonna use (the studio) next semester, even if I’m not required to, because I found it helpful,” she says.

Debanhi Romero works on a class project at Georgia State University’s Edgewood Sculpture Studio in Atlanta. GSU’s predictive analytics program, GPS Advising, flagged Romero as needing academic support when her GPA dropped below the requirement of her HOPE Scholarship. Photo by Bita Honarvar for palabra

The GPS advising program that helped Romero is part of the university’s effort to revolutionize academics by harnessing the power of predictive analytics to provide tailored help when students hit roadblocks. At GSU, the program analyzes historical academic data to identify which students are at a higher risk of dropping out. The ultimate goal is to boost students’ overall academic performance and increase graduation rates. But as the role of predictive analytics and other tools like AI grows in education, concerns are mounting about the potential downside of these technologies and the possibility that they could perpetuate existing racial and ethnic biases.

At GSU, the positive impact has been profound and measurable: overall graduation rates are up 30% from 2012. Among underserved groups, including Black, Hispanic and low-income students, the graduation rate has been at, or above, that of the student body as a whole for the last seven years. Their chance of graduating is higher than it was a decade ago, and the equity gap has disappeared, according to Timothy Renick, executive director of GSU’s National Institute of Student Success (NISS), a program the university created to help other schools adopt some of its approaches. Disadvantaged students, on average, also graduate half a semester ahead of schedule, which can help them save on student loans.

“We’re graduating now every year about 3,500 more students than we were a decade ago,” said Renick.

Executive director of GSU’s National Institute of Student Success (NISS) Timothy Renick, left, and Crystal Mitchell, director of the GSU Advisement Center, chat in Mitchell’s office in Atlanta. The NISS program was established to train other academic institutions in GSU’s success model. Photo by Bita Honarvar for palabra

Student data is analyzed for 800 different risk factors. When a student is flagged, an advisor reaches out to schedule a 30-minute appointment to discuss why that student may be getting off track. While most students are flagged for academic-related issues, those who fall behind on payments also meet with advisors to resolve financial challenges they may be facing.

Factors that trigger warnings include poor grades, weak performance in classes essential to a student’s major, paying for the wrong courses, or taking on too many difficult classes at once.

“If students take organic chemistry and calculus in the same semester, they might, on average, only have about a 60% chance of passing both those classes,” says Renick. “But when taken separately, the chance of passing both classes increases to 80%.”
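
Georgia State has not published the internal workings of its model, but a minimal sketch of the kind of rule-based check described above might look like the following Python. Every threshold, course code and field name here is a hypothetical stand-in used for illustration, not anything drawn from GSU’s actual list of 800 risk factors.

```python
# Hypothetical sketch of an early-alert check. All thresholds, course
# pairings and field names are illustrative, not GSU's actual risk factors.
from dataclasses import dataclass, field
from typing import Optional

# Course combinations with historically low joint pass rates (an illustrative
# stand-in for Renick's organic chemistry/calculus example).
RISKY_COMBINATIONS = [{"CHEM 2400", "MATH 2211"}]


@dataclass
class StudentRecord:
    name: str
    gpa: float
    scholarship_min_gpa: Optional[float] = None  # e.g. 3.0 for a HOPE-style award
    current_courses: set = field(default_factory=set)
    balance_past_due: bool = False


def risk_flags(student: StudentRecord) -> list:
    """Return the reasons, if any, that an advisor should reach out."""
    flags = []
    if student.scholarship_min_gpa is not None and student.gpa < student.scholarship_min_gpa:
        flags.append("GPA below scholarship requirement")
    for combo in RISKY_COMBINATIONS:
        if combo <= student.current_courses:  # enrolled in the whole risky combination at once
            flags.append("High-risk course combination: " + ", ".join(sorted(combo)))
    if student.balance_past_due:
        flags.append("Past-due account balance")
    return flags


if __name__ == "__main__":
    example = StudentRecord(
        name="Example Student",
        gpa=2.61,
        scholarship_min_gpa=3.0,
        current_courses={"CHEM 2400", "MATH 2211", "ART 1010"},
    )
    for reason in risk_flags(example):
        print("Flag:", reason)  # in practice, an advisor would schedule a 30-minute meeting
```

In GSU’s real system, such signals come from analysis of years of historical student data rather than hand-written rules; the sketch only illustrates how individual flags could translate into advisor outreach.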

Atlanta-born Terrance Wiggins, a 24-year-old student who obtained a degree in media entrepreneurship from GSU in December 2022, says the predictive program kept him on track to graduate. After being flagged as someone who needed extra academic guidance, he was connected with an advisor at the school’s student advisement center and was required to take a financial literacy class that helped him understand how to pay for school.

“That was actually helpful because instead of me trying to figure out what classes I needed on my own, or if I’m taking the right classes, I could just come up to the advisement center, speak to someone that could lead me and guide me in the direction that I need to go in,” he explains. “It took a lot of stress off my back.”

Terrance “Tee” Wiggins on campus at Georgia State University in Atlanta. Photo by Bita Honarvar for palabra

For students on the verge of academic failure, the advisors’ proactive role is a lifeline, according to Crystal Mitchell, director of the advisement center.

If a student isn’t doing well on their current path, sometimes advisors take a direct approach to help them understand their options. “We explain: This is how much it is going to cost you academically and financially to stay on this course. … We have another course that can get you to your end goal and still make you attractive for the industry of interest,” she says.

Other resources include tutors and a 24/7 AI-enhanced chatbot named Pounce, which allows students to ask questions about registration, finances, class assignments and course content. It can also send email reminders about quizzes and exams. According to a study conducted by GSU, first-generation students who received the messages earned final grades about 11 points higher than their peers.

Crystal Mitchell, director of the Georgia State University Advisement Center, speaks with members of the center’s leadership team in Atlanta. Photo by Bita Honarvar for palabra

The predictive analytics model has drawn attention from other institutions interested in developing similar strategies, Renick said. Representatives from over 500 colleges and universities have visited the campus, and two years ago GSU founded NISS to meet the high demand for training.

While the university’s program is largely considered a success, data-driven and predictive tools have been controversial since their inception. According to the Electronic Frontier Foundation (EFF), a digital rights nonprofit, and civil rights advocates, the concerns include whether people know their data is being used and have consented to its collection, who interprets the data to make predictions and how, and how the data is stored or potentially managed by third parties.

For Michele E. Gilman, a law professor at the University of Baltimore and author of the article “Expanding Civil Rights to Combat Digital Discrimination on the Basis of Poverty,” a major concern is how to eliminate biases from predictive tools, particularly those tied to ethnicity or race. “It can be very hard to totally get rid of race (in an algorithm),” she says, because other variables correlate with race, such as a student’s neighborhood, ZIP code or high school.

“(Predictive analytics are) not a crystal ball … it’s really just a mirror of all the biases and discrimination that we have in our current society,” she says. Rather than making accurate predictions, these technologies may perpetuate systemic racism.

Students at Georgia State University in Atlanta. GSU’s overall graduation rates are up 30% since 2012; part of that success is attributed to the GPS Advising program. Photo by Bita Honarvar for palabra

Julia Dressel, a researcher and Dartmouth College graduate who a few years ago unraveled the inner workings of COMPAS, a controversial data-driven tool used by judges and probation officers to predict recidivism, has similar fears. She warns that because predictive tools at universities end up inadvertently factoring in race, they may deter students of color from pursuing more difficult majors or fields where certain groups are often underrepresented.

“Women, and particularly women of color, in certain STEM majors, have been historically super disadvantaged,” says Dressel. “They’re either facing discrimination or they also have no role models in the faculty to look up to.”

As a computer science major, Dressel was often the only woman, or one of very few, in most of her classes. She worries that predictive analytics reflect, perpetuate and reinforce systemic discrimination and the barriers that certain minority groups have historically faced. When past data about these groups is used to predict someone’s performance today, that “is where you’re re-encoding and making things worse that have been present historically,” she explains.

Furthermore, Gilman, the law professor, argues that by using other students’ data to make predictions about a current student, the algorithm ignores that student’s aspirations and unique capacity to overcome obstacles and achieve specific goals.

Romero’s experience seems to reflect that. While she was ultimately grateful that the program kept her on track for her scholarship, at times, she felt discouraged.

“I was very stressed about it,” she says. “I was trying to be sincere when I was talking about it with my advisors and stuff, and I did find most of them were trying to get me to look at other options or to have a backup plan.”

Debanhi Romero works on a class project at the GSU sculpture studio in Atlanta. Romero is a first-generation Latina who struggled with her English skills. Being flagged by GSU’s predictive analytics program helped her to improve her English and academic performance. Photo by Bita Honarvar for palabra

She felt they doubted she could keep her scholarship on her current track. “In the moment, I was like, I didn’t need that. I was like, I’m gonna keep my scholarship,” she says. “I needed them to also tell me that I could keep a scholarship, and they were mostly just telling me, ‘Oh, but if you don’t, here is this (option).’”

Mitchell, however, said that steering students away from a certain major is a last resort. Advisors want students to have access to all the school’s resources and support so they can pursue the career of their choice. Only after those tools have been exhausted do they recommend other options.

“You have those that are committed, still don’t do well and still don’t want to change their major or take any of your advice. That’s a very hard conversation. But the student, ultimately, has the final decision,” says Mitchell.

Gilman’s and Dressel’s concerns about data collection and its potential uses at other universities are nonetheless well founded. In 2016, Mount St. Mary’s University, a private Catholic institution in Maryland, attempted to use information its students provided in a survey administered during freshman orientation to manipulate retention rates.

Debanhi Romero chats with a friend near the Georgia State University student center in Atlanta. Photo by Bita Honarvar for palabra

Struggling students were flagged and encouraged to drop out in the first weeks of the semester, before the school was due to report retention numbers to the federal government. Then-President Simon Newman argued that his efforts were aimed at helping struggling students avoid debt. The story, first reported by the university’s paper, turned into months of heated controversy and turmoil that drew national attention and led to his resignation.

Still, over the past decade, GSU’s retention and graduation figures have shown that the model has, overall, succeeded in helping students of color who might otherwise have dropped out reach graduation. But as AI and predictive analytics continue to infiltrate our lives, Gilman offers one last word of caution.

“It’s not like you graduate, and your data just disappears and goes away,” she says. “Someone is still using that to make predictions about other people, and it could perhaps fall into other hands to make predictions about you.”

__

Aitana Vargas is a Columbia University graduate and an award-winning on-camera news reporter, foreign correspondent and live tennis commentator based in Los Angeles. She began her career anchoring a local Spanish-language TV show while obtaining her B.S. in physics from Berry College and later interned at the BBC, CNN International and the NASA/ESA Hubble Space Telescope communications department in Germany. Her Master’s thesis on the Israeli-Palestinian conflict at Columbia University was supervised by Professor Rashid Khalidi. Her stories have appeared on Público, EFE, CNN Expansión, Narratively, Hoy Los Ángeles, the LA Times, DirecTV Sports, TVE Internacional, Cuatro/Telecinco TV Network, HITN TV Network and others. She’s received several LA Press Club awards (Investigative Series, Sports Journalist of the Year, Obituary, Consumer, Sports & Hard News), the 2018 Berry College Outstanding Young Alumni Award and is a Livingston Award finalist. Aitana was also the Spanish-English interpreter for transgender artist Daniela Vega, lead actress in Academy Award-winning film “A Fantastic Woman.” Learn more about her at aitanavargas.com.

Jessica Kutz is a national reporter covering gender and climate change at The 19th, a nonprofit news organization that reports at the intersection of gender, politics and policy. She previously worked as an editor and reporter at High Country News, a regional nonprofit that covers the Western United States. Her work has appeared in many outlets including The Guardian, Slate, Mother Jones, PBS NewsHour and The Atlantic. She is based in Tucson, Arizona.

Bita Honarvar is an independent photojournalist and visuals editor based in Atlanta, Georgia. She also works as an image editor at Gravy, a quarterly publication from the Southern Foodways Alliance, where she primarily commissions original illustrations and photography to accompany non-fiction stories, essays and poems. Bita spent the early part of her career at The Atlanta Journal-Constitution, where she was a staff photojournalist and photo editor for 16 years. Her work there took her around the United States and abroad, including stints in Afghanistan, Iraq and Iran. More recently, she was the senior photo editor at Vox.com. She is a member of the National Press Photographers Association and serves on the board of the Atlanta Photojournalism Seminar, the longest continuously operating photojournalism conference in the U.S.