Whenever you surf the Web, sophisticated algorithms are tracking where you go, comparing you with millions of other people. They’re trying to predict what you’ll do next: Apply for a credit card? Book a family vacation?
At least 40 percent of universities report that they’re trying some version of the same technology on their students, according to several recent surveys. It’s known as predictive analytics, and it can be used to either help or hurt students, says a new report from the New America Foundation.
The dangers come from the possibility of discrimination, invasions of privacy and groups of students being stigmatized, the authors, Manuela Ekowo and Iris Palmer, write. There can also be a lack of transparency when decision-making is turned over to an opaque computer program.
But a happy story cited in the report comes from Georgia State University, a large public university in Atlanta with more than 24,000 undergrads. Of those students, 60 percent are nonwhite, and many are from working-class and first-generation families.
As at many public universities, resources for student advising are limited, and large institutions tend to have staggering caseloads; a few years ago, GSU's ratio was 700 students per adviser.