
The Science of Predictive Analytics in Education


  • Patrick J. Bauer, Chief Information Officer, Harper College
  • Scott Feeny, Director of Policy and Research, Independent Colleges of Indiana
  • Vince Kellen, Senior Vice Provost for Analytics & Technologies, University of Kentucky
  • Jon Phillips, Managing Director – Worldwide Education Strategy, Dell Inc.
  • John K. Thompson, GM, Advanced Analytics, Dell Inc.

This session will focus on innovations in using data insights in decision-making: what are the dos and don’ts we’ve learned thus far?  We’ll start with stories from each panelist, then go into Q&A.  All material will be made available later (more to come on that).



  • William Rainey Harper College:  a 2-year institution in the NW suburbs of Chicago, with 40,000 full-time-equivalent students
  • “Project Discover,” led by Matt McLaughlin.  We got a Title III grant to help do this project.  Includes inclusion, engagement, achievement, onboarding, intervening, etc.
  • Data has been collected over 6 years.
  • We originally used a proprietary data warehouse
  • Grad rate increased 10% in 5 years
  • New reactive programs:  early alert, supplemental instruction, completion concierge, summer bridge.
  • These were REACTIVE programs, we wanted PROACTIVE solutions.


  • University of Kentucky
  • What have we learned?  We’ve integrated virtually everything we can, and are now moving into personalized learning and messaging.
  • Respect complexity in learning analytics!  I recommend reading “Arrival of the Fittest,” a book by Andreas Wagner.  Its research on genomics highlights models that can inform our process.  Instructional complexity is at least as complex as that of genomics:  we don’t have just one paradigm of instructional theory, but dozens.
  • Structure is important:  get the right people on the bus, remove rivalries within your organization, give groups distinct and clear missions, align with organizational strategy.
  • Engage the community:  transparency makes a big difference; democratize analysis; enforce community etiquette, bring in students & faculty researchers; engage the broader higher education community.
  • Use the right tools and techniques:  speed enables fast thinking, fast group decision-making, fast everything; maximum semantic expressiveness and rich detail improves data quality, analytic flexibility; visualization is important.
  • Conclusion:  respect complexity, attend diligently to the very human aspects of this puzzle, ignite the passion of the community, choose and use your tools wisely


  • I represent the Independent Colleges of IN
  • A statute required that student record information be shared back with the state
  • I needed to know how our institutions compared to others
  • We worked with vendor partners (Dell & Statistica) to run descriptive and predictive analytics against the data we had
  • We wanted to do card swipes, meal plans, and more for sub-group comparisons.


  • The Statistica product has been made free for higher ed faculty and students
  • I run the Statistica group at Dell
  • We’ve done a lot of work in universities and hospitals
  • We’re moving toward using data for real-time decision-making.  A specific example was given about reduction in surgical infections…pretty powerful stuff.

“Maslow’s Hierarchy of Data Management”

  • The spectrum:  Data Management > Business Intelligence > Analytics
  • The specific levels:  Data Foundation > Basic Reporting > Performance Mgmt > Predictive > Prescriptive

Challenges and Observations

  • Master organizational and technical planning, and orchestrate organizational adoption.
  • Bringing in the “executive management hammer” can be useful
  • Expect pushback from IR, advisors, and counselors, i.e. “you’re coming to take our jobs!”  Dashboards and forms are actually a value-add that lets these folks do their jobs more effectively.
  • Usability testing and adoption feedback from students were interesting:  “Why do you give us a number?  Why don’t you just give us feedback and actions we can take?”
  • ROL (“Return on Learning”):  how can we quantify what you’re seeing?  There is no control group!  The profound payoff is that you’re able to make informed changes to policies that have real impact.
  • Student subgroups with a GPA lower than X (not specified) were much more likely to stop out.  This challenged many people’s beliefs, i.e. “how is this even possible?”
  • University of Iowa cited an avoided cost of $31 million
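The GPA finding above can be illustrated in miniature.  This is a hedged sketch only: the records are synthetic, and the 2.0 cutoff is a hypothetical stand-in for the unspecified threshold “X” from the session.  It shows the kind of descriptive sub-group comparison (stop-out rate below vs. at-or-above a GPA threshold) that feeds this sort of analysis.

```python
# Illustrative sketch: synthetic student records and a hypothetical 2.0 GPA
# threshold (the session did not specify the actual cutoff "X").
from collections import defaultdict

# (gpa, stopped_out) pairs -- fabricated for illustration only
students = [
    (3.5, False), (1.8, True), (2.9, False), (1.5, True),
    (3.1, False), (1.9, False), (2.2, True), (3.8, False),
]

def stop_out_rate(records, threshold=2.0):
    """Compare stop-out rates for students below vs. at/above a GPA threshold."""
    counts = defaultdict(lambda: [0, 0])  # band -> [stopped, total]
    for gpa, stopped in records:
        band = "below" if gpa < threshold else "at_or_above"
        counts[band][1] += 1
        if stopped:
            counts[band][0] += 1
    return {band: stopped / total for band, (stopped, total) in counts.items()}

rates = stop_out_rate(students)
print(rates)
```

On this toy data, the below-threshold group stops out at a visibly higher rate, which is the shape of the real finding that surprised people (“how is this even possible?”).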

Next Steps

  • Data sharing with school districts for a full life-cycle on our students as they go through our system
  • Classroom real-time analytics, such as triggers set by faculty
  • Get a handle on what our students do when they leave, e.g. wage data
  • Improving the advising process
  • Sharing findings with our institutions

By Paul Schantz

CSUN Director of Web & Technology Services, Student Affairs. husband, father, gamer, part time aviator, fitness enthusiast, Apple fan, and iguana wrangler.
