Categories: Education Technology

A Researcher, an Advisor, and a Marketer Walk Into a Predictive Analytics Tool

Presenters

  • Philip Needles, VP of Student Services, Montgomery County Community College
  • Celeste Schwartz, VP for Information Technology and Chief Digital Officer, Montgomery County Community College
  • Stefanie Crouse, Academic Advisor/Assistant Professor, Montgomery County Community College
  • Angela Polec, Executive Director of Marketing & Communications, Montgomery County Community College
  • David Kowalski, Executive Director of Institutional Research, Montgomery County Community College

FUN FACT: this session was interrupted by an evacuation signal, which effectively demonstrated the ability of technology to move lots of people back and forth 🙂

We learned a lot along the way with Predictive Analytics, and we're here to share both the good and the bad we've experienced.

DK: Once we got the tool, we were like: what do we do with it now?

CS: the decision to purchase was made at a conference between the campus president, myself, and a vendor. It was a "risk investment," and we weren't sure if it would be successful. It was less about the tool and more about the people. The initial expectation was very exploratory…we believed that PA would help the institution, but we weren't exactly sure to what extent. We wanted to be able to take (short-term) actions based on what the tool told us, and it has done that for us. It took us 18–24 months to get our data mashed up and our data models loaded into the vendor's system. We were the first campus to use some of the features of the vendor's product, which was an interesting experience. We needed our IR executive director to play a critical communications role so that our campus could make the leap to implementing actions based on our findings. Being inclusive of the various areas of the campus was an objective.

Step 1: Be prepared to invest time & resources without an immediate return. MC3's time to initial return was about two years.

Step 2: Get the right people at the table. MC3's team is about 12 people.

DK: having the latitude to get the right people from across the college at the table was hugely beneficial. The data the tool provides is about associations and trends. We used an iterative process that helped us make progress as we worked with different campus constituents.

Question: how much time did each team member contribute to the team? DK: it varies depending on the area of inquiry. SC: we have a very active agenda, and everyone at the table is empowered to take action. The more people who have access to the raw data, the better…it helped us work more effectively.

Step 3: Identify Your Research Questions

AP: we had to gain a level of familiarity with the tool to interpret what it was telling us. PA tools can be giant rabbit holes that take you into some scary places. We pulled back a bit to think about hypotheses, then went into the tool with that intent in a more thoughtful way.

SC: we have different sessions: a 15-week session and two 7-week sessions. The second 7-week session had way more drop-outs; some students were using this session to "beef up their units." What was happening is that students were simply forgetting to go to their new class! We implemented a text message communication plan to remind them about their class…some of the responses were hysterical: "I TOTALLY FORGOT ABOUT THIS CLASS, THANK YOU!!"
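(My own aside: a minimal sketch of what that kind of nudge might look like behind the scenes, assuming a hypothetical enrollment DataFrame. The column names, the "7W2" session code, and the reminder wording are illustrative guesses, not MC3's actual system.)

```python
# Hypothetical sketch: flag second 7-week-session students who have not yet
# opened their course and build reminder texts for them.
import pandas as pd

def queue_reminders(enrollments: pd.DataFrame) -> list[dict]:
    """Return reminder messages for students who appear to have forgotten
    that their second 7-week class has started."""
    forgot = enrollments[
        (enrollments["session"] == "7W2")           # second 7-week session
        & (enrollments["first_lms_login"].isna())   # never opened the course
    ]
    return [
        {
            "phone": row.phone,
            "message": (
                f"Reminder: your {row.course_code} class has started! "
                "Log in today, or contact your advisor with questions."
            ),
        }
        for row in forgot.itertuples()
    ]
```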

Step 4: Empower the Team to Take Action

In 2016–2017, about 10 interventions were enacted. If your institution is committed to making data-driven decisions, this just helps to drive further assessment and action. For us, it just "fits in" to the way we do our work.

Step 5: Evaluate Your Impact

DK: we do a few things to evaluate impact (the tool has propensity score matching). Some interventions were more successful than others. For example, students who participate more in class tend to persist better; supplemental instruction courses also help a lot.
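(A minimal sketch of propensity score matching in general, using scikit-learn and an invented student table, just to show the idea of comparing treated students against statistically similar peers. The column names are assumptions, not the vendor's implementation.)

```python
# Hypothetical sketch of propensity score matching: estimate each student's
# probability of "treatment" (e.g., taking a supplemental instruction course),
# match treated students to similar untreated students, compare outcomes.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def estimate_effect(df: pd.DataFrame, treatment: str, outcome: str,
                    covariates: list[str]) -> float:
    X, t = df[covariates].values, df[treatment].values

    # 1. Propensity score: P(treated | covariates)
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    treated = df[t == 1].copy()
    control = df[t == 0].copy()
    treated["ps"], control["ps"] = ps[t == 1], ps[t == 0]

    # 2. Match each treated student to the nearest control student by score
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = control.iloc[idx.ravel()]

    # 3. Difference in outcome rates is the estimated effect of the intervention
    return treated[outcome].mean() - matched[outcome].mean()

# Example (illustrative column names):
# effect = estimate_effect(students, "took_si", "persisted",
#                          ["prior_gpa", "credits", "age"])
```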

AP: we always want to “move the needle,” but we also want to influence student behavior. For example, we want them to register early.

Step 6: Review, Revise, Repeat

AP: we’re constantly looking to see how we can improve, bringing new faculty into the fold. How do we put what we’ve learned into the hands of students to help influence positive behaviors? The data has to get out of IR and into the hands of advisors, communications staff, and other appropriate people. What can we do to assist “at risk” students BEFORE we get their GPA?
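(A minimal sketch of what an "early alert" model using only pre-grade signals could look like; the feature names are my own guesses at the kind of week-one data an institution might have, not MC3's actual model.)

```python
# Hypothetical sketch: predict non-persistence from signals available before
# any grades exist, so advisors can reach out early.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

EARLY_FEATURES = ["registered_days_early", "orientation_attended",
                  "lms_logins_week1", "credits_attempted"]

def train_early_alert(history: pd.DataFrame) -> LogisticRegression:
    """Fit a simple model on past terms, where "dropped" marks students who
    did not persist; report holdout accuracy as a sanity check."""
    X_train, X_test, y_train, y_test = train_test_split(
        history[EARLY_FEATURES], history["dropped"],
        test_size=0.2, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
    return model

# Advisors could then get a ranked outreach list for the current term:
# risk = model.predict_proba(current_term[EARLY_FEATURES])[:, 1]
```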

By Paul Schantz

CSUN Director of Web & Technology Services, Student Affairs. husband, father, gamer, part time aviator, fitness enthusiast, Apple fan, and iguana wrangler.
