
A Researcher, an Advisor, and a Marketer Walk Into a Predictive Analytics Tool

Presenters

  • Philip Needles, VP of Student Services, Montgomery County Community College
  • Celeste Schwartz, VP for Information Technology and Chief Digital Officer, Montgomery County Community College
  • Stefanie Crouse, Academic Advisor/Assistant Professor, Montgomery County Community College
  • Angela Polec, Executive Director of Marketing & Communications, Montgomery County Community College
  • David Kowalski, Executive Director of Institutional Research, Montgomery County Community College

FUN FACT: this session was interrupted by an evacuation signal, which effectively demonstrated the ability of technology to move lots of people back and forth 🙂

We learned a lot with predictive analytics; we're here to share the good and the bad we've experienced along the way.

DK: Once we got the tool, we were like: what do we do with it now?

CS: the decision to purchase was made at a conference between the campus president, myself, and a vendor. It was a "risk investment," and we weren't sure it would be successful. It was less about the tool and more about the people. The initial expectation was very exploratory: we believed that PA would help the institution, but we weren't exactly sure to what extent. We wanted to be able to take (short-term) actions based on what the tool would tell us, and it has done that for us. It took us 18–24 months to get the data and data models loaded into the vendor's system. We were the first campus to use some of the features of the vendor's product, which was an interesting experience. We needed our IR executive director to play a critical communications role so that our campus could make the leap to implementing actions based on our findings. Being inclusive of the various areas of the campus was an objective.

Step 1: Be prepared to invest time & resources without an immediate return. MC3's time to initial return was about two years.

Step 2: Get the right people at the table. MC3's team was about 12 people.

DK: having the latitude to get the right people at the table from across the colleges was hugely beneficial. The data the tool provides is about associations and trends. We used an iterative process that helped us make progress as we worked with different campus constituents.

Question: how much time did each team member contribute to the team? DK: it varies depending on the area of inquiry. AC: we have a very active agenda, and everyone at the table is empowered to take action. The more people who have access to the raw data, the better; it helped us work more effectively.

Step 3: Identify Your Research Questions

AP: we had to gain a level of familiarity with the tool to interpret what it was telling us. PA tools can be giant rabbit holes that take you into some scary places. We pulled back a bit to think about hypotheses first, and then went into the tool with that intent in a more thoughtful way.

AC: we have different session lengths: one 15-week session and two 7-week sessions. The second 7-week session had far more drop-outs; some students were using it to "beef up their units." What was happening is that students were simply forgetting to go to their new class! We implemented a text-message communication plan to remind them about their class. Some of the responses were hysterical: "I TOTALLY FORGOT ABOUT THIS CLASS, THANK YOU!!"
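For flavor, here's a minimal sketch of what such a reminder campaign could look like in code. Everything here is an assumption for illustration (the schema, the session label, and Twilio as the SMS gateway); the session didn't describe MC3's actual implementation.

```python
# Hypothetical sketch of a "your class is starting" text campaign.
# Schema, session label, and Twilio as the SMS provider are all assumptions.
import sqlite3

from twilio.rest import Client  # any SMS gateway would do

TWILIO_SID = "ACxxxxxxxx"    # placeholder credentials
TWILIO_TOKEN = "xxxxxxxx"
FROM_NUMBER = "+15551230000"

def send_class_start_reminders(db_path: str) -> None:
    """Text every student enrolled in the second 7-week session."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        """
        SELECT s.phone, s.first_name, c.course_name, c.start_date
        FROM enrollments e
        JOIN students s ON s.id = e.student_id
        JOIN courses  c ON c.id = e.course_id
        WHERE c.session = 'second-7-week'
        """
    ).fetchall()
    conn.close()
    client = Client(TWILIO_SID, TWILIO_TOKEN)
    for phone, name, course, start in rows:
        client.messages.create(
            to=phone,
            from_=FROM_NUMBER,
            body=f"Hi {name}! Reminder: {course} starts {start}. See you there!",
        )
```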

Step 4: Empower the Team to Take Action

In 2016–2017, about 10 interventions were enacted. If your institution is committed to making data-driven decisions, this just helps to drive further assessment and action. For us, it just "fits in" to the way we do our work.

Step 5: Evaluate Your Impact

DK: we do a few things to evaluate impact (the tool has propensity score matching). Some interventions were more successful than others. For example, students who participate more in class tend to persist better; supplemental instruction courses also help a lot.
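For readers who haven't seen propensity score matching, here is a minimal sketch of the idea: match treated students to statistically similar untreated students, then compare outcomes. The column names and the naive 1:1 nearest-neighbor matching are illustrative assumptions, not the vendor tool's actual procedure.

```python
# Minimal sketch of propensity score matching (PSM) for intervention impact.
# Column names and the naive 1:1 nearest-neighbor matching are illustrative;
# the vendor tool's actual procedure was not described in the session.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def psm_lift(df: pd.DataFrame, covariates: list[str]) -> float:
    """Estimate the persistence lift of an intervention via 1:1 matching.

    Expects a binary 'treated' flag (e.g., attended supplemental instruction),
    a binary 'persisted' outcome, and pre-treatment covariate columns.
    """
    # 1. Model each student's probability of receiving the treatment.
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # 2. Match each treated student to the control student with the closest
    #    propensity score (nearest neighbor, with replacement).
    gaps = np.abs(
        control["pscore"].to_numpy()[None, :] - treated["pscore"].to_numpy()[:, None]
    )
    matched_control = control.iloc[gaps.argmin(axis=1)]

    # 3. The difference in persistence rates estimates the intervention's lift.
    return treated["persisted"].mean() - matched_control["persisted"].mean()
```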

AP: we always want to “move the needle,” but we also want to influence student behavior. For example, we want them to register early.

Step 6: Review, Revise, Repeat

AP: we're constantly looking at how we can improve and bringing new faculty into the fold. How do we put what we've learned into the hands of students to help influence positive behaviors? The data has to get out of IR and into the hands of advisors, communications staff, and other appropriate people. What can we do to assist "at-risk" students BEFORE we see their GPA?


Initiative Impact Analysis to Prioritize Action and Resource Allocation

Presenters

  • Virginia Fraire, VP of Student Success, Austin Community College District
  • Laura Malcom, VP of Product, Civitas Learning Inc.
  • Angela Baldasare, Asst. Provost, Institutional Research, The University of Arizona
  • partnerships@civitaslearning.com
  • civitaslearning.com

University of Arizona

  • Goal: improve 1st year retention rate from 81% to 91% by 2025
  • How do we find and integrate good data to make good decisions that help our students?
  • When I came on board, I found out that we never had a centralized student academic support office
  • SALT office (Strategic Alternative Learning Techniques) – used to support students with learning disabilities. How can we adopt and adapt some of the techniques that worked there?
  • We were using siloed participant data that was not very helpful. It was not transformative and it didn’t tell us much.
  • We came to Civitas for help.
  • In 2009, U of A opened doors to the “Think Tank” to streamline and centralize a number of academic support services offered by nationally certified tutors; mission is to empower UA students by providing a positive environment where they can master the skills needed to become successful lifelong learners.
  • In one year, nearly 11,000 students made more than 70,000 visits and spent 85,000+ hours with support staff.

Think Tank Impact

  • Illume Impact used PPSM to measure a 2.7 percentage-point overall lift in persistence for students using the writing center
  • 3.4 percentage-point increase for 1st-year students
  • Less than 10% of 1st year students taking advantage of this service!
  • These results will inform strategic campaigns to offer Think Tank services to students as part of first-year experience.
  • 8.2% persistence increase for students who were most at risk

Taking Initiative With Confidence

  • Sharing impact findings with academic colleges to discuss the need for increased referrals to Think Tank.
  • PPSM has changed the conversation with faculty who want rigorous data.
  • Bolstering credibility and validity to Think Tank services.

Austin Community College

Highland Campus is home to the ACCelerator, one of the largest high-tech learning environments in the country.

“The ACCelerator”

  • Provides access to 600+ desktop computer stations spread over 32,000 square feet, surrounded by classrooms and study rooms.
  • Offers redesigned DevEd math courses powered by learning software, with onsite faculty members, tutors, and academic coaches to increase personalization and engagement
  • Additional support services are offered, including non-math tutoring, advising, financial aid, supplemental instruction, and peer tutoring.
  • During the 2015-16 year, the ACCelerator served over 13,000 unique students in well over 170,000 interactions.

Accelerator Impact

  • Students who visit the lab at least once each term persist at a higher rate.
  • 4x persistence impact found for DevEd students.
  • Part-time DevEd students and DevEd students with the lowest persistence predictions had even better outcomes.
  • 6.15% increase in persistence for students visiting the learning lab.
  • Results are informing strategic decisions about creating similar learning spaces at other campuses.
  • Impact results have helped validate ACC data and in-house analyses
  • Discussions with math faculty continue to strengthen the developmental math redesign
  • Persistence results leading to further investigation of other metrics related to accelerated learning, particularly for DevEd students.
  • For this kind of approach to work, silos need to be broken down.

Unifying Data Systems to Turn Insights into Student Success Interventions

Presenters:

  • Angela Baldasare, Asst. Provost, Institutional Research, The University of Arizona
  • Phil Ice, Vice President, Research & Development, American Public University System
  • Matthew Milliron, Senior Director, Solutions Engineering, Civitas Learning Inc.

Hypothesis

Unify the data: connect disparate data systems, data, and initiatives to gain insight into what's working, what's not, and for whom.

The area of most interest to Phil is the LMS, because that's where we have the most interaction. Unfortunately, it's mostly log-file information; scroll and click information is not captured. LTI integration does not help much because it's based on an iFrame, and we lose context. Instead, they're using Adobe Analytics (formerly Omniture). We're also using social sharing.
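Since LMS data arrives mostly as log files, here is a minimal sketch of flattening raw log lines into rows a unified data layer can ingest. The log format and field names are assumptions for illustration; real LMS logs (and the Adobe Analytics feed) vary by vendor.

```python
# Minimal sketch: flattening raw LMS access-log lines into a unified event
# table. The log format and fields below are assumptions for illustration;
# real LMS logs (and the Adobe Analytics feed) differ by vendor.
import csv
import re
from datetime import datetime

LOG_PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+"
    r"user=(?P<user>\w+)\s+course=(?P<course>\w+)\s+event=(?P<event>\w+)"
)

def parse_lms_log(log_path: str, out_path: str) -> None:
    """Write timestamp/user/course/event rows for the unified data layer."""
    with open(log_path) as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["timestamp", "user_id", "course_id", "event_type"])
        for line in src:
            m = LOG_PATTERN.search(line)
            if not m:  # skip unparseable lines rather than failing the load
                continue
            ts = datetime.strptime(m["ts"], "%Y-%m-%d %H:%M:%S").isoformat()
            writer.writerow([ts, m["user"], m["course"], m["event"]])
```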

Institution-Specific Platform for Innovation

Unified Data Layer (Student Data Footprint – historic and incoming disparate systems) is connected to:

  • Institution-Specific Deep Predictive Flow Models
  • Frontline Apps & Initiatives
  • Robust Testing and Measurement

Matthew then talked about the Civitas Learning Platform components (not exactly a sales pitch, but not too far off).

Prediction

  • UA Historical Overall Fall to Fall Retention Rate = 87%
  • FTIC FT Freshman Historical Average First-Year Retention Rate = 80%
  • Prediction for Fall 2015 Cohort = 80%
  • n=6,970 students

Data set used for modeling:

  • Train:  Fall 2012 to Fall 2013
  • Test:  Fall 2013 to Fall 2014

Model accuracy

  • AUC 0.844
  • 90% accuracy
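A minimal sketch of how a model like this might be trained and scored, mirroring the cohort-over-cohort train/test split above. The feature names, file names, and the gradient-boosting choice are illustrative assumptions; the actual Civitas model inputs were not described.

```python
# Minimal sketch of the cohort-over-cohort train/test setup in the notes.
# Feature names, file names, and the gradient-boosting model are assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

FEATURES = ["hs_gpa", "sat_math", "credits_attempted", "lms_logins_first_2wks"]

# Train on one cohort (Fall 2012 -> Fall 2013 outcome) and test on the next
# (Fall 2013 -> Fall 2014), as in the data-set bullets above.
train = pd.read_csv("cohort_fall2012.csv")  # 'retained' = came back next fall
test = pd.read_csv("cohort_fall2013.csv")

model = GradientBoostingClassifier().fit(train[FEATURES], train["retained"])
probs = model.predict_proba(test[FEATURES])[:, 1]

print("AUC:", roc_auc_score(test["retained"], probs))               # notes: 0.844
print("Accuracy:", accuracy_score(test["retained"], probs >= 0.5))  # notes: ~90%
```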

Discoveries

For FTFT Freshmen in their first term, students with SAT Math >550 persist at a rate +1- percentage points higher than students with SAT Math <550.

For FTFT Freshmen in their first term, an LMS course grade below 75% on day 14 is associated with lower persistence than a grade above 75%.

For transfer students overall, following course pathways traveled by students who graduated is beneficial for persistence.

For FTIC students who deviate significantly from the course pathway, the effect on persistence can be strongly negative.

Angela mentioned how useful the toolset is for seeing the list of students that make up any active filter segment within the tool, and digging deeper into their activity to extract additional actionable insights.


Predictive Learning Analytics: Fueling Actionable Intelligence

Presenters:

  • Josh Baron, Assistant Vice President, Information Technology for Digital Education, Marist College
  • Shady Shehata, Principal Data Scientist, D2L
  • John Whitmer, Director for Analytics and Research, Blackboard Inc.

This is all about an ECAR report that was released a few weeks ago. The report was compiled over about a year and had a large number of contributors. It's a great example of collaboration across for-profit, non-profit, and scientific organizations.

39% is the Key Number

  • In the US, only this percentage completes a 4-year program.
  • This is a national challenge, because the US has slipped from #1 or #2 in the world in completion to #12
  • How Can Predictive Learning Analytics Help? The ability to predict the future with a reasonable level of accuracy gives you the ability to intervene on behalf of the student

What is Predictive Learning Analytics?

  • The statistical analysis of historical and current data derived from the learning process to create models that allow for predictions that improve learning outcomes
  • Subset of larger learning analytics field
  • Uses sophisticated mathematical models rather than user-defined rules. Example: Academic Early Alert Systems (see the sketch after this list)
  • OAAI:  the Open Academic Analytics Initiative
  • Apereo Learning Analytics Initiative
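To make the rules-vs-models distinction concrete, here is a minimal sketch; the thresholds, feature names, and model choice are hypothetical and don't reflect any specific product.

```python
# Minimal sketch contrasting a user-defined rule with a model-based alert.
# Thresholds, feature names, and the model choice are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

RISK_FEATURES = ["logins_last_week", "current_grade", "assignments_submitted"]

def rule_based_alert(student: pd.Series) -> bool:
    """Traditional early alert: fires only when a hand-written rule trips."""
    return student["logins_last_week"] == 0 or student["current_grade"] < 60

def train_predictive_alert(history: pd.DataFrame) -> LogisticRegression:
    """Predictive alert: learns risk patterns from historical outcomes,
    then scores every current student by probability of failing."""
    return LogisticRegression(max_iter=1000).fit(
        history[RISK_FEATURES], history["failed_course"]
    )
```

The rule only catches patterns its author anticipated; the model weighs many weak signals at once, which is the "sophisticated mathematical models" point above.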

Data Sources, Relevance & Diversity

  • LMS:  academic technology’s first killer app
  • What’s been successful is the penetration and usage of LMS
  • What data from conventional data sources are systematic, significant predictors of course success?  High school GPA?  Race/Ethnicity?  First in Family to Attend College?  NONE OF THEM!
  • Academic technology data is a systematic predictor of course success; the caveat is that the academic material must be connected in a deep and meaningful way. Having a number of triggers helps to track actionable details.
  • Embedding Predictive Analytics
  • Strategic Importance of Other Data Points: underrepresented student groups.
  • Conclusion: learning data comes in many flavors and levels of relevance, i.e., activity (behavioral) data and static data (survey data, student aptitude, extra-curricular activities, demographics, and prior educational experience).
  • There are very few institutions that employ full-time Predictive Learning Analytics professionals.

How it Works, What is the Data Impact?

  • Historical data and predictive analytics are used to generate a predictive model
  • We want to tease out and surface those patterns that result in successful outcomes
  • After the first month, the predictive model can provide predictions of final grades based on the number of content views by students in the current course offering (see the sketch after this list)
  • Examples of what can go wrong: what if students are viewing the content from a mobile application? (data is incomplete); what if one of the historical courses has hundreds of course topics while other courses have tens? (data is inaccurate)
  • Garbage In, Garbage Out
  • Data quality:  accurate, complete, unique, timely, consistent, valid, reliable, integrity
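A minimal sketch of the content-views-to-grade idea above. Column names are assumptions; dividing views by each course's topic count is one way to guard against the topic-count mismatch called out as a data-quality risk.

```python
# Minimal sketch: predicting final grades from first-month content views.
# Column names are assumptions; dividing views by the course's topic count
# is one way to guard against the topic-count mismatch noted above.
import pandas as pd
from sklearn.linear_model import LinearRegression

def _view_share(df: pd.DataFrame) -> pd.DataFrame:
    """Normalize raw view counts so 10-topic and 300-topic courses compare."""
    share = df["content_views_first_month"] / df["course_topic_count"]
    return share.to_frame("view_share")

def train_grade_model(history: pd.DataFrame) -> LinearRegression:
    """Fit final_grade ~ normalized first-month content views."""
    return LinearRegression().fit(_view_share(history), history["final_grade"])

def predict_grades(model: LinearRegression, current: pd.DataFrame) -> pd.Series:
    """Score the current course offering after its first month."""
    return pd.Series(model.predict(_view_share(current)), index=current.index)
```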

Brightspace Student Success System

  • Created a predictive model for a course
  • A number of screen shots showed how it was implemented with a group of students, with drilldowns on where students were having difficulty.
  • Good visualizations are critical to easily decoding information and making it useful

Data Strategy

  • ETL/Data Integration > Data Warehouse & MDM > BI & Data Visualization > Predictive Analytics
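A minimal sketch of those four stages, compressed into plain Python for illustration; all file and column names are assumptions, and real implementations use dedicated ETL, warehouse/MDM, and BI platforms for each stage.

```python
# Minimal sketch of the four data-strategy stages, compressed into Python.
# All file and column names are assumptions for illustration.
import pandas as pd

def etl(sources: list[str]) -> pd.DataFrame:
    """Stage 1 (ETL/Data Integration): pull source extracts, standardize columns."""
    frames = [pd.read_csv(path).rename(columns=str.lower) for path in sources]
    return pd.concat(frames, ignore_index=True)

def load_warehouse(df: pd.DataFrame, path: str) -> None:
    """Stage 2 (Data Warehouse & MDM): persist one conformed record set."""
    df.to_parquet(path)

def bi_report(df: pd.DataFrame) -> pd.DataFrame:
    """Stage 3 (BI & Data Visualization): descriptive rollups for dashboards."""
    return df.groupby("term")["retained"].mean().to_frame("retention_rate")

# Stage 4 (Predictive Analytics) trains models on the same warehouse data;
# see the retention-model sketch earlier in these notes.
```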

Strategic Implementation Considerations

  • Institutional Stages of analytics usage
  • Organizational leadership, culture & skills
  • Gaining access to learning data
  • Ethics & privacy

Institutional Stages of Analytics Usage

  • Basic:  past trends & data observations
  • Automated:  automatically perform analytics & provide results directly to end-users
  • Predictive:  large amounts of data are crunched

Silos are antithetical to successful implementation; investing in new skill sets is imperative.

Gaining Access to Learning Data

  • Activity, clickstream data
  • It’s the fuel on which LA runs
  • Extracting sample data sets is often a good start

Ethics & Privacy

  • Ethics:  using LA for good and not evil
  • Privacy:  balance the need to protect confidential records while maximizing the benefits of LA
  • Often requires new policies and procedures
  • LA “task force” to address ethics and privacy issues
  • JISC code of practice
  • SURF Learning Analytics SIG

Q&A

  • How do you best approach the introduction of PA to a group of people who don’t even know what it does?  EDUCAUSE has some great white papers on this (“Penetrating the Fog”).  Give examples of products, strategies and solutions.
  • Have you done anything to look at the performance of blended/flipped classrooms?  We get asked this a lot!  We've looked at some of the open course offerings from MIT and at Blackboard usage patterns.  Many folks are interested in using it for academic course design.