
Opening Up Learning Analytics: Addressing a Strategic Imperative


  • Josh Baron, Assistant Vice President, Information Technology for Digital Education, Marist College
  • Lou Harrison, Director of Educational Technology Services, NC State University
  • Donna Petherbridge, Associate Vice Provost, DELTA, NC State University
  • Kenny Wilson, Division Chair-Health Occupation Programs, Jefferson College

This is a follow-up to one of my recent posts about a Unicon webinar on learning analytics that I attended.  We have representatives from three different LMSes:  Moodle, Sakai, and Blackboard.  Looks like Lou and Josh from that webinar are here…I’m looking forward to learning more about this effort!  Word of warning:  they moved fast, so I missed some details, particularly around the workflow and data-heavy slides.  My Student Affairs colleagues will want to tune into the question I asked at the end…

Open Learning Analytics:  Context & Background

OAAI, the Open Academic Analytics Initiative:  an EDUCAUSE Next Generation Learning Challenges (NGLC) project.  Funded by the Bill & Melinda Gates Foundation:  $250,000 over a 15-month period.  Goal:  leverage big-data concepts to create an open-source academic early-alert system and research “scaling factors”

LMS and SIS data are fed into a predictive scoring model, which in turn feeds an academic alert report.  From there, an intervention is deployed (“awareness” or the Online Academic Support Environment – OASE)
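To make the pipeline concrete, here is a minimal sketch of that flow in Python. Everything here is invented for illustration — the feature names, weights, and risk threshold are hypothetical; an actual deployment would train the model on historical data with tools like the ones listed below (Weka, R).

```python
# Hypothetical sketch: LMS + SIS features -> predictive score -> alert report.
import math

# Illustrative weights for a logistic model (NOT the OAAI model).
WEIGHTS = {"logins_per_week": -0.35, "gpa": -0.9, "missing_assignments": 0.6}
BIAS = 1.2
RISK_THRESHOLD = 0.5  # scores above this trigger an alert

def risk_score(student):
    """Logistic score in [0, 1]: higher means more at risk."""
    z = BIAS + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def alert_report(roster):
    """Return (student id, score) pairs exceeding the risk threshold."""
    return [(s["id"], round(risk_score(s), 2))
            for s in roster if risk_score(s) > RISK_THRESHOLD]

roster = [
    {"id": "s1", "logins_per_week": 1, "gpa": 2.1, "missing_assignments": 4},
    {"id": "s2", "logins_per_week": 6, "gpa": 3.6, "missing_assignments": 0},
]
print(alert_report(roster))  # only the at-risk student appears
```

The report would then drive the intervention step (the “awareness” message or OASE referral).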

Research design:  rolled out to 2,200 students at 4 institutions:  2 community colleges and 2 historically black colleges and universities.  More detail on the approach and results here.

Strategic Lessons Learned

Openness will play a critical role in the future of learning analytics.

  • Used all open-source tools:  Weka, Kettle, Pentaho, R, Python, etc.
  • Open standards and APIs:  Experience API (xAPI), IMS Caliper/Sensor API
  • Open Models:  predictive models, knowledge maps, PMML, etc.
  • Open Content/Access:  journals, whitepapers, policy documents
  • Openness or Transparency with regard to ethics/privacy
  • NOT anti-commercial; commercial ecosystems help sustain OSS
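As an example of what those open standards buy you, here is a minimal xAPI (Experience API) statement — the actor/verb/object structure that lets activity data move between platforms and into an analytics pipeline. The student, email, and course URLs are hypothetical; the verb identifier is a standard ADL verb.

```python
# Minimal xAPI statement: actor / verb / object. Names and object IDs
# here are made-up examples, not real endpoints.
import json

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.edu/courses/bio101/quiz1",
        "definition": {"name": {"en-US": "BIO 101 Quiz 1"}},
    },
}

# A Learning Record Store (LRS) would receive this as JSON over HTTP.
print(json.dumps(statement, indent=2))
```

Because the format is an open standard, any LMS that emits statements like this can feed the same early-alert model — which is exactly the “platform approach” argument below.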

Software silos limit usefulness

  • Platform approach makes everything more useful

NC State Project

  • Getting everyone moving in the same direction is a challenge.
  • The number one priority we have at NC State is student success, and we know that data is going to help us get there.  However, we have different vendors approaching us independently, each with their own selling points on how they could help us.
  • Lunch-and-learn sessions bring people up to speed on what questions to ask and start them thinking about who can generate answers.  It took us 10 months to get everyone together.
  • Division of Academic & Student Affairs has purchased EAB; concurrently, we’re working on LAP.  Continued conversations with campus partners will have to happen.

From Proof to Production:  Toward Learning Analytics for the Enterprise

  • Initial steps:  small sample sizes, predictions at the 1/4, 1/2, and 3/4 points in a course, multi-step manual process
  • Goal 1: make it more enterprise-y.  Use large sample sizes (all student enrollments), frequent early runs (maybe daily), and automation requiring no more than one click
  • Currently in progress:  rebuild infrastructure for scale; take daily snapshots of fall semester data; after the fall semester ends, look for the sweet spot
  • Future goals:  refine the model even further; segment the model by population; balance model complexity against accuracy; refine and improve models over time; explore ways to track efficacy over time; once we intervene, we can never go back to a virgin (pre-intervention) state

Jefferson Project

  • Why is JC seeking a LAP implementation?  The first-time pass rate for Anatomy and Physiology is 54%.  Only 27% re-take it.  37% non-persistence rate (DFW).  We need to find ways to help students succeed.
  • How is it going?  We have a 4-year grant.  The compliance letter came in May 2015.  We implemented the PREP program in October 2015, with LAP roll-out on 10/1/2016 and one year to test.  We use Student Participation System (SPS) data and feed it into the system.
  • Why use SPS data?  It’s readily available; it’s part of the HLC Quality Initiative; it’s less politically charged; it has been shown to correlate with student success; there’s a clear map of the data schema; the data is very robust, with more there than we are presently using; and the data is “complete” (better than Bb data, though less complete than the original LAP design).
  • Each instructor will receive an Academic Alert Report.

My question:  have you considered integrating co-curricular data into your models?  YES!  We’re very interested in integrating co-curricular data, because it’s often a better indicator of student success than LMS data.  Vincent Tinto’s research clearly indicates this, but our implementation is probably a phase 3 or phase 4 thing.

By Paul Schantz

CSUN Director of Web & Technology Services, Student Affairs. husband, father, gamer, part time aviator, fitness enthusiast, Apple fan, and iguana wrangler.
