
How to Use the EDUCAUSE CDS to Support Student Success

Presenters

  • Susan Grajek, Vice President, Data, Research, and Analytics, EDUCAUSE
  • Laurie Heacock, National Director of Data, Technology and Analytics, Achieving the Dream, Inc.
  • Louis Kompare, Director, Information Systems and Services, Lorain County Community College
  • Celeste M. Schwartz, VP for IT & IR, Montgomery County Community College

Susan kicked off this session by describing what the CDS is.  It’s been around for over 10 years, includes data from over 800 institutions and allows members to use it to:

  • Study their IT org
  • Benchmark against past performance
  • Look at trends over time
  • Start gathering and using metrics
  • Have data available “just in case”

TOP IT ISSUE #4

Improve Student Outcomes Through an Institutional Approach that Strategically Leverages Technology.  Data shared today come from module 3 of the CDS.

Student Success Technologies Maturity Index

These 6 measurements are set by subject matter experts and are measured on a 5-point radar scale:

  1. Leadership and governance
  2. Collaboration and involvement
  3. Advising and student support
  4. Process and policy
  5. Information systems
  6. Student Success analytics

Maturity Index

  1. Weak
  2. Emerging
  3. Developing
  4. Strong
  5. Excellent

Deployment Index

  1. No deployment
  2. Expected deployment
  3. Initial deployment
  4. Targeted deployment
  5. Institution-wide deployment

Goal

Provide higher ed institutions with a reliable, affordable, and useful set of tools to benchmark and improve the cost and quality of IT services, improving the value and efficiency of IT’s contribution to higher education.

Process

Complete Core Data > order and configure reports > receive and use reports.  It takes between 40 and 70 hours to complete, but data is saved for auto-filling the following year.  This speeds the re-entry process considerably.

You can also use the reports for benchmarking against other institutions.  You can create your own, and some peer groups are pre-provided for you.

Achieving the Dream’s Institutional Capacity Framework

Montgomery County Community College (near Philadelphia), about 13,000 students, participating in CDS for about 13 years.  Celeste then went on…In the past, we used CDS more for the justification of new staff.  We used to look at numbers of computers for students, but we tend to look at those numbers less today.  What’s really helped us recently is how we ask questions about technology.  While you only HAVE to complete module 1, I recommend you dip your toes into some of the other modules.  I’ve used SurveyMonkey to extend my reach and gather additional information from other folks, and then moved it into CDS.  The CDS is really helping to drive our own IT strategic plan.

Lorain County Community College (near Cleveland), about 12,000 students, participating in CDS for 2 years.  Our enrollment is highly tied to local industry; local business cycles make our completion rates look terrible!  CDS is the most valuable way I have to find out about the various elements of IT in the higher ed world.  It really helps to discover the things that change from year to year.

Top 10 IT Issues Sneak Peek

Coming out in January in EDUCAUSE Review.  IT security is the #1 issue.  Three dimensions that will be discussed in the upcoming report:

  • Divest:  change the way you design, deliver and manage IT services.  Eliminate old processes and silos!
  • Reinvest:  to run state-of-the-art technology services, you need to double down on some things, like information security.  Hiring and retaining good talent, along with restructuring that talent to meet the changing needs of delivering IT services.  The ability to change funding models to meet those needs is also important.
  • Differentiate: institutions are now able to apply technology to strategically meet their goals and differentiate themselves from other institutions.  Ability to apply analytics against strategic objectives is hugely valuable to help provide feedback on where we are and what we need to do to improve.

Opening Up Learning Analytics: Addressing a Strategic Imperative

Presenters:

  • Josh Baron, Assistant Vice President, Information Technology for Digital Education, Marist College
  • Lou Harrison, Director of Educational Technology Services, NC State University
  • Donna Petherbridge, Associate Vice Provost, DELTA, NC State University
  • Kenny Wilson, Division Chair-Health Occupation Programs, Jefferson College

This is actually a follow-up to one of my recent posts about a webinar I attended by Unicon on learning analytics.  We have representatives from three different LMSes:  Moodle, Sakai, and Blackboard.  Looks like Lou and Josh from that webinar are here…I’m looking forward to learning more about this effort!  Word of warning:  they moved fast, so I missed some detail, particularly around the workflow and data-heavy slides.  My Student Affairs colleagues will want to tune into the question I asked at the end…

Open Learning Analytics:  Context & Background

OAAI, the Open Academic Analytics Initiative, an EDUCAUSE Next Generation Learning Challenges (NGLC) project.  Funded by the Bill & Melinda Gates Foundation, $250,000 over a 15-month period.  Goal:  leverage big data concepts to create an open-source academic early-alert system and research “scaling factors”

LMS & SIS data is fed into a predictive scoring model, which is then fed into an academic alert report.  From there, an intervention is deployed (“awareness” or Online Academic Support Environment – OASE)

Research design:  rolled out to 2,200 students in 4 institutions:  2 community colleges, and 2 historically black colleges and universities.  More detail on the approach and results here.
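The pipeline described above (LMS and SIS features in, a risk probability out, mapped to an alert an instructor can act on) boils down to a scoring function plus a threshold. A minimal sketch in Python — this is purely illustrative, not the actual OAAI model (which used open-source tools like Weka and Kettle); the feature names, weights, and cutoffs here are invented:

```python
import math

# Invented weights over z-scored features; negative weight means
# "below-average on this feature pushes risk up."
WEIGHTS = {
    "lms_logins_z": -0.8,        # fewer LMS logins than peers -> higher risk
    "assignment_score_z": -1.2,  # lower assignment scores -> higher risk
    "prior_gpa_z": -0.9,         # lower incoming GPA -> higher risk
}
BIAS = -1.0  # baseline log-odds for an average student

def risk_score(features):
    """Logistic risk probability in [0, 1] from z-scored student features."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def alert_level(p, low=0.3, high=0.6):
    """Map a probability to the traffic-light level shown on an alert report."""
    return "red" if p >= high else ("yellow" if p >= low else "green")
```

An intervention (the “awareness” message or OASE referral) would then be keyed off the alert level rather than the raw number.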

Strategic Lessons Learned

Openness will play a critical role in the future of learning analytics.

  • Used all open source tools:  Weka, Kettle, Pentaho, R, Python, etc.
  • Open standards and APIs:  Experience API (xAPI), IMS Caliper/Sensor API
  • Open Models:  predictive models, knowledge maps, PMML, etc.
  • Open Content/Access:  journals, whitepapers, policy documents
  • Openness or Transparency with regard to ethics/privacy
  • NOT anti-commercial, commercial ecosystems help sustain OSS

Software silos limit usefulness

  • Platform approach makes everything more useful

NC State Project

  • Getting everyone moving in the same direction is a challenge.
  • The number one priority we have at NC State is student success, and we know that data is going to help us get there.  However, we have different vendors approaching us independently, each with their own selling points on what they could do to help us.
  • Lunch-and-learn sessions bring people up to speed on what questions to ask and start thinking about who can generate answers.  It took us 10 months to get everyone together.
  • Division of Academic & Student Affairs has purchased EAB; concurrently, we’re working on LAP.  Continued conversations with campus partners will have to happen.

From Proof to Production:  Toward Learning Analytics for the Enterprise

  • Initial steps:  small sample sizes, predictions at 1/4, 1/2, 3/4 points in course, multi-step manual process
  • Goal 1: make it more enterprise-y.  Use large sample sizes (all student enrollments), frequent early runs (maybe daily), automated with no more than 1 click
  • Currently in progress:  rebuild infrastructure for scale; daily snapshots of fall semester data; after fall semester ends look for the sweet spot.
  • Future goals:  refine model even more; segment model by population; balance between models and accuracy; refine and improve models over time; explore ways to track efficacy over time; once we intervene we can never go back to virgin state

Jefferson Project

  • Why is JC seeking LAP implementation?  First time pass rate of Anatomy and Physiology is 54%.  Only 27% re-take.  37% non-persistence rate (DFW).  Need to find ways to help students succeed.
  • How is it going?  We have a 4-year grant.  Compliance letter came in May of 2015.  Implemented the PREP program in October 2015, with LAP roll-out on 10/1/2016 and one year to test.  We use Student Participation System data and feed it into the system.
  • Why use SPS data?  It’s readily available; part of HLC Quality Initiative; less politically charged; shown to correlate with student success; clear map of data schema; data is very robust, more data there than we are presently using; data is “complete” (better than Bb data; less complete than original LAP design).
  • Each instructor will receive an Academic Alert Report.

My question:  have you considered integration of co-curricular data into your models?  YES!  We’re very interested in integration of co-curricular data, because it’s often a better indicator for student success than LMS data.  Vincent Tinto’s research clearly indicates this, but our implementation of this is probably a phase 3 or phase 4 thing.

The Science of Predictive Analytics in Education

Presenters:

  • Patrick J. Bauer, Chief Information Officer, Harper College
  • Scott Feeny, Director of Policy and Research, Independent Colleges of Indiana
  • Vince Kellen, Senior Vice Provost for Analytics & Technologies, University of Kentucky
  • Jon Phillips, Managing Director – Worldwide Education Strategy, Dell Inc.
  • John K. Thompson, GM, Advanced Analytics, Dell Inc.

This session will focus on innovations in using data insights in decision-making.  What are the dos and don’ts that we’ve learned thus far.  We’ll start with stories from each panelist, then go into Q&A.  All material will be made available later (more to come on that).

Background

Patrick

  • William Rainey Harper College:  NW suburb of Chicago, a 2-year institution with 40,000 full-time-equivalent students
  • “Project Discover,” led by Matt McLaughlin.  We got a Title III grant to help do this project.  Includes Inclusion, Engagement, Achievement, Onboarding, Intervening, etc.
  • Data has been collected over 6 years.
  • We originally used a proprietary data warehouse
  • Grad rate increased by 10% in 5 years
  • New reactive programs:  early alert, supplemental instruction, completion concierge, summer bridge.
  • These were REACTIVE programs; we wanted PROACTIVE solutions.

Vince

  • University of KY
  • What have we learned?  We’ve integrated virtually everything we can, and are now moving into personalized learning and messaging.
  • Respect complexity in learning analytics!  I recommend reading “Arrival of the Fittest,” a book by Andreas Wagner.  His research on genomics highlights models that can help our process.  Instructional complexity is at least as complex as genomics.  We don’t have just one paradigm of instructional theory, but dozens.
  • Structure is important:  get the right people on the bus, remove rivalries within your organization, give groups distinct and clear missions, align with organizational strategy.
  • Engage the community:  transparency makes a big difference; democratize analysis; enforce community etiquette, bring in students & faculty researchers; engage the broader higher education community.
  • Use the right tools and techniques:  speed enables fast thinking, fast group decision-making, fast everything; maximum semantic expressiveness and rich detail improves data quality, analytic flexibility; visualization is important.
  • Conclusion:  respect complexity, attend diligently to the very human aspects of this puzzle, ignite the passion of the community, choose and use your tools wisely

Scott

  • I represent the Independent Colleges of IN
  • A statute required student record information needed to be shared back with the state
  • I needed to know how our institutions compared to others
  • We worked with vendor partners (Dell & Statistica) to run descriptive and predictive analytics against the data we had
  • We wanted to do card swipes, meal plans, and more for sub-group comparisons.

John

  • The Statistica product has been made free for higher ed faculty and students
  • I run the Statistica group at Dell
  • We’ve done a lot of work in universities and hospitals
  • We’re moving toward using data for real-time decision-making.  A specific example was given about reduction in surgical infections…pretty powerful stuff.

“Maslow’s Hierarchy of Data Management”

  • The spectrum:  Data Management > Business Intelligence > Analytics
  • The specific levels:  Data Foundation > Basic Reporting > Performance Mgmt > Predictive > Prescriptive

Challenges and Observations

  • Master organizational and technical planning, orchestrating organizational adoption.
  • Bringing in the “executive management hammer” can be useful
  • IR, advisor and counselor pushback, i.e. “you’re coming to take our jobs!”  Dashboards and forms are actually a value-add for these folks that let them do their jobs more effectively.
  • Usability testing and adoption feedback from students were interesting:  “Why do you give us a number?  Why don’t you just give us feedback and actions we can take?”
  • ROL (“Return On Learning”), how can we quantify what you’re seeing?  There is no control group!  Profound payoff is that you’re able to make informed changes to policies that have real impact.
  • Student subgroups with a GPA lower than X (not specified) were much more likely to stop out.  This challenged many people’s beliefs, i.e. “how is this even possible?”
  • University of Iowa cited an avoided cost of $31 million

Next Steps

  • Data sharing with school districts for a full life-cycle on our students as they go through our system
  • Classroom real-time analytics, such as triggers set by faculty
  • Get a handle on what our students do when they leave, i.e. wage data
  • Improving the advising process
  • Sharing findings with our institutions
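The faculty-set triggers mentioned above are essentially threshold rules evaluated against per-student metrics as data arrives. A minimal sketch, with metric names, thresholds, and messages invented for illustration (no actual trigger system was described in the session):

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    metric: str      # e.g. "quiz_avg", "days_since_login" (hypothetical names)
    threshold: float
    direction: str   # "below" or "above"
    message: str     # what the faculty member wants surfaced

def fire_alerts(student_metrics, triggers):
    """Return the messages for every trigger a student's current metrics trip."""
    alerts = []
    for t in triggers:
        value = student_metrics.get(t.metric)
        if value is None:
            continue  # metric not yet collected for this student
        tripped = value < t.threshold if t.direction == "below" else value > t.threshold
        if tripped:
            alerts.append(t.message)
    return alerts
```

In a real-time setting, `fire_alerts` would run whenever a student's metrics update, rather than on a batch schedule.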

A View from the Top: Taking the Mobile Experience to New Heights

Presenters:
  • Hilary J. Baker, Vice President & CIO, California State University, Northridge
  • Santhana Naidu, Associate Vice President of Marketing Communications, Indiana State University
  • Andrew Yu, Founder & CTO, Modo Labs, Inc.

Full disclosure:  CSUN is my home campus, so I have some knowledge of the Modo Labs product…they back our mobile app.

Andrew kicked off the session by talking about Modo Labs.  The whole thing started out of MIT in 2010, but had its beginnings in 2007 in the MIT Mobile Framework (this was back before the Apple iPhone and App Store).  At that time, only about 2% of the web traffic at MIT was from mobile devices.  m.mit.edu took about six months to create.  Modern campus mobile apps must serve multiple constituencies, and serve many purposes…as a result, the mobile app charge at MIT required leadership!

Why WAS Mobile Important?

Hilary Baker

CSUN is a public & highly diverse university community of 41,500 students and 4,200 faculty and staff in northwest Los Angeles.  We came late to the mobile party.  Campus priorities:  student success and exemplary service.  We needed to develop and launch a CSUN app fast, and with Modo Labs, we were able to launch in just 10 weeks!

  • We used web services to reach into our PS instance to enable add/drop capability into the mobile app
  • Launched our app a few days before Fall semester 2013.  Our download profile was 6,000 1st week, 9,000 2nd week, 17,000 3rd week
  • Next step for us was to enable pay via mobile app and CashNet
  • We’ve since added lots of additional features, including outdoor mapping and wayfinding, dining, campus tours, indoor floor plans, campus shuttle & transportation services
  • Marketing was important, too.  We printed full-color postcards and distributed these campus-wide.  We also featured the mobile app at our new student orientation.
  • To date downloads:  34K Apple, 9K Android
  • Most used features at beginning of term:  schedule, campus map, class search, add/drop
  • Most used features near end of term:  Transit, dining

Santhana Naidu

  • ISU is celebrating its 150th anniversary
  • Located in Terre Haute
  • 13,500 students

Our mobile journey started about 6 years ago.  Leadership realized the importance of having a mobile app.  It was an internal project, driven by IT and the marketing team.  Unfortunately, students didn’t like the app, so we went back to the drawing board.  In 2012 we re-launched our app with Modo Labs.  Some highlights:

  • 22K downloads to date
  • Classes is by far our most popular module (Blackboard, Banner, Catalog)
  • About 75% of users are iOS, 25% Android

Recruitment is my office’s top priority

  • Growth in mobile usage among students: 20% of overall website traffic from mobile
  • 90% of incoming students carry smartphones
  • 40% of admissions traffic is from mobile devices.
  • Campus life content is very popular; students use this information when making a decision to come to campus
  • IT – MarCom Partnership was a major key to our effort’s success:  shared governance
  • Modo helped us with design, programming, app launch, etc.
  • Content entry through the admin console

How Do You Keep a Mobile App Fresh & Engaging?

  • Communication (messaging) – examples provided from the University of Massachusetts, Del Mar, Georgetown, the College of William & Mary, Massachusetts General Hospital, and Notre Dame
  • Use of Publisher module for content management that can be handled by anyone…no coding skills required
  • Personalization with locations and roles
  • iBeacons can be used to highlight items you want people to know about, especially useful for tours and points of interest
  • Geofencing
  • QR codes can also be used to drive people directly to map locations
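Under the hood, geofencing like the feature above usually reduces to a point-in-radius test against known coordinates. A minimal sketch, assuming invented campus points of interest (the actual Modo Labs implementation wasn't described):

```python
import math

# Hypothetical campus points of interest; names and coordinates are invented.
POIS = {
    "library": (34.2400, -118.5290),
    "student_union": (34.2395, -118.5280),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_pois(lat, lon, radius_m=100.0):
    """Names of points of interest within radius_m of the user's position."""
    return [name for name, (plat, plon) in POIS.items()
            if haversine_m(lat, lon, plat, plon) <= radius_m]
```

An app would call `nearby_pois` on a location update and surface content (a tour stop, a promotion) for whatever it returns; iBeacons do the same job indoors where GPS is unreliable.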

Why IS Mobile Important?

Hilary Baker

  • PeopleSoft student services (GreyHeller/Modo Labs).  Includes Financial Aid awards, emergency contact info.
  • Parking space availability
  • Indoor-outdoor maps transitioning.  We want to make this feature more seamless.
  • Matador patrol safety – coming in early Spring 2016.  Talked about the CSUN appjam event and shared a video of one of the winning entries in this event.

Santhana Naidu

  • Audience-based content.  Ability to group icons by audience is important for us.
  • Enhancing the campus visit and tour experience using iBeacons.  This really helps our yield activities.
  • Messaging.  We want to be able to target messages by categories and groups; customize how users receive messages; an in-app message center so users can refer back to messages; an easy-to-use backend interface for the admin; the ability for users to opt in or out of certain optional message types; promotions.

Questions

  • Does it integrate with other apps?  Naidu:  yes, we’re using it for maps and tours.  Hilary:  yes, we link out to other CSUN mobile web sites like the Rec Center, Public Safety.  Andrew:  we can work
  • Does your team do the work?  Naidu:  Modo Labs does the heavy lifting for us.
  • What about using the app for faculty and staff?  Hilary:  yes, we have versions for faculty and staff.  Alumni also have a view.  Naidu:  faculty use the Blackboard module, but we haven’t gone much further than that yet.

Visit Modo Labs at Booth #1930; other campuses will also be here at EDUCAUSE giving presentations about their mobile experiences.

  • Del Mar College
  • Dominican University
  • Notre Dame
  • George Washington University

The Cascade Effect: How Small Wins Can Transform Your Organization

Presenter:  Author Daniel Pink, @danielpink

EDUCAUSE tends to pick well-known and sometimes controversial people for their keynote addresses, and this year is no exception.  You may not know who Daniel Pink is, but you probably know something about his work.  Quick aside: a few years ago I picked up “Drive” on my Kindle.  Unfortunately, I only got about halfway through it…I guess I don’t exhibit enough of the book’s title (you can groan now).  Anyway, you may know Daniel’s work from this animation of his TED talk:  https://www.youtube.com/watch?v=u6XAPnuFjJc  I distributed this video to my colleagues in Student Affairs leadership something like five or six years ago…the message is as relevant now as it was then.

After a few short anecdotes, Daniel dug into the core of his keynote, which was largely a recapitulation of the video at the YouTube link above.

Contingent Rewards

  • Aka “if-then” rewards work well for simple tasks over the short-term; they’re algorithmic.  However, they’re not so great for complex and long-term tasks.
  • Once a task calls for “even rudimentary cognitive skill,” a larger reward leads to poorer performance.
  • Social scientists have known this for a while, but organizations have been slow to pick this up.

Fact:  Money is a Motivator

  • BUT…there are nuances to its use as a motivator.
  • Salaries have to be fair, i.e. equal pay for similar work and effort.
  • Why?  People are highly attuned to the laws of fairness.
  • Pay people enough so that money is no longer an issue.

3 Key Motivators

A Gallup poll on employee engagement for 2013 and 2014 indicates that close to 7 out of 10 employees in the US are not engaged with their work.  That’s a lot of disengagement!  How to fix?  Through self-direction!  What are the 3 key motivators?   (Incidentally, these motivators are written on the whiteboard in my office at CSUN)

  1. Autonomy.  Management as a “technology” is designed to enforce compliance, which is often at odds with complex work.  This is particularly true in IT.  When employees have some control over their Time, Technique, Team and Task, you get much better results and have a better likelihood of attracting and retaining talent.  Some examples were provided about carving out time to give people “islands of autonomy.”  The Nobel Prize in Physics in 2010 was awarded to Konstantin Novoselov & Andre Geim for their research on graphene.  This came partially due to the fact that they had “Friday Evening Experiments”: self-directed, unfunded work done for 2-3 hours on Friday evenings.  Do you get enough autonomy in your work?  You can give yourself an autonomy audit right here (thanks Dan, I needed that!):  http://www.danpink.com/audit
  2. Mastery.  Making progress in meaningful work is the single biggest day-to-day motivator.  This is intuitive at a personal level, but in an organizational setting it depends on getting meaningful feedback about how you’re doing.  Unfortunately, most workplaces are “feedback deserts.”  Annual performance reviews are kind of ridiculous when our younger staff are used to immediate feedback.  Why is this?  They have a literal lifetime of instant feedback via games, text messages, and Google searches.  This is why many large organizations like GE, Adobe, Accenture and more are getting rid of annual performance interviews.  Instead, they’re doing weekly one-on-ones…with a twist on the fourth week.  Month one:  on weeks one, two and three ask:  what are you working on and what do you need?  On the fourth week, ask what do you love and loathe about your job?  Month two:  on weeks one, two and three ask what are you working on and what do you need?  On the fourth week, ask how you can remove barriers.  Month three:  on weeks one, two and three ask what are you working on and what do you need?  On the fourth week, talk about long-term career goals.  And so on…mix up the fourth week.
  3. Purpose.  If people can see the value and contribution that their work has, then product quality and employee satisfaction improve.  It helps to have Purpose with a large “P” and purpose with a small “p.”  In this case, a large P = transcendent goals, a small p = day-to-day personal contributions.  As a leader, you have to give not just the HOW, but the WHY of what needs to be done.

Homework for Attendees

Next week, have 2 fewer conversations about “how” and 2 more about “why.”