Who Is Doing Our Data Laundry?

Presenter:

  • Brad Wheeler, Ph.D., IU VP for IT and CIO; Professor of Information Systems, Kelley School of Business

The world is deluged with data, but you may be asking yourself: what should I do with it all? If the data don’t inform decisions that advance the goals we’re pursuing, what good are they? Are you trying to

  • Rapidly remediate info reporting?
  • Enable better financial decisions?
  • Accelerate student success goals?
  • Empower advisors?
  • Benchmark yourself?

The act of working on data to get what you want is a bit like doing laundry:

  1. You put in capital
  2. You put in labor
  3. You add consumables

…and from this, we expect clean, organized clothes.

By “data laundry,” I’m referring to the legitimate process of transforming and repurposing abundant data into timely, insightful, and relevant information for another context. It is a mostly unseen, antecedent process that unlocks data’s value and insights for the needs of decision makers.

Our institutions are often quite data-rich and insight-poor.

Two distinct phases to doing data laundry

  1. Data cleaning
  2. Presenting data as information in a context in which it can be used

Data Cleaning

Discovering > Extracting > Re-coding > Uploading > Normalizing

Information Presentation

Enriching > Comparing > Presenting (this is the “Magic Bundle”)
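As a rough illustration, the two phases could be sketched as a small script. All field names, code mappings, and figures below are hypothetical examples, not IU’s actual pipeline.

```python
# A minimal sketch of the two "data laundry" phases described above.
# The recode table and record fields are invented for illustration.

RECODE = {"M": "Male", "F": "Female", "U": "Unknown"}  # hypothetical source codes

def clean(records):
    """Phase 1: extract, re-code, and normalize raw rows."""
    cleaned = []
    for row in records:
        cleaned.append({
            "student_id": str(row["id"]).strip(),
            "gender": RECODE.get(row.get("sex", "U"), "Unknown"),
            "gpa": round(float(row["gpa"]), 2),  # normalize to 2 decimals
        })
    return cleaned

def present(cleaned, benchmark_gpa):
    """Phase 2: enrich with a benchmark comparison and present as information."""
    avg = sum(r["gpa"] for r in cleaned) / len(cleaned)
    return {
        "average_gpa": round(avg, 2),
        "vs_benchmark": round(avg - benchmark_gpa, 2),
    }

raw = [{"id": 101, "sex": "F", "gpa": "3.456"}, {"id": 102, "sex": "M", "gpa": "2.9"}]
report = present(clean(raw), benchmark_gpa=3.0)
```

The point of the sketch is the shape, not the details: the cleaning phase is invisible to the decision maker, but the “information” in the final report is only as good as the recoding and normalization that came before it.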

Insource or Outsource

You can buy the equipment and do the work yourself, or go to the dry cleaners. Even if you go to the dry cleaners, you still have work to do… If you go to a vendor, which is common in higher ed, you’re still going to have a significant amount of work. Companies like Apple, Google, and Tesla have chosen to do a lot of insourced work.

IU’s Data Laundromat

IBM did an assessment of our organization and they told us that a) we had a lot of data, b) our data was not in the most usable format and c) we were lacking in ability to perform effective analysis.

Decision Support Initiative (2015)

  1. Enable report and dashboard “discovery” via search
  2. Create a factory for Decision Support Initiatives
  3. Agile Methodology (then run, run, run!)

The initiative goal: Improve decision making at all levels of IU by dramatically enhancing the availability of timely, relevant, and accurate info to support decision makers.

It will:

  • Affect people and orgs
  • Affect data and technology
  • Improve decision outcomes

Will clean data lead to good decisions?

Maybe, maybe not…

Caution

From Ackoff’s Management MISinformation Systems, Management Science, written in 1967:

  1. In many cases, decision makers suffer from an overload of irrelevant information more than a lack of relevant information for a decision.
  2. In many cases, decision makers often request vastly more info than needed to optimally make a decision.
  3. In many cases, decision makers do not have refined models of the relevant information that best predict desired outcomes.

What’s up with YOUR data laundry? (Q&A)

How important is data governance? Boiled down:

  1. Who has input rights? This should be broad.
  2. Who has decision rights? This should be narrow.

At IU, the data belongs to the trustees. Within compliance with laws (FERPA, HIPAA, etc.) and policy, it can be made available to the appropriate folks.

Initiative Impact Analysis to Prioritize Action and Resource Allocation

Presenters

  • Virginia Fraire, VP of Student Success, Austin Community College District
  • Laura Malcom, VP of Product, Civitas Learning Inc.
  • Angela Baldasare, Asst. Provost, Institutional Research, The University of Arizona
  • partnerships@civitaslearning.com
  • civitaslearning.com

University of Arizona

  • Goal: improve 1st year retention rate from 81% to 91% by 2025
  • How do we find and integrate good data to make good decisions that help our students?
  • When I came on board, I found that we had never had a centralized student academic support office
  • SALT office (Strategic Alternative Learning Techniques) – used to support students with learning disabilities. How can we adopt and adapt some of the techniques that worked there?
  • We were using siloed participant data that was not very helpful. It was not transformative and it didn’t tell us much.
  • We came to Civitas for help.
  • In 2009, U of A opened the doors to the “Think Tank” to streamline and centralize a number of academic support services offered by nationally certified tutors; its mission is to empower UA students by providing a positive environment where they can master the skills needed to become successful lifelong learners.
  • In one year, nearly 11,000 students made more than 70,000 visits and spent 85,000+ hours with support staff.

Think Tank Impact

  • Illume Impact used PPSM to measure a 2.7% (pp) overall lift in persistence for students using the writing center
  • 3.4% (pp) increase for 1st year students
  • Fewer than 10% of 1st year students take advantage of this service!
  • These results will inform strategic campaigns to offer Think Tank services to students as part of first-year experience.
  • 8.2% persistence increase for students who were most at risk
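To make the matched-comparison idea behind these percentage-point lifts concrete, here is a minimal sketch of propensity-style matching. The scoring fields and numbers are invented for illustration; this is not Civitas’s actual PPSM implementation, just the general shape of the technique.

```python
# Sketch of a matched comparison: pair each treated student (used the service)
# with the untreated student whose predicted persistence score is closest,
# then compare actual persistence rates. All data below is hypothetical.

def match_and_lift(treated, control):
    """Greedy nearest-score matching; returns lift in percentage points."""
    used = set()
    pairs = []
    for t in treated:
        best, best_gap = None, None
        for i, c in enumerate(control):
            if i in used:
                continue
            gap = abs(t["score"] - c["score"])
            if best_gap is None or gap < best_gap:
                best, best_gap = i, gap
        used.add(best)
        pairs.append((t, control[best]))
    t_rate = sum(t["persisted"] for t, _ in pairs) / len(pairs)
    c_rate = sum(c["persisted"] for _, c in pairs) / len(pairs)
    return round((t_rate - c_rate) * 100, 1)

treated = [{"score": 0.80, "persisted": 1}, {"score": 0.60, "persisted": 1}]
control = [{"score": 0.81, "persisted": 1}, {"score": 0.59, "persisted": 0},
           {"score": 0.20, "persisted": 0}]
lift_pp = match_and_lift(treated, control)
```

The matching step is what lets the comparison survive faculty scrutiny: rather than comparing service users against all non-users, it compares them against non-users who looked equally likely to persist beforehand.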

Taking Initiative With Confidence

  • Sharing impact findings with academic colleges to discuss the need for increased referrals to Think Tank.
  • PPSM has changed the conversation with faculty who want rigorous data.
  • Bolstering the credibility and validity of Think Tank services.

Austin Community College

Highland Campus is home to the ACCelerator, one of the largest high-tech learning environments in the country.

“The ACCelerator”

  • Provides access to 600+ desktop computer stations spread over 32,000 square feet, surrounded by classrooms and study rooms.
  • Offers redesigned DevEd math courses powered by learning software, with onsite faculty members, tutors, and academic coaches to increase personalization and engagement
  • Additional support services are offered, including non-math tutoring, advising, financial aid, supplemental instruction, and peer tutoring.
  • During the 2015-16 year, the ACCelerator served over 13,000 unique students in well over 170,000 interactions.

ACCelerator Impact

  • Students who visit the lab at least once each term persist at a higher rate.
  • 4x persistence impact found for DevEd students.
  • Part-time DevEd students and DevEd students with the lowest persistence predictions had even better outcomes.
  • 6.15% increase in persistence for students visiting the learning lab.
  • Results are informing strategic decisions about creating similar learning spaces at other campuses.
  • Impact results have helped validate ACC data and in-house analyses
  • Discussions with math faculty continue to strengthen the developmental math redesign
  • Persistence results leading to further investigation of other metrics related to accelerated learning, particularly for DevEd students.
  • For this kind of approach to work, silos need to be broken down.


The 2015 EDUCAUSE MEGA POST

Hello, friends!

As part of my normal conference coverage, I publish a post of posts, which I call a “MEGA POST.”  It’s my attempt to capture all the different sessions I attended.  The 2015 EDUCAUSE annual conference in Indianapolis had so many great sessions, I often found it difficult to pick one over another.  The increased use of data to drive campus decision-making was a hot topic at the conference this year.

I do my best to capture the content of every session, but I am human…any errata, misstatements or omissions are totally mine.  I hope you find some benefit from my conference experience.  Enjoy!

Tuesday, October 27

  1. EDUCAUSE 2015!
  2. Building an Emerging Technology and Futures Capacity in Your Organization
  3. Cloud 101:  Tools and Strategies for Evaluating Cloud Services

Wednesday, October 28

  1. KEYNOTE:  The Cascade Effect:  How Small Wins Can Transform Your Organization
  2. A View from the Top: Taking the Mobile Experience to New Heights
  3. The Science of Predictive Analytics in Education
  4. Opening Up Learning Analytics:  Addressing a Strategic Imperative

Thursday, October 29

  1. The 2015 Campus Computing Survey
  2. Web Portals
  3. KEYNOTE:  The Second Machine Age:  Work, Progress, and Prosperity in a Time of Brilliant Technologies
  4. Optimizing Business Intelligence at Lehman College/CUNY:  A Road to Change
  5. Predictive Learning Analytics:  Fueling Actionable Intelligence
  6. Unifying Data Systems to Turn Insights into Student Success Interventions

Friday, October 30

  1. How to Use the EDUCAUSE CDS to support Student Success
  2. Progress on Using Adaptive Learning Technology for Student College Success
  3. KEYNOTE:  If You Build It:  The Power of Design to Change the World

Progress on Using Adaptive Learning Technology for Student College Success

Presenters:

  • Yvonne M. Belanger, Senior Program Officer, Bill and Melinda Gates Foundation
  • Jo Jorgenson, Dean of Instruction & Community Development, Rio Salado College; ALMAP Grantee
  • Douglas Walcerz, VP Planning Research and Assessment, Essex County College; ALMAP Grantee
  • Louise Yarnall, Senior Researcher, SRI International; ALMAP Portfolio Evaluator

This is the last concurrent session of the conference.  Most of this presentation will be about specific implementations of adaptive learning at a couple institutions.

Adaptive Learning Market Acceleration Grant Program (ALMAP)

  • 14 grants
  • 17 colleges
  • 9 adaptive learning platforms
  • 22 courses
  • 44% average % of Pell eligible students at grantees
  • 21,644 total students enrolled across 3 terms
  • 699 instructors

Adaptive Tech personalizes instruction and learning.  Courseware provides customized feedback to students on learning gaps.  Courseware tracks progress for instructor support.

ALMAP vision and goals

Expand and build understanding of how US higher ed programs are using adaptive courseware to support student success and completion.

ALMAP Evaluation Portfolio

  • 14 grantees conducted QED student impact evaluations.  Collected instructor/student survey data and cost data over 3 academic terms (Summer 2013 – Winter 2015).
  • Grantee studies featured 3 different types of comparisons: lecture vs. blended adaptive; online vs. online adaptive; blended vs. blended adaptive
  • Evaluator checked rigor of local designs, extracted insights across portfolio.

What Did You Do and Why?

Essex County College, with 12,000 students.  The math sequence is the biggest barrier to success.

  • How did the adaptive courseware meet your expectations?  In the adaptive classes, students use labs with adaptive courseware, and we ask the students to set goals for the things they want to master.  Invariably, the goals students set for themselves are higher than what they actually achieve.  This is something that we then work with them on.
  • The software worked perfectly for us and did exactly what we expected of it.  However, it took our instructors about 2 semesters to become fluent with the adaptive software.

Rio Salado, with 60,000 students.

  • Our courses were fully online, using Pearson’s product.  We looked at student learning outcomes, faculty/staff feedback, and cost analysis.  What we’ve seen in the past is that our students tend to drop out if they were less than successful with their coursework, or if the class was “too slow” for them.
  • We were mostly satisfied with our experiment with adaptive learning.  We had a fluid working relationship with Pearson, and they were amenable to working out difficulties we had with our pilot.  Our writing assessments needed more content for our students’ needs.  While we could pick content from what Pearson had to offer, we could not develop our own.  We had to bring in our own material for the writing assessments to beef up the product.  We videotaped sessions and embedded writing into each lesson to help ensure completion.

Aggregate Evaluation Research Questions

  • What student impacts are noted and in what HE contexts/disciplines?
  • How does using adaptive courseware affect the costs of instruction?
  • How are students and instructors experiencing adaptive courseware?

ALMAP Evidence of Impacts

Significant positive course grade gains were noted when adaptivity was:  part of course redesign (lecture to blended) OR added to online courses BUT NOT when replacing another blended technology.

Product features linked with learning gains: progress dashboards, regular quizzes/feedback; referrals to remedial content and study tips; spaced memorization practice; vendor content (though one product supported memorization of faculty content)

Course disciplines showing more learning gains:  50% of psychology courses; 42% of math courses; 25% of biology courses; 16% of English courses

Instructor Experience:  78% of instructors reported satisfaction; 57% devoted 1-9 hours to courseware training

Student Experience:  most students reported positive learning gains; students reported different levels of engagement.

Courseware Cost Drivers & ROI

  • Courseware based on instructor content had 8% to 19% higher development and training costs.
  • Most cost reductions occurred when adding adaptivity during course redesign, so they cannot be attributed to the courseware alone.

How did you change your use of adaptive learning products over time and why?  What’s next for you?

  • In Essex County, we’re not changing our approach at all.  Students going through the adaptive developmental classes are showing greater signs of success in traditional COLLEGE LEVEL math courses later on, which for us are ONLY delivered in a traditional way.
  • At Rio Salado, we didn’t see much difference, but we’re still following the students who went through online versus adaptive classes.  We’re now doing “student learning cafes,” group sessions where students and faculty can share their experiences with using the adaptive learning material.  Students like the ability to move at their own pace, but faculty want improvements in navigation and assessments.  We have 3 grant opportunities that we’re pursuing to do more.

How to Use the EDUCAUSE CDS to Support Student Success

Presenters

  • Susan Grajek, Vice President, Data, Research, and Analytics, EDUCAUSE
  • Laurie Heacock, National Director of Data, Technology and Analytics, Achieving the Dream, Inc.
  • Louis Kompare, Director, Information Systems and Services, Lorain County Community College
  • Celeste M. Schwartz, VP for IT & IR, Montgomery County Community College

Susan kicked off this session by describing what the CDS is.  It’s been around for over 10 years, includes data from over 800 institutions and allows members to use it to:

  • Study their IT org
  • Benchmark against past performance
  • Look at trends over time
  • Start gathering and using metrics
  • Have data available “just in case”

TOP IT ISSUE #4

Improve Student Outcomes Through an Institutional Approach that Strategically Leverages Technology.  Data shared today come from module 3 of the CDS.

Student Success Technologies Maturity Index

These 6 measurements are set by subject matter experts and are scored on a 5-point radar scale:

  1. Leadership and governance
  2. Collaboration and involvement
  3. Advising and student support
  4. Process and policy
  5. Information systems
  6. Student success analytics

Maturity Index

  1. Weak
  2. Emerging
  3. Developing
  4. Strong
  5. Excellent

Deployment Index

  1. No deployment
  2. Expected deployment
  3. Initial deployment
  4. Targeted deployment
  5. Institution-wide deployment

Goal

Provide higher ed institutions with a reliable, affordable, and useful set of tools to benchmark and improve the cost and quality of IT services, improving the value and efficiency of IT’s contribution to higher education.

Process

Complete Core Data > order and configure reports > receive and use reports.  It takes between 40 and 70 hours to complete, but data is saved for auto-filling the following year.  This speeds the re-entry process considerably.

You can also use the reports for benchmarking against other institutions.  You can create your own, and some peer groups are pre-provided for you.

Achieving the Dream’s Institutional Capacity Framework

Montgomery County Community College (near Philadelphia), about 13,000 students, participating in CDS for about 13 years.  Celeste then went on…In the past, we used CDS more for the justification of new staff.  We used to look at numbers of computers for students, but we tend to look at those numbers less today.  What’s really helped us recently is how we ask questions about technology.  While you only HAVE to complete module 1, I recommend you dip your toes into some of the other modules.  I’ve used SurveyMonkey to extend my reach and gather additional information from other folks, and then moved it into CDS.  The CDS is really helping to drive our own IT strategic plan.

Lorain County Community College (near Cleveland), about 12,000 students, participating in CDS for 2 years.  Our enrollment is highly tied to local industry; local business cycles can make our completion rates look terrible!  CDS is the most valuable way I have to find out about the various elements of IT in the higher ed world.  It really helps to discover the things that change from year to year.

Top 10 IT Issues Sneak Peek

Coming out in January in EDUCAUSE Review.  IT security is the #1 issue.  Three dimensions that will be discussed in the upcoming report:

  • Divest:  change the way you design, deliver and manage IT services.  Eliminate old processes and silos!
  • Reinvest:  to run state-of-the-art technology services, you need to double down on some things, like information security.  Hiring and retaining good talent, along with restructuring that talent to meet the changing needs of delivering IT services.  The ability to change funding models to meet those needs is also important.
  • Differentiate: institutions are now able to apply technology to strategically meet their goals and differentiate themselves from other institutions.  Ability to apply analytics against strategic objectives is hugely valuable to help provide feedback on where we are and what we need to do to improve.