
Data Visualization of Progress Towards Degree and Financial Aid Awareness

Presenters

  • Marc Fox, Senior Director, Enterprise Systems, CSUS
  • Shiva Pillai, Strategic Data Analyst, CSUS

CSU Grad Initiative 2010: Sac State’s “Challenge”

  • Increase grad rates
  • Systemwide grad rate: just over 50% of students graduate in 6 years
  • Sac State grad rate: just over 43% of students graduate in 6 years
  • Goal: increase the 6-year FT graduation rate from 43% to 51%
  • Goal: increase the transfer rate from 63% to 68%

Theme Group & Directives

  • ID students who are making progress
  • Develop new ways to assign registration appointments that support this goal
  • Find ways to recognize and report a student’s progress toward degree

Approach

  • How to measure?
  • Develop a points system based on a student’s progress toward their degree, measuring completion of academic requirements in their academic program
  • Keep it simple, fair, and easy to explain

Points system for measuring progress to degree (points do NOT equal units/credit hours)

  • GE requirements: 40 points (lower-division GE areas: A=7.5, B=7.5, C=7.5, D=7.5, E=3.0; upper-division GE: 7.0)
  • Major requirements: 50 points (declare a major=8, major=42; pre-majors/impacted majors=14, associated majors=28)
  • Additional requirements: 10 points (fewer than 150 units/credits=5, academic good standing=5); see the sketch below
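
As a quick reference, here is a minimal sketch of the 100-point weighting above. The point values come straight from the slides; the data structures and names are hypothetical, not Sac State’s actual implementation.

```python
# Illustrative only: point values from the presentation; names are hypothetical.
GE_POINTS = {"A": 7.5, "B": 7.5, "C": 7.5, "D": 7.5, "E": 3.0, "upper_division": 7.0}  # 40 total
MAJOR_POINTS = {"declare_major": 8.0, "major_requirements": 42.0}                       # 50 total
ADDITIONAL_POINTS = {"under_150_units": 5.0, "academic_good_standing": 5.0}             # 10 total

def total_possible_points() -> float:
    """Sanity check: the three buckets should sum to the full 100-point scale."""
    return sum(GE_POINTS.values()) + sum(MAJOR_POINTS.values()) + sum(ADDITIONAL_POINTS.values())

assert total_possible_points() == 100.0
```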

Q: Did the Academic Senate need to approve this? A: Yes, they did weigh in.

Registrar’s Math

  • The degree audit is driven by requirement groups (RGs)
  • Partial completion of an RG earns partial points (see the proration sketch below)
  • Keep it fair
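
To illustrate “partial completion earns partial points,” here is one way proration could work. The unit-fraction rule shown is an assumption for illustration only; the actual logic lives in the registrar’s degree-audit engine.

```python
# Assumption: partial points are prorated by the fraction of an RG's required units completed.
def prorated_points(rg_points: float, units_completed: float, units_required: float) -> float:
    """Award a share of a requirement group's points proportional to completion."""
    if units_required <= 0:
        return 0.0
    fraction = min(units_completed / units_required, 1.0)
    return rg_points * fraction

# Example: a 7.5-point GE area with 3 of 6 required units done earns 3.75 points.
print(prorated_points(7.5, 3, 6))
```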

And then… why not?

  • Meters are visible to students, advisors, administrators
  • Real time in self-service: progress meters are calculated in real time
  • Snapshot for registration appointments: a snapshot of the progress meters is taken the weekend before registration appointments are assigned

Keys to Degree Toolbox icons/links are located on a static web page that makes access to these tools easy.

Self Service – Real Time Meters

  • Only one CAAR report value is kept in the SAA_ADB_RESULTS table
  • Meters are calculated in real time from the SAA_ADB_RESULTS table (see the sketch below)
  • Transfer credit evaluations, substitutions, and grade changes take effect immediately
  • CAAR reports are batch updated at key points during the term for all undergrads
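
A hedged sketch of how audit results might roll up into the three real-time meters (GE, Major, Additional). The row shape and bucket mapping here are illustrative assumptions, not the actual PeopleSoft SAA_ADB_RESULTS schema.

```python
from collections import defaultdict

# Hypothetical rows standing in for advisement results; field names are placeholders.
audit_rows = [
    {"bucket": "GE", "points_earned": 7.5},
    {"bucket": "GE", "points_earned": 3.0},
    {"bucket": "MAJOR", "points_earned": 8.0},
    {"bucket": "ADDITIONAL", "points_earned": 5.0},
]
METER_MAX = {"GE": 40.0, "MAJOR": 50.0, "ADDITIONAL": 10.0}

totals = defaultdict(float)
for row in audit_rows:
    totals[row["bucket"]] += row["points_earned"]

# Each meter shows percent of that bucket's maximum points earned so far.
meters = {b: round(100 * totals[b] / METER_MAX[b], 1) for b in METER_MAX}
print(meters)  # {'GE': 26.2, 'MAJOR': 16.0, 'ADDITIONAL': 50.0}
```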

Registration

  • Priority groups: veterans, students with disabilities, athletes, honors program
  • Graduating seniors: current semester and the subsequent semester
  • Total # of units earned: Seniors, Grads, Juniors, Sophomores, Freshmen
  • In 2014, seniors were allotted registration appointments based on progress-to-degree reports

Overall: we’re getting our most efficient and productive students out quicker.

Student Response

  • Easy to understand the progress meters
  • Real time data
  • Accurate
  • Earlier submission of substitutions and waivers
  • Prompt submission of e-transcripts or late transcripts
  • Impact of multiple major changes

Batch Advisement Processes

  • ADMIN – CAAR Batch
  • Last weekend of each month
  • Freshmen, Sophomores, and Juniors
  • Seniors run separately
  • Run time is approximately 36 hours
  • Refresh tables for accurate reporting

Reporting – Query/Data Warehouse

  • Student progress by dept/plan/class level
  • Combine progress-to-degree data with any other tables (see the sketch below)
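
For example, a hedged reporting sketch that joins a progress-to-degree snapshot with a plan/department extract to summarize progress by department, plan, and class level. The file names and columns are hypothetical placeholders, not the actual warehouse schema.

```python
import pandas as pd

# Hypothetical extracts; column names are placeholders, not the real schema.
progress = pd.read_csv("progress_to_degree_snapshot.csv")  # emplid, points, class_level
plans = pd.read_csv("academic_plans.csv")                  # emplid, department, plan

report = (
    progress.merge(plans, on="emplid")
            .groupby(["department", "plan", "class_level"])["points"]
            .mean()
            .reset_index(name="avg_progress_points")
)
print(report.head())
```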

Other stuff we did

  • We created financial aid (FA) meters for Pell Grants, state grants, and federal loans
  • We also added a what-if progress-to-degree comparison tool
  • Future development: cloud, mods & bolt-ons, new technology?
  • FluidUI
  • iPaaS
  • Tableau/EAB/Civitas/Salesforce

Demo

Categories
Education Technology

Who Is Doing Our Data Laundry?

Presenter:

  • Brad Wheeler, Ph.D., IU VP for IT & CIO; Professor of Information Systems, Kelley School of Business

The world is deluged with data, but you may be asking yourself: what should I do with it? If the data doesn’t inform decisions that advance the goals we’re pursuing, what good is it? Are you trying to

  • Rapidly remediate info reporting?
  • Enable better financial decisions?
  • Accelerate student success goals?
  • Empower advisors?
  • Benchmark yourself?

The act of working on data to get what you want is a bit like doing laundry:

  1. You put in capital
  2. You put in labor
  3. You add consumables

…and from this, we expect clean, organized clothes.

By “data laundry,” I’m referring to the legitimate process of transforming and repurposing abundant data into timely, insightful, and relevant information for another context. It is a mostly unseen, antecedent process that unlocks data’s value and insights for the needs of decision makers.

Our institutions are often quite data-rich and insight-poor.

Two distinct phases to doing data laundry

  1. Data cleaning
  2. Presenting data as information in a context in which it can be used

Data Cleaning

Discovering > Extracting > Re-coding > Uploading > Normalizing
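
A minimal sketch of the re-coding and normalizing steps in that pipeline. The source file, field names, and code mappings are hypothetical, purely for illustration.

```python
import pandas as pd

# Extracting: pull a hypothetical raw source (discovery already done).
raw = pd.read_csv("raw_enrollment_extract.csv")

# Re-coding: map inconsistent source values onto a common vocabulary.
recode_map = {"FR": "Freshman", "SO": "Sophomore", "JR": "Junior", "SR": "Senior"}
raw["class_level"] = raw["class_level"].map(recode_map)

# Normalizing: consistent formats, drop unusable rows.
raw["term"] = raw["term"].astype(str).str.strip().str.upper()
clean = raw.dropna(subset=["emplid", "class_level"])

# Uploading: land the cleaned extract in the warehouse area.
clean.to_parquet("warehouse/enrollment_clean.parquet")
```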

Information Presentation

Enriching > Comparing > Presenting (this is the “Magic Bundle”)

Insource or Outsource

You can buy the equipment and do the work yourself, or go to the dry cleaner. Even if you go to the dry cleaner, you still have work to do… If you go to a vendor, which is common in higher ed, you’re still going to have a significant amount of work. Companies like Apple, Google, and Tesla have chosen to do a lot of insourced work.

IU’s Data Laundromat

IBM did an assessment of our organization and told us that a) we had a lot of data, b) our data was not in the most usable format, and c) we lacked the ability to perform effective analysis.

Decision Support Initiative (2015)

  1. Enable report and dashboard “discovery” via search
  2. Created a factory for Decision Support Initiatives
  3. Agile Methodology (then run, run, run!)

The initiative goal: Improve decision making at all levels of IU by dramatically enhancing the availability of timely, relevant, and accurate info to support decision makers.

It will:

  • Affect people and orgs
  • Affect Data and Technology
  • Improve decision outcomes

Will clean data lead to good decisions?

Maybe, maybe not…

Caution

From Russell Ackoff’s “Management MISinformation Systems” (Management Science, 1967):

  1. In many cases, decision makers suffer from an overload of irrelevant information more than a lack of relevant information for a decision.
  2. In many cases, decision makers request vastly more info than they need to make an optimal decision.
  3. In many cases, decision makers do not have refined models of the relevant information that best predict desired outcomes.

What’s up with YOUR data laundry? (Q&A)

How important is data governance? Boiled down:

  1. Who has input rights? This should be broad.
  2. Who has decision rights? This should be narrow.

At IU, the data belongs to the trustees. Within compliance with laws (FERPA, HIPAA, etc.) and policy, it can be made available to the appropriate folks.


Initiative Impact Analysis to Prioritize Action and Resource Allocation

Presenters

  • Virginia Fraire, VP of Student Success, Austin Community College District
  • Laura Malcom, VP of Product, Civitas Learning Inc.
  • Angela Baldasare, Asst. Provost, Institutional Research, The University of Arizona
  • partnerships@civitaslearning.com
  • civitaslearning.com

University of Arizona

  • Goal: improve 1st year retention rate from 81% to 91% by 2025
  • How do we find and integrate good data to make good decisions that help our students?
  • When I came on board, I found out that we had never had a centralized student academic support office
  • SALT office (Strategic Alternative Learning Techniques) – used to support students with learning disabilities. How can we adopt and adapt some of the techniques that worked there?
  • We were using siloed participant data that was not very helpful. It was not transformative and it didn’t tell us much.
  • We came to Civitas for help.
  • In 2009, U of A opened the doors to the “Think Tank” to streamline and centralize a number of academic support services offered by nationally certified tutors; its mission is to empower UA students by providing a positive environment where they can master the skills needed to become successful lifelong learners.
  • In one year, nearly 11,000 students made more than 70,000 visits and spent 85,000+ hours with support staff.

Think Tank Impact

  • Illume Impact used PPSM (propensity score matching) to measure a 2.7 percentage point (pp) overall lift in persistence for students using the writing center (see the sketch after this list)
  • 3.4 pp increase for first-year students
  • Less than 10% of first-year students are taking advantage of this service!
  • These results will inform strategic campaigns to offer Think Tank services to students as part of first-year experience.
  • 8.2% persistence increase for students who were most at risk
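
Since faculty often ask how these lifts are computed, here is a rough, generic sketch of a propensity-score-matched comparison: model the propensity to use the service, match each user to a similar non-user, and compare persistence rates. This illustrates the general PSM idea only; it is not Civitas’s actual Illume Impact methodology, and the covariates and data file are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical data: one row per student with covariates, a 'treated' flag
# (used the writing center), and a 'persisted' outcome flag.
df = pd.read_csv("students.csv")
X = df[["gpa", "units_attempted", "first_gen"]]

# Model the propensity to use the service.
propensity = LogisticRegression(max_iter=1000).fit(X, df["treated"]).predict_proba(X)[:, 1]
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]

# Match each treated student to the non-user with the closest propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(propensity[(df["treated"] == 0).to_numpy()].reshape(-1, 1))
_, idx = nn.kneighbors(propensity[(df["treated"] == 1).to_numpy()].reshape(-1, 1))
matched_control = control.iloc[idx.ravel()]

# Difference in persistence rates, in percentage points (pp).
lift_pp = (treated["persisted"].mean() - matched_control["persisted"].mean()) * 100
print(f"Estimated persistence lift: {lift_pp:.1f} pp")
```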

Taking Initiative With Confidence

  • Sharing impact findings with academic colleges to discuss the need for increased referrals to Think Tank.
  • PPSM has changed the conversation with faculty who want rigorous data.
  • Bolstering the credibility and validity of Think Tank services.

Austin Community College

Highland Campus is home to the ACCelerator, one of the largest high-tech learning environments in the country.

“The ACCelerator”

  • Provides access to 600+ desktop computer stations spread over 32,000 square feet, surrounded by classrooms and study rooms.
  • Offers redesigned DevEd math courses powered by learning software, with onsite faculty members, tutors, and academic coaches to increase personalization and engagement
  • Additional support services are offered, including non-math tutoring, advising, financial aid, supplemental instruction, and peer tutoring.
  • During the 2015-16 year, the ACCelerator served over 13,000 unique students in well over 170,000 interactions.

Accelerator Impact

  • Students who visit the lab at least once each term persist at a higher rate.
  • 4x persistence impact found for DevEd students.
  • Part-time DevEd students and DevEd students with the lowest persistence predictions had even better outcomes.
  • 6.15% increase in persistence for students visiting the learning lab.
  • Results are informing strategic decisions about creating similar learning spaces at other campuses.
  • Impact results have helped validate ACC data and in-house analyses
  • Discussions with math faculty continue to strengthen the developmental math redesign
  • Persistence results leading to further investigation of other metrics related to accelerated learning, particularly for DevEd students.
  • For this kind of approach to work, silos need to be broken down.

The 2015 EDUCAUSE MEGA POST

Hello, friends!

As part of my normal conference coverage, I publish a post of posts, which I call a “MEGA POST.”  It’s my attempt to capture all the different sessions I attended.  The 2015 EDUCAUSE annual conference in Indianapolis had so many great sessions, I often found it difficult to pick one over another.  The increased use of data to drive campus decision-making was a hot topic at the conference this year.

I do my best to capture the content of every session, but I am human…any errata, misstatements, or omissions are totally mine.  I hope you find some benefit from my conference experience.  Enjoy!

Tuesday, October 27

  1. EDUCAUSE 2015!
  2. Building an Emerging Technology and Futures Capacity in Your Organization
  3. Cloud 101:  Tools and Strategies for Evaluating Cloud Services

Wednesday, October 28

  1. KEYNOTE:  The Cascade Effect:  How Small Wins Can Transform Your Organization
  2. A View from the Top: Taking the Mobile Experience to New Heights
  3. The Science of Predictive Analytics in Education
  4. Opening Up Learning Analytics:  Addressing a Strategic Imperative

Thursday, October 29

  1. The 2015 Campus Computing Survey
  2. Web Portals
  3. KEYNOTE:  The Second Machine Age:  Work, Progress, and Prosperity in a Time of Brilliant Technologies
  4. Optimizing Business Intelligence at Lehman College/CUNY:  A Road to Change
  5. Predictive Learning Analytics:  Fueling Actionable Intelligence
  6. Unifying Data Systems to Turn Insights into Student Success Interventions

Friday, October 30

  1. How to Use the EDUCAUSE CDS to support Student Success
  2. Progress on Using Adaptive Learning Technology for Student College Success
  3. KEYNOTE:  If You Build It:  The Power of Design to Change the World

Progress on Using Adaptive Learning Technology for Student College Success

Presenters:

  • Yvonne M. Belanger, Senior Program Officer, Bill and Melinda Gates Foundation
  • Jo Jorgenson, Dean of Instruction & Community Development, Rio Salado College; ALMAP Grantee
  • Douglas Walcerz, VP Planning Research and Assessment, Essex County College; ALMAP Grantee
  • Louise Yarnall, Senior Researcher, SRI International; ALMAP Portfolio Evaluator

This is the last concurrent session of the conference.  Most of this presentation will be about specific implementations of adaptive learning at a couple institutions.

Adaptive Learning Market Acceleration Grant Program (ALMAP)

  • 14 grants
  • 17 colleges
  • 9 adaptive learning platforms
  • 22 courses
  • 44% average share of Pell-eligible students across grantees
  • 21,644 total students enrolled across 3 terms
  • 699 instructors

Adaptive tech personalizes instruction and learning.  Courseware provides customized feedback to students on learning gaps.  Courseware tracks progress for instructor support.

ALMAP vision and goals

Expand and build understanding of how US higher ed programs are using adaptive courseware to support student success and completion.

ALMAP Evaluation Portfolio

  • 14 grantees conducted QED (quasi-experimental design) student impact evaluations.  They collected instructor/student survey data and cost data over 3 academic terms (Summer 2013 to Winter 2015).
  • Grantee studies featured 3 different types of comparisons:  lecture vs. blended adaptive; online vs online adaptive; blended vs blended adaptive
  • Evaluator checked rigor of local designs, extracted insights across portfolio.

What Did You Do and Why?

Essex County College, 12,000 students.  The math sequence is the biggest barrier to success.

  • How did the adaptive courseware meet your expectations?  In the adaptive classes, students use labs with adaptive courseware, and we ask the students to set goals for the things they want to master.  Invariably, the goals students set for themselves are higher than what they actually achieve.  This is something that we then work with them on.
  • The software worked perfectly for us and did exactly what we expected of it.  However, it took our instructors about two semesters to get fluent with the adaptive software.

Rio Salado, with 60,000 students.

  • Our courses were fully online, using Pearson’s product.  We looked at student learning outcomes, faculty/staff feedback, and cost analysis.  What we’ve seen in the past is that our students tended to drop out if they were less than successful with their coursework, or if the class was “too slow” for them.
  • We were mostly satisfied with our experiment with adaptive learning.  We had a fluid working relationship with Pearson, and they were amenable to working out difficulties we had with our pilot.  Our writing assessments needed more content for our students’ needs.  While we could pick content from what Pearson had to offer, we could not develop our own.  We had to use our own material for the writing assessments to beef up the product.  We videotaped sessions and embedded writing into each lesson to help ensure completion.

Aggregate Evaluation Research Questions

  • What student impacts are noted and in what HE contexts/disciplines?
  • How does using adaptive courseware affect the costs of instruction?
  • How are students and instructors experiencing adaptive courseware?

ALMAP Evidence of Impacts

Significant positive course grade gains were noted when adaptivity was:  part of course redesign (lecture to blended) OR added to online courses BUT NOT when replacing another blended technology.

Product features linked with learning gains: progress dashboards, regular quizzes/feedback; referrals to remedial content and study tips; spaced memorization practice; vendor content (but 1 supported memorization of faculty content)

Course disciplines showing more learning gains:  50% of psychology courses; 42% of math courses; 25% of biology courses; 16% of English courses

Instructor Experience:  78% of instructors reported satisfaction; 57% devoted 1-9 hours to courseware training

Student Experience:  most students reported positive learning gains; students reported different levels of engagement.

Courseware Cost Drivers & ROI

  • Courseware based on instructor content had 8% to 19% higher development and training costs.
  • Most cost reductions occurred when adaptivity was added during course redesign, so they cannot be attributed to the courseware.

How did you change your use of adaptive learning products over time and why?  What’s next for you?

  • In Essex County, we’re not changing our approach at all.  Students going through the adaptive developmental classes are showing greater signs of success in traditional COLLEGE LEVEL math courses later on, which for us are ONLY delivered in a traditional way.
  • At Rio Salado, we didn’t see much difference, but we’re still following the students who went through online versus adaptive classes.  We’re now doing “student learning cafes,” group sessions where students and faculty can share their experiences with using the adaptive learning material.  Students like the ability to move at their own pace, but faculty want improvements in navigation and assessments.  We have 3 grant opportunities that we’re pursuing to do more.