Student Affairs Technology

Telling the Student Affairs Story: Answering Big Questions with Big Data

  • Adam Cebulski, Assistant Vice President and Chief of Staff, Southern Methodist University
  • Sara Ousby, Director, Student Affairs Assessment, University of North Texas


  • To discuss trends in big data and the implications for higher ed
  • ID strategies for building data warehouses & analyzing data sets
  • Share successes and challenges
  • Storytelling
  • Strategies

Landscape of Big Data

3Vs: variety (lots of kinds of data); volume (more info than we know what to do with); velocity (collecting data at a higher rate than ever before).

There are tons of software packages that “do” big data, but buying software is not going to answer your problem! Big data translates into decision making through different processes, and that’s what we’re going to talk about.


Stories are far better at conveying what your data says than just the data itself. NASPA’s analytics study from 2017 identifies the following entry points for big data for predictive analytics: pre-enrollment > academics > motivation & self-efficacy > use of support services > student engagement

Stories are just data with soul. Stories cross the barriers of time (past, present, and future) and allow us to experience the similarities between ourselves and others, real and imagined.

Create a data story

Data + Narrative + Visuals

Case Study: SMU

We have no centralized data system, and we’re a PeopleSoft campus. We centralized OIT and brought on a new CIO from the University of Illinois. We have a large Greek population, and we experienced a 315% increase in AOD (alcohol and other drugs) offenses in one academic year. We introduced a number of programs and interventions to address this challenge.

  • Why the large increase?
  • Who is most at risk?
  • How and when to intervene?
  • Campus partners: IR, OIT
  • Data identification

We’re a Maxient campus, so we did a lot of ETL (extract, transform, and load) work to make this happen from a technical perspective, since Maxient offers no APIs.
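Without an API, the extract step for a system like this typically starts from a scheduled flat-file export. A minimal sketch of such an ETL pass, using SQLite as a stand-in warehouse (the column names and charge codes here are invented for illustration, not SMU’s actual schema):

```python
import csv
import io
import sqlite3

# Hypothetical Maxient-style CSV export (the "extract" step).
raw_export = """case_id,incident_date,charge
1001,01/15/2018,AOD
1002,01/16/2018,Vandalism
"""

def transform(row):
    """Normalize US-style dates to ISO 8601 and upper-case charge codes."""
    month, day, year = row["incident_date"].split("/")
    return (int(row["case_id"]), f"{year}-{month}-{day}", row["charge"].upper())

# "Load" step: stage the cleaned rows into a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE conduct_case (case_id INTEGER, incident_date TEXT, charge TEXT)")
rows = [transform(r) for r in csv.DictReader(io.StringIO(raw_export))]
conn.executemany("INSERT INTO conduct_case VALUES (?, ?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM conduct_case").fetchone()[0])  # 2
```

In practice the transform layer is where most of the effort goes, since each source system exports dates, codes, and IDs in its own format.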

We built a BEAM model: Business Event Analysis & Modeling

  • Customer focused
  • Flexible design
  • Report neutral
  • Prevents rework
  • Saves time

Our goal was to build a data warehouse to assist with our analysis and reporting. We started in 2017 and plan to launch within the next week, with a dashboard as part of phase one. We needed to hire a data architect and a data visualizer; these were university hires that “live” in OIT. At $125K each, these are not cheap resources (but they are an excellent investment).

A BEAM table consists of events, and then we think about related events (e.g., a sports game, finals) that could be connected. At the top we consider a range of other items associated with the charge/sanction, e.g., the weather, whether we won the game, and what class level the student is. We even pull in data about the students, such as whether their parents are wealthy donors. This allows us to create a “star schema,” which builds a comprehensive picture of the issue. Some of the criteria allow us to set a ranking for each of the events, which in turn allows us to prioritize items. One of the data points is which offices are responsible for addressing the issues. We started with 100 variables, but grew to 279 unique variables that could be associated with a particular conduct case.
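A star schema like the one described centers a fact table on the conduct case, with foreign keys pointing out to dimension tables holding the surrounding context. A minimal SQLite sketch (every table and column name here is illustrative, not the actual SMU warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables: one per category of context around a conduct case.
CREATE TABLE dim_student (student_key INTEGER PRIMARY KEY, class_level TEXT, greek_member INTEGER);
CREATE TABLE dim_event   (event_key   INTEGER PRIMARY KEY, event_type TEXT, game_won INTEGER);
CREATE TABLE dim_date    (date_key    INTEGER PRIMARY KEY, iso_date TEXT, finals_week INTEGER);

-- Fact table at the center of the star: one row per conduct case.
CREATE TABLE fact_conduct_case (
    case_id     INTEGER PRIMARY KEY,
    student_key INTEGER REFERENCES dim_student(student_key),
    event_key   INTEGER REFERENCES dim_event(event_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    charge      TEXT,
    priority    INTEGER  -- ranking used to prioritize interventions
);
""")

conn.execute("INSERT INTO dim_student VALUES (1, 'First-year', 1)")
conn.execute("INSERT INTO dim_event VALUES (1, 'Football game', 1)")
conn.execute("INSERT INTO dim_date VALUES (1, '2018-01-15', 0)")
conn.execute("INSERT INTO fact_conduct_case VALUES (1001, 1, 1, 1, 'AOD', 3)")

# A dashboard query joins the fact table out to its dimensions.
row = conn.execute("""
    SELECT s.class_level, e.event_type, f.charge
    FROM fact_conduct_case f
    JOIN dim_student s ON f.student_key = s.student_key
    JOIN dim_event   e ON f.event_key   = e.event_key
""").fetchone()
print(row)  # ('First-year', 'Football game', 'AOD')
```

The appeal of this layout is that adding a new dimension (weather, donor status, and so on, up to the 279 variables mentioned) does not disturb existing reports; they simply ignore tables they don’t join to.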

These variables allow us to build dashboards that rationalize the data for our staff (intervention or otherwise). The vast majority of people in the system were actually recruits. It’s mostly 1st and 2nd years that get caught up in our system. We were able to change policy immediately based on the insights our system provides.

Case Study: University of North Texas

We have 38,000 students in the DFW metroplex. We are a majority-minority, public Tier One institution with a first-year residential requirement. The majority of our students live in Denton County.

Our Questions

  • What are the differences in retention for students who are engaged on campus?
  • What are the differences in GPA for students who are engaged on campus?
  • Campus partners: Data analytics & IR, IT shared services
  • Data Collection

We are going to pull card swipe data into our system soon! We’re working through the data dictionary for card swipes now, primarily using Excel and lots of pivot tables, and we’re currently looking at how swipe activity correlates with retention.

We’ve seen a lot of growth in card swipe usage. We have 220,000 card swipes into our student recreation center, and we plan to pull in the Career Center’s info next. There does appear to be a difference in retention between card swipers and non-swipers (81.18% vs. 64.02%).
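The comparison behind figures like those reduces to grouping students by whether they ever swiped and averaging a retained/not-retained flag per group. A sketch with invented records (the real UNT rates come from far larger cohorts):

```python
from collections import defaultdict

# (student_id, swiped_at_rec_center, retained) -- invented records for illustration.
students = [
    ("s1", True, True), ("s2", True, True), ("s3", True, False),
    ("s4", False, True), ("s5", False, False), ("s6", False, False),
]

# Group retention outcomes by swiper status (a hand-rolled pivot table).
groups = defaultdict(list)
for _sid, swiped, retained in students:
    groups[swiped].append(retained)

# Mean of the boolean flags is the retention rate for each group.
rates = {swiped: sum(flags) / len(flags) for swiped, flags in groups.items()}
print(f"swipers: {rates[True]:.2%}, non-swipers: {rates[False]:.2%}")
```

Note this only shows a correlation; students who swipe into the rec center may differ from non-swipers in many ways that also affect retention, which is exactly why the speakers frame it as “there appears to be a difference” rather than a causal claim.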

Telling our story and making decisions

  • Focus on completion
  • 1st year students are those leaving at the greatest rates
  • Most impact on FTIC
  • Higher impact on men

Q: Are you planning an ROI analysis?

AC: We quantified every action with a dollar value. Our interventions have already saved over a million dollars so far. We swipe for EVERYTHING (we use CampusLabs).

Q: What does your data cleaning process look like?

AC: It’s awful! And it’s ongoing. We’ve had to create many transformation tables, and we had a lot of siloed data that needed work.

SA: Your data dictionary will go a long way toward solving this challenge.

Q: Are card swipes weighted equally?

SA: Yes (for now), but we’re looking at this. Card swiping is now universal across the campus.

AC: We tie in our NSSE data and use ID Link to tie our data together.


My EDUCAUSE 2013 Mega Post

One of the things I try to do when I attend conferences is to make a detailed record of all the sessions I attend, with the exception of keynotes, which tend to get really good coverage from other folks.  I live blog the sessions as I attend them, which hopefully helps those who committed to other sessions, and then I do one of these “mega posts,” which collect all of those session write-ups in one place.  Based on my itinerary, 2013 seems to be the year of big data and analytics.  I’m willing to bet a lot of my fellow attendees will agree 🙂

I’ve been in higher education for just over seven years now, and somewhat amazingly, this was the very first EDUCAUSE event I’ve ever attended.  Why didn’t anyone tell me about this conference?  It was an extremely worthwhile event, at least for me…one of the meetings I had will likely save my division close to $50,000 each year!  That savings will go a long way toward providing students at CSUN with more and/or better services.  There were lots of great sessions to attend, with lots of smart folks sharing what they’re doing with IT on their campuses.  I’ll definitely be back next year.

Without any further ado, here’s my EDUCAUSE 2013 mega-post…please drop me a line and let me know if this helps you!


Friday, October 18 (last day of EDUCAUSE was a half day)


Thursday, October 17 (my busiest day)


Wednesday, October 16 (spent a few hours prowling the vendor floor and visiting with my accessibility colleagues)


Tuesday, October 15 (each session was a half-day long)



Turning Big Data Analytics into Personal Student Data



  • Shah Ardalan, President, Lone Star College System
  • Christina Robinson Grochett, Chief Strategist – Innovation & Research, Lone Star College System


SLIDE:  The Challenge

  • Why is our educational ranking getting worse as technology becomes faster and bigger?
  • Why is US GDP growth still hanging around 2.0%?
  • Why has the unemployment rate not been reduced to an acceptable level?
  • Why are there 4 million unfilled jobs in the U.S.?


SLIDE:  The Buzz

  • Analytics
  • Cloud Computing
  • SaaS
  • BYOD
  • Big Data


The assumption is that big data can solve our big problems.


The DOE MyData Button

In October 2012, the DoE announced it would add a “MyData” download button to allow students to download their own data into a simple, machine-readable file that they could share, at their own discretion, with third parties that develop helpful consumer tools.


The Solution:  

What it is:

The Technical Spec

  • HEY, QUICK QUESTION: do students get hired off data?  NO
  • …analytics?  NO
  • …reports?  NO
  • …documents?  YES (transcripts, diplomas, resumes, etc.)


Education and Career Positioning System, MyEdu Vault

Self Assessment:  values, interests, skills, personality type.  Shows jobs available.


WOW, this is a lot like the Pathways tool my team built!



Is this available to anyone?  Yes.  It’s available for $50/year, paid by the student, not the institution.



Learning Analytics for Educational Design and Student Predictions: Beyond the Hype with Real-Life Examples



  • Nynke Kruiderink, Teamleader, Educational Technology Social Sciences, University of Amsterdam
  • Perry J. Samson, Professor, Department of Atmospheric, Oceanic & Space Sciences, University of Michigan-Ann Arbor
  • Nynke Bos, Program Manager, Educational Technology, University of Amsterdam


SLIDE:  Lessons Learned:  February 2012 – Present

Our Proof of Concept is Two-Tiered

  1. Interviews with lecturers, professors, managers
  2. Gather and store data in central place for easy access

General things we learned

  • Emotional response to “Big Brother” aspect of accessing data
  • Data from LMS not detailed enough (folder based, not file based)
  • 50% of learning data available
  • Piwik: not secure enough


Next Steps

  • Focus group:  learning analytics
  • Professor Erik Duval, KU Leuven (his advice: undertake one project, involving everyone, that proves learning analytics is useful)

What is the Problem?

Recorded lectures: recordings of F2F lectures. There is no policy at the University of Amsterdam, and deployment differs throughout the curriculum:

  • Not at all (fears/emotional)
  • The week after the lecture
  • The week before the assessment

Student vs. Policy

  • Students demanded a policy
  • The QA department wanted insight into academic achievement before creating one
  • Development of a didactic framework
  • Research: learning analytics



  • Two courses on psychology
  • Courses run simultaneously
  • Intervention in one condition, but not the other


Data Collection

  • Viewing of recorded lecture
  • Lecture attendance per lecture
  • Final grade on the course (with more segmented view)
  • Some other data (grades on previous courses, distance to the lecture hall, gender, age, etc.)


Lessons Learned

  • Let people know what you are doing
  • Data preparation:  fuzzy, messy
  • Choose the data
  • Simplify the data
  • Keep an eye on the prize


LectureTools:  Analytics

Females in class were much more likely to ask questions using a clicker (lovingly referred to as the “WTF” button).

90% of students in the University of Michigan meteorology class cited felt they would have gotten the same grade if they had never opened the textbook.








Ethics and Analytics: Limits of Knowledge and a Horizon of Opportunity



  • James E. Willis III, Ph.D., Purdue University
  • Matthew D. Pistilli, Ph.D., Purdue University


(See my related post, “The Edward Snowden Affair: A Teachable Moment for Student Affairs and Higher Ed“)


Highly interactive session, sat in groups of five and discussed among ourselves.

SLIDE:  Lack of Discussion of Ethics in Data and Technological Innovation


  • Cutting-edge technologies are being developed every day
  • Cost/profit margins are determined by speed of development
  • Typical education lines are split between science and humanities / technology and ethics
  • There is a difference between what can be done and what should be done


SLIDE:  Where to Go from Here?

  • Begin the discussion now
  • Have the difficult conversations
  • Bring together the stakeholders:  data scientists, engineers, managers, ethicists
  • Establish a framework to adapt quickly to questions of whether or not an action or research agenda should occur


SLIDE:  Ethical Discussions = Innovation

Logic of negation

  • Why shouldn’t we do this?
  • What should we do instead?

Different paths emerge from divergent conversations

  • Stakeholders have different voices and understandings of potential paths of development


SLIDE:  Potter’s Box


  • ID values
  • Appealing to ethical principles
  • Choosing Loyalties
  • Responsibilities and recommendations

How Potter’s Box is useful for ongoing discussions

If you know a student is at risk, what do you do?  What is your responsibility?


SLIDE:  Research Questions

  • What do the ethical models look like?
  • How are these models deployed rapidly – at the speed of technology?
  • How are these models refined with time?


SLIDE:  Group Discussion 1

What are the current projects going on in learning analytics today?  What are the potential ethical pitfalls that surround these developments?  Why are they potentially harmful?  Are these things always wrong or are they contextually wrong?

Some of the ideas generated by the room:

  • Type 1 / Type 2 error:  will the student pass this class?  What’s my prediction – will the student pass or fail the class?  How accurate is your prediction – did your model work?
  • Is it right to get information from the students?  Where does the locus of control lie?  Does the institution need to take more responsibility for this?
  • University One (FYE equivalent in Canada) – informal collection and sharing with the next year.  Are we branding / labeling students appropriately?  Are we sharing this information appropriately?  Who should know what and at what point?  Will that affect their future studies?
  • Top-down approach using a product with funding from the Gates Foundation (similar to the InBloom project).  Goal is to make a product that analyzes what students should be taking.  Pitfalls:  don’t know model, don’t know the raw data.
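The Type 1 / Type 2 framing in the first bullet above can be made concrete by scoring a model’s pass/fail predictions against actual outcomes. A sketch with invented predictions (not from any system discussed in the session):

```python
# (predicted_pass, actually_passed) -- invented outcomes for illustration.
results = [(True, True), (True, False), (False, False), (True, True), (False, True)]

tp = sum(p and a for p, a in results)      # predicted pass, did pass
fp = sum(p and not a for p, a in results)  # Type 1 error: predicted pass, failed
fn = sum(not p and a for p, a in results)  # Type 2 error: predicted fail, passed
tn = sum(not p and not a for p, a in results)

# "Did your model work?" is answered by metrics like these.
accuracy = (tp + tn) / len(results)
print(f"accuracy={accuracy:.0%}, type-1 errors={fp}, type-2 errors={fn}")
```

Which error type matters more is itself an ethical question: a Type 1 error withholds support from a student who needed it, while a Type 2 error may label a student at-risk who would have succeeded anyway.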


SLIDE:  Group Discussion 2

What is the role of “knowing” a predictive analytics – once something is known, what are the ethical ramifications of action or inaction?  What are the roles of student autonomy, information confidentiality, and predictive modeling in terms of ethical development of new systems / software / analytics?

  • How much do we really know?  If we have a high level of confidence, what is our responsibility?
  • Could it be solved by giving students opt-in versus opt-out?
  • Discovering things, going down certain paths that may raise ethical questions about students who may not succeed…i.e. financial burdens that may be incurred due to failure.


SLIDE:  Group Discussion 3

How might we affect the development of future analytics systems by having ethical discussions?  What are possible inventions that could come from these discussions?

  • See more active participation from the students, with a better understanding of how their data is going to be used.
  • Obligation to allow students to fail; channeling people into things they can be successful with.
  • EU regulations: disclosure, portability, the right to be forgotten
  • Portability:  health care analog; your info can go from one hospital to another, and that data has real-world benefits to patients.
  • Information can also say a lot about faculty too.


SLIDE:  Group Discussion 4

What are the frontiers of ethical discussions and technology?  Are there new frameworks?  How can we affect the future by today’s discussions?

  • Maybe you can redefine what student success is?  Maybe it isn’t graduation rates…
  • How do we affect the future?  It’s all of our roles to be a part of this conversation!  You don’t really have the option to say “I don’t have the time for this”


