
Who Is Doing Our Data Laundry?

Presenter:

  • Brad Wheeler, Ph.D., IU VP for IT & CIO; Professor of Information Systems, Kelley School of Business

The world is deluged with data, but you may be asking yourself: what should I do with it? If the data don’t do anything to inform decisions that advance the goals we’re pursuing, what good are they? Are you trying to

  • Rapidly remediate info reporting?
  • Enable better financial decisions?
  • Accelerate student success goals?
  • Empower advisors?
  • Benchmark yourself?

The act of working on data to get what you want is a bit like doing laundry:

  1. You put in capital
  2. You put in labor
  3. You add consumables

…and from this, we expect clean, organized clothes.

By “data laundry,” I’m referring to the legitimate process of transforming and repurposing abundant data into timely, insightful, and relevant info for another context. It is a mostly unseen, antecedent process that unlocks data’s value and insights for the needs of decision makers.

Our institutions are often quite data-rich and insight-poor.

Two distinct phases to doing data laundry

  1. Data cleaning
  2. Presenting data as information in a context in which it can be used

Data Cleaning

Discovering > Extracting > Re-coding > Uploading > Normalizing

Information Presentation

Enriching > Comparing > Presenting (this is the “Magic Bundle”)
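
Taken together, these two phases form a small pipeline: clean first, then enrich, compare, and present. Here is a hedged sketch of that pipeline in pandas; the file names, column codes, and peer benchmark are invented placeholders rather than anything described in the talk.

```python
# A sketch only: inputs, column names, and code mappings are hypothetical.
import pandas as pd

# --- Phase 1: data cleaning ---
# Discover/extract: pull a raw export from a source system.
raw = pd.read_csv("raw_enrollment_export.csv")   # hypothetical extract

# Re-code: map source-system codes into human-readable values.
raw["level"] = raw["level_code"].map({"UG": "Undergraduate", "GR": "Graduate"})

# Normalize: consistent types, one row per student per term.
clean = (raw.assign(term=raw["term"].astype(str))
            .drop_duplicates(subset=["student_id", "term"]))

# --- Phase 2: presenting data as information ---
# Enrich: join institutional counts with an external benchmark.
benchmark = pd.read_csv("peer_benchmark.csv")    # hypothetical comparison set
report = clean.groupby(["term", "level"], as_index=False).size()
report = report.merge(benchmark, on=["term", "level"], how="left")

# Compare/present: the "magic bundle" a decision maker actually sees.
report["vs_peers"] = report["size"] - report["peer_median"]
print(report.head())
```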

Insource or Outsource

You can buy the equipment and do the work yourself, or go to the dry cleaners. Even if you go to the dry cleaners, you still have work to do… If you go to a vendor, which is common in higher ed, you will still have a significant amount of work to do. Companies like Apple, Google, and Tesla have chosen to insource a lot of this work.

IU’s Data Laundromat

IBM did an assessment of our organization and told us that a) we had a lot of data, b) our data was not in the most usable format, and c) we lacked the ability to perform effective analysis.

Decision Support Initiative (2015)

  1. Enabled report and dashboard “discovery” via search (a minimal sketch follows this list)
  2. Created a factory for Decision Support Initiatives
  3. Adopted an agile methodology (then run, run, run!)
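
As a toy illustration of the first item, report and dashboard “discovery” can be as simple as a keyword search over a catalog of report metadata. The catalog and field names below are made up; IU’s actual implementation was not described at this level of detail.

```python
# Minimal keyword search over a (hypothetical) report catalog.
REPORTS = [
    {"title": "Fall Census Enrollment Dashboard", "owner": "Institutional Research",
     "tags": ["enrollment", "census"]},
    {"title": "Advising Caseload Report", "owner": "Student Success",
     "tags": ["advising", "retention"]},
]

def search_reports(query):
    """Return reports whose title or tags contain any query term."""
    terms = query.lower().split()
    hits = []
    for report in REPORTS:
        haystack = (report["title"] + " " + " ".join(report["tags"])).lower()
        if any(term in haystack for term in terms):
            hits.append(report)
    return hits

print(search_reports("enrollment"))   # -> the census dashboard
```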

The initiative goal: Improve decision making at all levels of IU by dramatically enhancing the availability of timely, relevant, and accurate info to support decision makers.

It will:

  • Affect people and orgs
  • Affect data and technology
  • Improve decision outcomes

Will clean data lead to good decisions?

Maybe, maybe not…

Caution

From Russell Ackoff’s “Management Misinformation Systems” (Management Science, 1967):

  1. In many cases, decision makers suffer from an overload of irrelevant information more than a lack of relevant information for a decision.
  2. In many cases, decision makers request vastly more info than they need to make a decision optimally.
  3. In many cases, decision makers do not have refined models of the relevant information that best predict desired outcomes.

What’s up with YOUR data laundry? (Q&A)

How important is data governance? Boiled down, it comes to two questions (a small sketch follows below):

  1. Who has input rights? This should be broad.
  2. Who has decision rights? This should be narrow.

At IU, the data belongs to the trustees. In compliance with laws (FERPA, HIPAA, etc.) and policy, it can be made available to the appropriate folks.
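
A small illustration of “broad input rights, narrow decision rights” in code (not something IU presented): anyone may propose a change to a data definition, but only the element’s named steward may approve it. The roles and data elements are hypothetical.

```python
# Hypothetical sketch: broad input rights, narrow decision rights.
PROPOSALS = []
STEWARDS = {"student_gpa": "registrar", "net_revenue": "budget_office"}  # narrow decision rights

def propose_change(element, description, proposer):
    """Input rights are broad: any campus user can file a proposal."""
    PROPOSALS.append({"element": element, "description": description,
                      "proposer": proposer, "approved": False})

def approve_change(element, approver):
    """Decision rights are narrow: only the element's steward can approve."""
    if STEWARDS.get(element) != approver:
        raise PermissionError(f"{approver} is not the steward for {element}")
    for proposal in PROPOSALS:
        if proposal["element"] == element:
            proposal["approved"] = True

propose_change("student_gpa", "Exclude transfer credits from the GPA feed", "advising_office")
approve_change("student_gpa", "registrar")
```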

Is Cloud Identity Ready for Higher Education?

Presenters:

  • Jon Allen, Assistant VP & CIO, Baylor University
  • Kevin Phan, Associate CIO, Pepperdine University
  • Mahmud Rahman, Director of Systems and Banner Services, Mills College
  • Dennis McDermott, CMO, SVP Global Marketing, Fischer International Identity LLC

Level Set

What was the state of your IDM efforts prior to your recent project? What were the biggest challenges you were addressing with your IDM deployment?

JA: we didn’t have a lot in the way of IDM. We had scripts, an Oracle database, and batch files. Life-cycle management becomes difficult in situations like this! We knew we needed to manage it much better than we were. Should we build the car or drive the car?

KP: we’re similar to Baylor: disparate systems, everything was manual, PeopleSoft as the system of record, AD authentication, batching and scripts for account management, and no meaningful events for updating accounts. Going to an off-the-shelf system helped us manage things better.

MR: we had a pretty good system that fed data from Banner into LDAP. However, our system would break down, and it didn’t handle deprovisioning well.
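
For readers unfamiliar with this kind of home-grown setup, here is a minimal sketch of a Banner-to-LDAP feed with deprovisioning, using the ldap3 library. The connection details, DN layout, and get_active_people() extract are placeholders I’ve invented, not Mills College’s actual configuration.

```python
# Hypothetical sketch of a system-of-record-to-LDAP sync with deprovisioning.
from ldap3 import ALL, Connection, Server

BASE_DN = "ou=people,dc=example,dc=edu"

def get_active_people():
    """Placeholder for the Banner extract (e.g., a query against enrollment/HR views)."""
    return [{"uid": "jdoe", "cn": "Jane Doe", "sn": "Doe", "mail": "jdoe@example.edu"}]

def sync(conn):
    active = {p["uid"]: p for p in get_active_people()}

    # Provision: add an entry for every active person (ignoring already-existing entries).
    for uid, person in active.items():
        dn = f"uid={uid},{BASE_DN}"
        conn.add(dn, ["inetOrgPerson"], {k: v for k, v in person.items() if k != "uid"})

    # Deprovision: remove directory entries with no matching active record.
    conn.search(BASE_DN, "(uid=*)", attributes=["uid"])
    for entry in conn.entries:
        if entry.uid.value not in active:
            conn.delete(entry.entry_dn)

server = Server("ldaps://ldap.example.edu", get_info=ALL)
with Connection(server, "cn=admin,dc=example,dc=edu", "secret", auto_bind=True) as conn:
    sync(conn)
```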

Setting a Course

What components did you look for in an identity management solution? Which were most important to you and why?

JA: our search happened about four years ago. Traditional on-premises solutions were great and polished, but they didn’t necessarily work well with the systems we had on our campus. It was more a business and knowledge problem than a technical problem. Very few consultants understood our systems or what we do. We understood the routine functionality of in/out and when things were supposed to happen, but our edge cases were killing us. Audit was made difficult because access forms were being sent by email.

KP: it took us over a year to review the various vendors. Fischer’s system worked simply and easily for us…one connector to our PeopleSoft tables and we were ready to go.

MR: we’re a small school and we had to rely on others’ research to help guide us. We’ve been on hosted platforms for years now with Blackboard and Google, so our fear level was low. Finding a vendor that understands the specific needs of students (the meaning of stop-outs, incompletes, etc.) was very important to us, and it’s surprising how few vendors actually get this.

Resistance to Change?

How much resistance did you receive regarding outsourcing your IAM infrastructure? Who was resistant and how did you win them over? What would you say to those who prefer home-grown solutions?

JA: since I was the one bringing this project to the table, there was little resistance (I’m normally the one who slows projects down!). A big part of getting people on board was sharing what it would do for them as stakeholders, e.g. HR provisioning of new staff and faculty. Once HR saw what it would do for them, they were completely on board.

KP: we had political resistance. We overcame that by demonstrating cost savings with our CFO. We also were able to translate business value by showing reduction in number of help desk tickets. Convincing internal IT folk was the hard part…giving up control was WAY more challenging than it should have been.

MR: we had no resistance. Most people don’t see IDM as something important unless they can’t access resources. Sysadmins are no longer the ones who have to deal with the day-to-day ordinary functions. Our time spent on IDM is a lot smaller now.

Deployment

What was your approach for deploying IAM? How did you mitigate risk to achieve project success?

MR: we should have had more conversations with HR and Admissions first (there was turnover at the time, which continues). The people responsible for setting flags and attributes initially have moved on, so IT is playing a significant teaching role for the organization. The process allowed us to get a lot more granularity with respect to roles, which we accomplished before through creating exceptions (build exceptions into patterns).

KP: learning what our customers’ pain points were guided what we did first. Password management was a big problem, so we tackled a password self-service portal first. The second phase was the top 30 action codes in PeopleSoft. Most of the time, I had to “be a parent” to the project team when addressing challenges around control.

JA: we limited the scope of the systems to key systems first, including Banner and O365.

Sharing Outcomes

What are the top factors that made your project successful? What would you do differently? What would you say about home-grown IAM?

MR: we have a very smart Banner programmer! We also had a lot of cooperation from other IT staff, particularly sysadmins and the help desk. Our vendor also understood Banner well, which helped a lot. Also, my boss backed me up (huge). If I had to do anything differently, I would probably create more role granularity and have more conversations with certain groups on campus, like the provost’s office.

KP: top factors were understanding business value and translating to the business units. Understanding systems and data, looking a few steps ahead, identifying potential issues that might come up, having honest conversations with your team, all of these were important. What would I change? It took two years to complete because we didn’t apply enough resources to it.

Baylor Case Study

Why IDM? Security lifecycle, compliance, one of the main controls left. It’s the who, why, what, where, when of people accessing your systems. It’s the keys to the kingdom, it’s nothing sacred, it’s security.

IDM is hard! It’s the ultimate integration challenge, and it’s something we must have. Project failures are rarely technical. Systems worked where people understood higher ed and IDM. Consultants must know your business.

You need to clearly understand integration, UI (for follow-through and understandability), and you also need some flexibility to address special use cases.

Six months from start to Go Live. Staff must be bought in; testing is critical. Timelines are achievable if stakeholders are available and willing to work in a collaborative way.

Full provisioning covers account creation, license management, and authorization. However, it doesn’t have to be completely automated. For example, we have a termination list: it replaces non-interactive emails and provides an audit trail, and deprovisioning is the most critical part of IDM. When we flipped the switch, we had to deal with the edge cases, which allowed us to clean up a lot of the data (source of authority).
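
A hedged sketch of that termination-list pattern: deprovisioning is driven by an explicit list rather than being fully automated, and every action lands in an audit trail. The CSV layout and disable_account() call are assumptions for illustration, not Baylor’s actual implementation.

```python
# Hypothetical termination-list processor with an audit trail.
import csv
import json
from datetime import datetime, timezone

def disable_account(username):
    """Placeholder for the real deprovisioning call (AD, O365, ERP, etc.)."""
    print(f"disabled {username}")

def process_termination_list(path, audit_log="deprovision_audit.jsonl"):
    with open(path, newline="") as f, open(audit_log, "a") as log:
        # Expects columns: username, effective_date, requested_by
        for row in csv.DictReader(f):
            disable_account(row["username"])
            # Audit trail: who was deprovisioned, when, and on whose request.
            log.write(json.dumps({
                "username": row["username"],
                "effective_date": row["effective_date"],
                "requested_by": row["requested_by"],
                "processed_at": datetime.now(timezone.utc).isoformat(),
            }) + "\n")

process_termination_list("termination_list.csv")
```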

IDM is a life cycle. Identity is constantly changing, and perfection is not possible.

Lessons learned: communication (need more of it), wrong assumptions (you can’t assume that HR understands their role – they’re worried about payroll), we want real time access (mistakes like name changes or account deletions are real-time too). Testing is good, but you’re not going to catch everything.

Going forward: more integrations, further refinement, expanding reach to applicants.

Great results: account provisioning and removal are smoother, and processes are documented.

A Researcher, an Advisor, and a Marketer Walk Into a Predictive Analytics Tool

Presenters

  • Philip Needles, VP of Student Services, Montgomery County Community College
  • Celeste Schwartz, VP for Information Technology and Chief Digital Officer, Montgomery County Community College
  • Stefanie Crouse, Academic Advisor/Assistant Professor, Montgomery County Community College
  • Angela Polec, Executive Director of Marketing & Communications, Montgomery County Community College
  • David Kowalski, Executive Director of Institutional Research, Montgomery County Community College

FUN FACT: this session was interrupted by an evacuation signal, which effectively demonstrated the ability of technology to move lots of people back and forth 🙂

We learned a lot with predictive analytics, and we’re here to share the good and the bad we’ve experienced along the way.

DK: Once we got the tool, we were like: what do we do with it now?

CS: the decision to purchase was made at a conference between the campus president, myself, and a vendor. It was a “risk investment,” and we weren’t sure if it would be successful. It was less about the tool and more about the people. The initial expectation was very exploratory…we believed that PA would help the institution, but we weren’t exactly sure to what extent. We wanted to be able to take (short-term) actions based on what the tool would tell us, and it has done that for us. It took us 18 to 24 months to get the data mashed up and the data models loaded into the vendor’s system. We were the first campus to use some of the features of the vendor’s product, which was an interesting experience. We needed our IR executive director to play a critical communications role that would allow our campus to make the leap to implement actions based on our findings. Being inclusive of the various areas of the campus was an objective.

Step 1: Be prepared to invest time & resources without an immediate return. MC3’s time to initial return was about two years.

Step 2: Get the right people at the table. The MC3 team is about 12 people.

DK: having the latitude to get the right people at the table from the colleges was hugely beneficial. Data the tool provides is around associations and trends. We used an iterative process that helped us make progress as we worked with different campus constituents.

Question: how much time did each team member contribute to the team? DK: it varies depending on the area of inquiry. AC: we have a very active agenda, and everyone at the table is empowered to take action. The more people who have access to the raw data, the better…it helped us work more effectively.

Step 3: Identify Your Research Questions

AP: we had to gain a level of familiarity with the tool to interpret what it was telling us. PA tools can be giant rabbit holes that take you into some scary places. We pulled back a bit to think about hypotheses, and then went into the tool with that intent in a more thoughtful way.

AC: we have different sessions: one 15-week and two 7-week sessions. The second 7-week session had way more drop-outs; some students were using this session to “beef up their units.” What was happening is that students were simply forgetting to go to their new class! We implemented a text-message communication plan to remind them about their class…some of the responses were hysterical: “I TOTALLY FORGOT ABOUT THIS CLASS, THANK YOU!!”
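
The reminder intervention itself is simple to picture. Below is a purely hypothetical sketch: the roster query and the send_text() gateway are stand-ins, since the session did not say which tools MC3 used.

```python
# Hypothetical sketch of the second-session reminder texts.
def students_starting_second_session():
    """Placeholder for the SIS query returning (name, phone, course) tuples."""
    return [("Jane Doe", "+15555550123", "ENG 102")]

def send_text(phone, message):
    """Placeholder for whatever SMS gateway the campus uses."""
    print(f"to {phone}: {message}")

for name, phone, course in students_starting_second_session():
    send_text(phone, f"Hi {name}, reminder: your 7-week course {course} starts Monday!")
```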

Step 4: Empower the Team to Take Action

In 2016-2017, about 10 interventions were enacted. If your institution is committed to making data-driven decisions, this just helps to drive further assessment and action. For us, it just “fits in” to the way we do our work.

Step 5: Evaluate Your Impact

DK: we do a few things to evaluate impact (the tool has propensity score matching). Some were more successful than others. For example, students who participate more in class tend to persist better; supplemental instruction courses also help a lot.
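
Since propensity score matching is named here, a small worked sketch may help. It estimates an intervention’s effect by matching each treated student to the untreated student with the closest propensity score; the column names and the use of scikit-learn are my assumptions, not details of MC3’s tool.

```python
# Hypothetical propensity score matching sketch (synthetic data, invented column names).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def matched_effect(df, covariates):
    """Difference in outcomes between treated students and their matched controls."""
    # 1. Model the probability of receiving the intervention (the propensity score).
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # 2. Match each treated student to the nearest control on the propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_control = control.iloc[idx.ravel()]

    # 3. Compare outcomes (e.g., persistence) for treated vs. matched control.
    return treated["outcome"].mean() - matched_control["outcome"].mean()

# Demo with synthetic data: 500 students, two covariates, binary treatment and outcome.
rng = np.random.default_rng(0)
demo = pd.DataFrame({"gpa": rng.normal(3.0, 0.5, 500),
                     "credits": rng.integers(3, 18, 500)})
demo["treated"] = rng.integers(0, 2, 500)
demo["outcome"] = rng.integers(0, 2, 500)
print(matched_effect(demo, ["gpa", "credits"]))
```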

AP: we always want to “move the needle,” but we also want to influence student behavior. For example, we want them to register early.

Step 7: Review, Revise, Repeat

AP: we’re constantly looking to see how we can improve, bringing new faculty into the fold. How do we put what we’ve learned into the hands of students to help influence positive behaviors? The data has to get out of IR and into the hands of advisors, communications staff, and other appropriate folks. What can we do to assist “at risk” students BEFORE we get their GPA?

Digital Literacy: The State of the Art

Presenter:

  • Bryan Alexander, President, Bryan Alexander Consulting, LLC

Resources

Bryan began by talking about the NMC’s Horizon Project Strategic brief…

In 2016, we interviewed the NMC membership for quantitative and qualitative responses.

Definitions of literacy based on A/B testing:

  • Cultural
  • Cognitive
  • Constructive
  • Communicative
  • Civic
  • Critical
  • Creative
  • Confident

The 7 Elements of Digital Literacies

  • Media literacy
  • Information literacy
  • Digital Scholarship
  • Learning Skills
  • ICT literacy
  • Career & Identity management
  • Communications and Collaboration

Mozilla Web Literacy

  • Problem-Solving
  • Communication
  • Creativity
  • Collaboration

UNESCO Laws of Media & Information Literacy

These are much longer, more political definitions that I can’t quite capture here. They highlight the difference between production and creation.

Common Elements

  1. Media literacy > Info literacy > digital literacy
  2. Technical, social and personal capacities
  3. Learners as social, participatory makers

What skill is most important to you? CRITICAL THINKING! Creativity and Problem-solving are right up there, too. Very few digital literacy programs actually exist. Typically, responsibility for this is embodied in a single individual on campus. Conclusion: a university-wide approach is needed.

Literacy Types We Derived from Survey

  1. Universal literacy: basic familiarity with using basic digital tools
  2. Creative literacy: universal literacy, infused with more challenging technical skills (coding, animation, etc.), along with digital citizenship and copyright knowledge.
  3. Literacy across disciplines: different disciplines use different types of tech that are specific to those disciplines.

What Else Should We Add?

  • Infographics
  • Data literacy
  • Include information and media literacy in universal literacy
  • Cyberbullying
  • Privacy
  • Awareness of data used by marketers

Key Findings: Conclusions

  • It should be institutions that provide this kind of information; tech moves at a rapid pace.
  • Implementations should be strategic, i.e. they should include the whole campus.
  • Students are much more present as makers – make this a part of the pedagogy.
  • Industry partners should be leveraged to prepare students for life outside of school
  • Collaboration across domains is a must
  • Critical social-political engagement
  • Curricula for different areas require different frameworks for implementation of technical literacy; there are also striking differences between regions.
    • Europe: capacities; national & continental frameworks
    • Africa: job skills, economic growth
    • Middle East: media literacy
    • US: decentralized learning

Where Do We Go From Here?

  • Dealing with fear of fake news
  • Social crisis mode
  • We need to take automation seriously
  • Don’t forget the old stuff!
  • Models of trust

Responsible Use of Student Data in the Digital Era

Presenters

  • Martin Kurzweil, Director, Educational Transformation Program, ITHAKA
  • Mitchell Stevens, Associate Professor, Stanford University

Mitchell gave an introduction describing who he is and what he does at Stanford (MOOCs were mentioned…), and reviewed the agenda:

  1. Why this session?
  2. Past and possible futures of student records
  3. Applying and evolving principles of responsible use

Mitchell also shared the Draft Principles for Responsible Use of Student Data, a hard copy of which was provided to all session attendees. Attendee introductions followed…

Anticipated Takeaways

  • Overview of the current landscape of data-ethics discussions in postsecondary education
  • Recognition of how these discussions are “living” questions on our campuses
  • Tractable principles and policies of potential use in your own institution / organization

MOOCs offered via Stanford’s Coursera presence surfaced the question: “What is the course taker’s status?” Are they a) a student, b) a customer, or c) a human subject? Depending on the course taker’s status, the laws governing data usage are vastly different. The Asilomar Convention for Learning Research in Higher Education discussed language describing human beings from an ethical perspective. This convention came up with the concept of a “learner,” which is distinct from a “student.” The contractual language surrounding these terms is vastly different.

ru.stanford.edu

Fact 1: There Now is No Default Platform

The classroom:

  • Classroom is physically and temporally bounded location
  • Exists in nested jurisdictions – college, district/system, US state, nation – with relations negotiated over generations
  • Implies special sovereignty over content and evaluation for instructors

The web:

  • Does not entail physical or temporal boundaries
  • Commingles multiple jurisdictions whose relations are now being negotiated
  • Implies no particular sovereignty over content and evaluation

Fact 2: The Academic Record is Being Remade

Yesterday:

  • Each person had one official college record
  • Schools held records exclusively, in trust, in perpetuity
  • Available data for comprising records were thin, controlled by instructor-sovereigns and their designates, and difficult to integrate with other data

Today:

  • There is a rapid proliferation of academic providers and mechanisms for recording accomplishment
  • Schools have lost their cartel on records generally but retain their fiduciary obligations over their own students’ records
  • Available data for comprising records are rich, varied, jointly held, and easy to integrate with other data

Some points we’re going to cover

  • Institutional Practices to Improve Student Learning & Support
  • Data that are granular, collected in larger sets, are longitudinal, or are linked across systems
  • Application for educational improvement
  • Research to build basic knowledge
  • Representation of learning and achievement

Applications

  • Enrollment management
  • Institutional programs and policies
  • Early alert
  • Adaptive courseware

Great Diversity in Data Use

  • 2016 KPMG survey: 41% of respondents use student data for predictive analytics; 29% have internal capacity to analyze own data
  • 2016 Campus Computing Survey: <20% rated their institutions’ data-analytics investments as “very effective”
  • Ithaka S+R Faculty Surveys: a minority are using any form of technology in instruction, although 63% want to

Concerns

  • Privacy
  • Consent
  • Algorithmic bias
  • Opacity
  • Self-fulfilling prophecies
  • Institutional interest != student interest

Five Questions

  1. What data goes in the record, what does not, and who decides?
  2. Do educators/researchers have a responsibility to use student data in some ways?
  3. Do educators/researchers have a responsibility to not use data in some ways?
  4. Whose data (and records) are they?
  5. Do we have adequate language for talking about these things?

Principles of Responsible Use

  • Shared understanding: Data themselves are “joint ventures” among students, faculty, and campus systems (LMS, SIS, etc.).
  • Transparency: credits are evaluative in nature; students can rightfully expect to understand how data about them is generated (and have that explained to them)
  • Informed improvement: institutions have an obligation to constantly improve themselves based on the data they collect and use.
  • Open futures: education should create opportunity, not foreclose on it. Data used for predictive purposes should be used to expand student opportunities.

We then read the draft Principles for Responsible Use of Student Data document and reviewed it at each table. A few of the things we discussed as a group:

  • Contractual language and understanding with 3rd party vendors who aren’t researchers
  • “Ownership” of data
  • Data has a life cycle
  • Principles of data use are not the same as data privacy
  • There needs to be a theory-driven, principled reason for collecting every piece of data
  • Who gets access to this student data?
  • Shared responsibility between students and administration (higher ed is held to a higher standard than other organizations; why can’t we have a EULA-like standard?)
  • Reasonable security standard

The session then reviewed three scenarios and discussed them as a group.

Governing Responsible Use

  1. Who should be involved in interpreting and adjudicating principles of responsible use? Who should NOT participate in the process?
  2. What challenges do you anticipate to implementing principles of responsible use?
  3. What kind of cross-institutional coordination, support, or resources would be valuable?