
Learning Analytics for Educational Design and Student Predictions: Beyond the Hype with Real-Life Examples


Presenters:

  • Nynke Kruiderink, Teamleader, Educational Technology Social Sciences, University of Amsterdam
  • Perry J. Samson, Professor, Department of Atmospheric, Oceanic & Space Sciences, University of Michigan-Ann Arbor
  • Nynke Bos, Program Manager, Educational Technology, University of Amsterdam

 

SLIDE:  Lessons Learned:  February 2012 – Present

Our Proof of Concept is Two-Tiered

  1. Interviews with lecturers, professors, and managers
  2. Gather and store data in a central place for easy access

General things we learned

  • Emotional response to the “Big Brother” aspect of accessing data
  • Data from the LMS is not detailed enough (folder-based, not file-based)
  • Only 50% of learning data is available
  • Piwik is not secure enough

 

Next Steps

  • Focus group:  learning analytics
  • Professor Erik Duval – KU Leuven (his advice: undertake one project, involving everyone, that proves learning analytics is useful)

What is the Problem?

Recorded lectures

  • Recording of face-to-face (F2F) lectures
  • No policy at the University of Amsterdam
  • Deployment differs throughout the curriculum:  not at all (fears/emotional), the week after the lecture, or the week before the assessment

 

Student vs. Policy

  • Students demanded a policy
  • The QA department wanted insight into academic achievement before creating one
  • Development of a didactic framework
  • Research:  learning analytics

 

Design

  • Two courses on psychology
  • Courses run simultaneously
  • Intervention in one condition, but not the other

 

Data Collection

  • Viewing of recorded lecture
  • Lecture attendance per lecture
  • Final grade on the course (with more segmented view)
  • Some other data (grades on previous courses, distance to the lecture hall, gender, age, etc.)
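
As a purely illustrative sketch (mine, not the presenters’), here is how these sources might be assembled into a single analysis table with pandas — every file name and column name below is invented:

```python
# Hypothetical sketch: assemble the per-student dataset described above.
# File names and columns are invented stand-ins for the real sources
# (lecture-capture logs, attendance records, grade exports, SIS data).
import pandas as pd

views = pd.read_csv("lecture_views.csv")    # student_id, lecture_no, minutes_viewed
attendance = pd.read_csv("attendance.csv")  # student_id, lecture_no, attended (0/1)
grades = pd.read_csv("grades.csv")          # student_id, final_grade
students = pd.read_csv("students.csv")      # student_id, gender, age, prior_gpa, distance_to_hall

# One row per student per lecture, then collapse to one row per student.
per_lecture = views.merge(attendance, on=["student_id", "lecture_no"], how="outer").fillna(0)
per_student = per_lecture.groupby("student_id").agg(
    total_minutes_viewed=("minutes_viewed", "sum"),
    lectures_attended=("attended", "sum"),
).reset_index()

dataset = per_student.merge(grades, on="student_id").merge(students, on="student_id")
print(dataset.head())
```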

 

Lessons Learned

  • Let people know what you are doing
  • Data preparation:  fuzzy, messy
  • Choose the data
  • Simplify the data
  • Keep an eye on the prize

 

LectureTools:  Analytics

Females in class were much more likely to ask questions using a clicker (lovingly referred to as the “WTF” button).

90% of students in the University of Michigan meteorology class cited felt they would have gotten the same grade if they had never opened the textbook.


Creating a Data Governance Program


Presenter:  Mike Chapple, University of Notre Dame

 

This presentation was one of the sessions EDUCAUSE decided to webcast.  It focused primarily on the events of the last year, but also covered some things done over the last 5-10 years.

It All Starts with a Story…

One day, the President was wondering…how many students do we have?

Naturally, a lot of potential answers depending on who you ask.

SLIDE:  how Notre Dame views data governance, using a building to illustrate

Access to Data (Roof)

  • Quality & Consistency (current focus)
  • Policies & Standards (current focus)
  • Security & Privacy
  • Compliance
  • Retention & Archiving

Technology (Foundation)

 

Data Driven Decision Making (D3M) = Business Intelligence (as it’s known everywhere else)

  • Definitions need to be agreed upon (e.g., what is a “student”?)

 

SLIDE:  Governance Model

  • Executive Sponsors (EVP, CIO)
  • Campus Data Steward
  • Unit Data Stewards
  • Coordinating Committees (Info Governance Committee, D3M Steering Committee)

 

SLIDE:  Domain Objectives

  • Data Steward(s) appointment
  • Data definitions and business rules
  • Data quality guidelines and monitoring process
  • Regulatory compliance plan (if applicable)

 

SLIDE:  Building Data Dictionary

  • Term, e.g. “Active Student”
  • Definition:  a plain-English definition
  • Source System, e.g. Banner
  • Source Detail, e.g. the SQL query that spells out the gory details of how you get the data

 

SLIDE:  Data Definition Components

  • Definition
  • Source System / Detail
  • Possible Values
  • Data Steward
  • Data Availability
  • Classification

 

SLIDE:  Start with Executive Support

This is pretty much an admonition; it really helps.  At Notre Dame, responsibility for this campus function landed with IT.

 

SLIDE:  Identify and Involve Stakeholders

Each item to be defined takes a meeting…it’s very time-consuming because you need to have representation from each area.  Data is owned by the university, not specific departments!

Notre Dame uses a “RACI” matrix for each defined term

R – Responsible (office)

A – Accountable (who keeps the group on track)

C – Consulted (you have a seat at the table)

I – Informed (people who need to know)

The matrix is distributed to all stakeholders so they can fill it in with their preferences.
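
A minimal sketch of what one term’s RACI matrix might look like once filled in — the offices and assignments here are invented for illustration:

```python
# Hypothetical RACI matrix for one defined term; offices are invented.
raci = {
    "Active Student": {
        "Registrar": "R",               # Responsible office
        "Campus Data Steward": "A",     # Accountable: keeps the group on track
        "Institutional Research": "C",  # Consulted: has a seat at the table
        "Deans' Offices": "I",          # Informed: needs to know the outcome
    }
}

def offices_with_role(term: str, role: str) -> list:
    """Return the offices holding a given RACI role for a term."""
    return [office for office, r in raci[term].items() if r == role]

print(offices_with_role("Active Student", "C"))  # ['Institutional Research']
```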

 

SLIDE:  Reconcile Differences Visually

ND had two competing student numbers:  “Registrar Count” and “IR Count”

IR count = externally reportable enrolled students

“Registrar Students” includes some folks like students on leave, zero credit students, etc.

Use a stacked bar, starting with externally reportable enrolled students, adding registrar student populations on top of that.
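
A quick sketch of that visual, with invented counts (the real populations and numbers would come from the registrar):

```python
# Hypothetical stacked bar: IR count as the base, registrar-only
# populations stacked on top to reach the registrar count.
import matplotlib.pyplot as plt

ir_count = 8500                      # externally reportable enrolled students
extras = {"On leave": 150, "Zero-credit": 90, "Other registrar-only": 60}

fig, ax = plt.subplots()
ax.bar("Students", ir_count, label="Externally reportable (IR count)")
bottom = ir_count
for label, count in extras.items():
    ax.bar("Students", count, bottom=bottom, label=label)
    bottom += count
ax.set_ylabel("Headcount")
ax.set_title("Registrar count = IR count + registrar-only populations")
ax.legend()
plt.show()
```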

 

SLIDE:  Give the group a Starting Point

  • Start with a draft
  • Counting matters!  Definitions help address this possible problem.
  • Don’t use Jargon!

 

Security and Privacy

Risk-based security program

  1. Highly Sensitive (SSNs, CCs, Driver’s Licenses, Bank Accounts, ePHI)
  2. Sensitive (Everything else)
  3. Internal Information (info that would cause minimal damage if disclosed)
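
A tiny sketch of how the tiers might be encoded for reference — the handling rules are invented placeholders, not Notre Dame policy:

```python
# Hypothetical encoding of the three risk tiers; handling rules are invented.
CLASSIFICATION = {
    "Highly Sensitive": {
        "examples": ["SSN", "credit card", "driver's license", "bank account", "ePHI"],
        "handling": "encrypt at rest and in transit; access requires approval",
    },
    "Sensitive": {
        "examples": ["everything else"],
        "handling": "restrict access to business need",
    },
    "Internal": {
        "examples": ["info that would cause minimal damage if disclosed"],
        "handling": "campus-only by default",
    },
}

print(CLASSIFICATION["Highly Sensitive"]["handling"])
```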

 

Compliance

We have to be responsive to new legal realities, since our campuses are like small cities and any law passed probably affects some area on our campus.

All data must be auditable.

  • 75% of orgs have at least one person dedicated to IT compliance
  • 76% of orgs have a corporate executive-level compliance office or council

Build compliance plans

  • Document everything with respect to regulations, e.g. HIPAA
  • Everything should be substantiated

 

Questions

With so many stakeholders, how did you address and resolve differences in data definitions?  We didn’t really have very many of those disagreements, because each area was represented in each set of meetings, and there was a solid bond among the reps from each area.

What do you do with data NOT in the data warehouse?  You just have to find some way to “chunk the work out.”  The output of the program must be pristine, so naturally priorities must be set.

Did ND work with IU, since most of this is the same?  No.

What tools are you using to manage metadata?  Google Docs for now, great for getting started, but it’s not conducive to long-term maintenance.  We’re actually building our own graph database.  This tool will ultimately expose this data for other tools.

Any principle for prioritization?   Steering committees prioritize based on BI needs of the institution.

Is there an ongoing need for a campus data steward versus a department data steward?  In some areas, the data is general or applies to many different populations.  Campus steward plays an important coordination role.

Do you consider your work the beginning of a master data management program?  Yes!

Do you see shadow systems as being a problem?  We’re not really far enough along to have experienced this problem yet.  Data is not widely available yet.  We refer to this phase as “taking it from the team to the enterprise.”

This is for administrative data, yes?  Yes, it does NOT include research data.


Ethics and Analytics: Limits of Knowledge and a Horizon of Opportunity


Presenters:

  • James E. Willis III, Ph.D., Purdue University
  • Matthew D. Pistilli, Ph.D., Purdue University

 

(See my related post, “The Edward Snowden Affair: A Teachable Moment for Student Affairs and Higher Ed”)

 

Highly interactive session, sat in groups of five and discussed among ourselves.

SLIDE:  Lack of Discussion of Ethics in Data and Technological Innovation

Why?

  • Cutting-edge technologies are being developed every day
  • Cost/profit margins are determined by the speed of development
  • Typically, educational lines are split between science and humanities / technology and ethics
  • Difference between what can be done and what should be done

 

SLIDE:  Where to Go from Here?

  • Begin the discussion now
  • Have the difficult conversations
  • Bring together the stakeholders:  data scientists, engineers, managers, ethicists
  • Establish a framework to adapt quickly to questions of whether or not an action or research agenda should occur

 

SLIDE:  Ethical Discussions = Innovation

Logic of negation

  • Why shouldn’t we do this?
  • What should we do instead?

Different paths emerge from divergent conversations

  • Stakeholders have different voices and understandings of potential paths of development

 

SLIDE:  Potter’s Box

Definition

  • Identify values
  • Appeal to ethical principles
  • Choose loyalties
  • Assign responsibilities and make recommendations

How Potter’s Box is useful for ongoing discussions

If you know a student is at risk, what do you do?  What is your responsibility?

 

SLIDE:  Research Questions

  • What do the ethical models look like?
  • How are these models deployed rapidly – at the speed of technology?
  • How are these models refined with time?

 

SLIDE:  Group Discussion 1

What are the current projects going on in learning analytics today?  What are the potential ethical pitfalls that surround these developments?  Why are they potentially harmful?  Are these things always wrong or are they contextually wrong?

Some of the ideas generated by the room:

  • Type 1 / Type 2 error:  will the student pass this class?  What’s my prediction – will the student pass or fail the class?  How accurate is your prediction – did your model work?  (see the sketch after this list)
  • Is it right to get information from the students?  Where does the locus of control lie?  Does the institution need to take more responsibility for this?
  • University One (FYE equivalent in Canada) – informal collection and sharing with the next year.  Are we branding / labeling students appropriately?  Are we sharing this information appropriately?  Who should know what and at what point?  Will that affect their future studies?
  • Top-down approach using a product with funding from the Gates Foundation (similar to the InBloom project).  Goal is to make a product that analyzes what students should be taking.  Pitfalls:  don’t know model, don’t know the raw data.
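
On the Type 1 / Type 2 error point above, here is a minimal sketch (mine, not anything the presenters showed) of checking a pass/fail prediction model against actual outcomes — the data is invented:

```python
# Hypothetical check of a pass/fail prediction model; data is invented.
from sklearn.metrics import confusion_matrix

actual    = [1, 1, 0, 1, 0, 0, 1, 0]  # 1 = passed, 0 = failed
predicted = [1, 0, 0, 1, 1, 0, 1, 0]  # the model's predictions

tn, fp, fn, tp = confusion_matrix(actual, predicted).ravel()
print(f"False positives (predicted pass, actually failed): {fp}")
print(f"False negatives (predicted fail, actually passed): {fn}")
```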

 

SLIDE:  Group Discussion 2

What is the role of “knowing” in predictive analytics – once something is known, what are the ethical ramifications of action or inaction?  What are the roles of student autonomy, information confidentiality, and predictive modeling in terms of ethical development of new systems / software / analytics?

  • How much do we really know?  If we have a high level of confidence, what is our responsibility?
  • Could it be solved by giving students opt-in versus opt-out?
  • Discovering things, going down certain paths that may raise ethical questions about students who may not succeed…e.g. financial burdens that may be incurred due to failure.

 

SLIDE:  Group Discussion 3

How might we affect the development of future analytics systems by having ethical discussions?  What are possible inventions that could come from these discussions?

  • See more active participation from the students, with a better understanding of how their data is going to be used.
  • The obligation to allow students to fail versus channeling people into things they can be successful at.
  • EU regs:  disclosure, portability, the right to be forgotten
  • Portability:  health care analog; your info can go from one hospital to another, and that data has real-world benefits to patients.
  • Information can also say a lot about faculty too.

 

SLIDE:  Group Discussion 4

What are the frontiers of ethical discussions and technology?  Are there new frameworks?  How can we affect the future by today’s discussions?

  • Maybe you can redefine what student success is?  Maybe it isn’t graduation rates…
  • How do we affect the future?  It’s all of our roles to be a part of this conversation!  You don’t really have the option to say, “I don’t have the time for this.”


Silver Linings Playbook: Hard Earned Lessons from the Cloud


Presenters:

  • Bob Carozzoni, Cornell University
  • Erik Lundberg, University of Washington
  • Bill Wrobleski, University of Michigan

The presenters agreed that they are all slightly insane, which is a good sign, IMO 🙂

Presentation Survey:  tinyurl.com/edu-silver

 

Session Goals:

  1. Challenge you to rethink the implications of cloud computing
  2. Provoke you to think beyond the status quo
  3. Inform you of what we’re seeing as we move forward
  4. Learn from you by asking questions

 

SLIDE:  It’s a BYOA (Bring Your Own App) kind of world:

  1. Consumers are more in control
  2. IT decisions are increasingly being made by non-IT pros
  3. Big, long software selection processes are a thing of the past

We generally don’t have the time to talk through the tech that touches the consumer.

 

SLIDE:  If no one follows, are we leading?

With SaaS, vendors go straight to the consumer (business).  Consumers are driving the bus.

 

SLIDE:  IT can lead in a new way, by:

  • Partnering
  • Inspiring
  • Coaching
  • Brokering
  • Enabling

 

SLIDE:  Slow Central IT means no central IT (be on the train or be under it)

Reality is that IT is never fast enough, even if you’re operating at peak efficiency and in the best possible way.

  • Concept to delivery must be faster than ever in a cloud world
  • You need to learn to move at the speed of cloud
  • A great but slow implementation is an unsuccessful implementation

 

SLIDE:  If you can’t take risks, end users will do it for you

  • Users take risks, often without even realizing it
  • Risk is no longer managed through a single central decision/contract
  • We don’t have to be reckless, but we have to rethink how we look at and manage risk

Thinking about risk needs to be shifted.

 

SLIDE:  On-premise systems are too risky

  • Cloud providers’ survival depends on a rock-solid security posture
  • They’re bigger targets, but they also make bigger security investments
  • Your IT security officer may soon be leading the charge to the cloud!  (there’s a big difference between security and compliance)

FISMA compliance may drive some ISOs to push for the cloud

 

SLIDE:  big vendors rock (and suck)

  • Big vendors offer stability.  Also deep pockets for innovation, though not always in the direction we need
  • Small vendors are nimble, and will actually listen to higher education
  • Mastery of vendor management will become a critical IT skill

Independently we can’t influence big vendors, but they are responsive to consortia like Internet2, EDUCAUSE, etc.  While universities were cauldrons of innovation in the past, today we’re just small fish.

 

SLIDE:  Candlestick makers usually don’t invent lightbulbs

  • Transformative changes rarely come from someone immersed in operations
  • It’s hard to see over the horizon when you’re in the weeds
  • Liberate your change leaders so they can focus on change

Be careful in selection of your change agents and leaders!  People with positional authority are often NOT the people who can actually move the logjam forward.  You have to be intentional about giving people the free time and space to innovate.

 

SLIDE:  You need to be more flexible than your cloud vendor

  • Customization creates lock-in and “version freeze,” and raises the cost of updates
  • Alter your business processes, not the SaaS app
  • The days of customization are coming to an end

Using SaaS gives you economies of scale; you may need to stop thinking about app customization and start thinking about changing your business processes.  Expectations of your customers are different when using services provided by say, Google.

 

SLIDE:  Watch your Wake!

  • Cloud adoption, like any change, disrupts lives of real people
  • Find ways to support those affected by your transformation efforts
  • Help people work outside their comfort zones, but keep them out of their fear zones

The intellectual challenge is different now…it’s no longer about integration; it’s a new kind of intellectual challenge.

 

SLIDE:  Experience can be a boat anchor

  • Deconstruct habitual thinking that’s based on old paradigms
  • Old:  “What problem are we trying to solve?”
  • New:  “What new opportunity does this present?”
  • Most of what we have learned about procurement, risk management, project management, financial models, and staffing is changing

Consumers don’t shop based on requirements; they shop based on what they want (right color, size, cost, etc.).  How we staff will change.

 

SLIDE:  Don’t Fight Redundancy

  • Cloud makes duplication more affordable
  • Feature overlap from our perspective is micro-specialization from the end-user perspective
  • Sometimes duplication is bad, but it isn’t bad as a principle

We don’t worry about duplication in consumer products like ovens, toaster ovens, toasters, bagel toasters 🙂  Thinking about the middleware and integration points is probably a more useful exercise.

 

SLIDE:  Welcome to the Hotel California (check-in but never leave)

  • Make an exit strategy part of the selection and on-boarding process
  • Getting your data isn’t hard; getting your metadata is
  • Architect to control key integration points, to minimize lock-in

Does your exit strategy work?  Have you ever tested it?  Automobiles are very dangerous, but have we stopped driving them?  No!  They’re far too useful.  We insure them!

Our focus is changing.

 

SLIDE:  Reviewed Summary of Survey

 

AUDIENCE QUESTIONS

Biggest risk is the network…how do you address that?  Students have multiple ways to get to a network, whether it’s 3G, wifi, etc.  Even identity is a risk if it’s housed on campus.  Even the network itself is a security consideration.

 

Have you found a good way to deal with risk incurred by click-through agreements?  We work with procurement to review P-card stuff to see what people are buying here and there.  Click-throughs have never been tested in court…yet.

 

Can we take advantage of the work done by other campuses?  YES!  Aggregation of vendor negotiations is a good thing for everyone, sort of like a “buying club.”

 

What if the negotiated contract ISN’T good enough for your counsel?  The university needs to make that decision for itself.

 

What strategies have you used to get traditional procurement folks to get over their concerns with risk?  We’ve dealt with it from a relationship perspective.  You need to become best friends with your procurement folks.  We developed a default addendum that covers every single issue that might “light up” our procurement folks, and our addendum supersedes the vendor’s contract.  This shows that we’ve heard their concerns.

 

Cloud “stuff” is very much like a cafeteria where consumers pick and choose the services they want…what services do you see IT continuing to value and maintain?


How Good are your IT Services?


Presenter:  Timothy M. Chester, VP for IT, University of Georgia (@accidentalcio)

Tim brings a passion for CIO and organizational performance.  “Credibility ultimately comes from how well IT services are perceived by students, faculty and staff.”  Tim will talk about his TechQual tool.  He came to the University of Georgia as AVP and was promoted to VP.  Day-to-day relationships are what drive success, not the position.

Web site for this project:  TechQual.org

Goals and Outcomes

  • Review the context for creation of the TechQual + Survey and Tools
  • Demonstrate the linkage between customer satisfaction and IT organization credibility
  • Understand how assessment of IT outcomes can drive continuous improvement processes
  • How to design and create a TechQual+ survey to assess IT service quality
  • …etc (slides went too fast)

TechQual grew out of LibQual (a library assessment tool).  Tim was also involved in an accreditation evaluation of a research university, where a number of focus groups and surveys were done regarding delivery of IT services.  People who responded ran the gamut:

  • Good perception
  • Meh attitude
  • Angry with services (led by one furious person)

Do you want your leadership’s perception of IT services to be driven by random individuals in each of the above groups?  No, you want to own this narrative.  Despite Tim’s engineering background (which is why he got hired into his current position), his conversations tend to be around aspirations, dreams, and gaps.

 

The Power of Analytics

Showed us radar graphs of questions to visualize IT strengths and weaknesses.  The radar graphs specifically address the following areas:

  1. Connectivity & Access
  2. Technology & Technology Services
  3. The End User Experience

A series of questions identify a) user expectations, and b) user perceptions.  Colors indicate differences between the two.  Items closer to the “hub” indicate lower priorities.  This method can be broken down by respondent constituent groups.  Each group brings their own set of priorities.  Faculty are always a standard deviation below every other respondent group.
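
For the flavor of it, here’s a minimal radar-chart sketch in the same spirit — the service areas and 1-9 scores below are invented, not TechQual+ data:

```python
# Hypothetical radar view of expectations vs. perceived performance.
import numpy as np
import matplotlib.pyplot as plt

areas = ["Wi-Fi coverage", "Web speed", "Help desk", "Classroom tech", "Email"]
minimum   = [6.0, 6.2, 5.8, 6.1, 6.5]  # minimum expectations
desired   = [8.2, 8.0, 7.9, 8.1, 8.4]  # desired expectations
perceived = [6.8, 5.9, 7.2, 6.0, 7.8]  # perceived performance

angles = np.linspace(0, 2 * np.pi, len(areas), endpoint=False).tolist()
angles += angles[:1]  # close the polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, scores in [("Minimum", minimum), ("Desired", desired), ("Perceived", perceived)]:
    ax.plot(angles, scores + scores[:1], label=label)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(areas)
ax.legend(loc="lower right")
plt.show()
```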

 

Tableau Visualization

Tim gave a demonstration of the power of visualization of data using a tool named Tableau.

 

IT Services that deliver value

IT these days is often simply administering services, not actually running them.

 

The Credibility Cycle

Ellen Kitzis’ book – “The New CIO Leader”

Initial Credibility > Resources & Expectations > Outcomes > Results > Back to Credibility

Alternate path:  Poor results > Reduced Credibility > (Cycle of Overcommitment and Underperformance) > Diminished Authority

 

The IT Delivery Ecosystem

  • Strategic Leadership
  • Assessment & Planning
  • Operational Best Practices
  • Foundations

 

SWOT Analysis

The room was divided into 4 groups that went around identifying their own organizations’ Strengths, Weaknesses, Opportunities, and Threats.  The top 5 from each were highlighted.  This was a pretty powerful exercise.

 

TechQual+ Project Outcomes

Measures that conceptualize the effective delivery and use of technology, or effective IT service outcomes, from the perspective of those outside the IT organization.

A set of easy-to-use Web-based tools that allows institutions to create surveys based on the TechQual+ core instrument, communicate with respondents, and analyze survey results.

A peer database that allows institutions to make comparisons of IT service outcomes on the campus against the performance of other institutions, aggregated by Carnegie basic classification.

 

3 Core Commitments

  1. Connectivity and Access
  2. Technology and Collaboration Services
  3. Support and Training

 

Naturalistic Inquiry

 

What is a Positive Outcome?

People tend to make a positive evaluation when the IT service is:

  1. Delivered consistently
  2. Accompanied by communication that is timely, relevant, and in an easy-to-understand form
  3. Increasing collaboration opportunities with others

 

Navigating a TechQual+ Survey

Tim showed us what a respondent sees.

 

Higher Education TechQual+ Major Influences

SERVQUAL, an approach that conceives of service quality as a range of expectations that can be assessed by measuring three different dimensions of service

  • Minimum Expectations
  • Desired Expectations
  • Perceived Performance
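
In gap-analysis terms, each item yields two scores: an adequacy gap (perceived minus minimum) and a superiority gap (perceived minus desired).  A minimal sketch with invented numbers:

```python
# SERVQUAL-style gap scores; the 1-9 scores below are invented.
def gaps(minimum: float, desired: float, perceived: float) -> tuple:
    adequacy = perceived - minimum      # negative = below minimum expectations
    superiority = perceived - desired   # rarely positive; 0 = desires fully met
    return adequacy, superiority

print(gaps(minimum=6.0, desired=8.0, perceived=7.0))  # (1.0, -1.0)
```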

 

Survey Design & Setup

Tim reviewed the layout of the TechQual+ web site and tools available to a University TechQual+ survey administrator.  This included Options, Core items, Custom Items, Other Questions, Instructions, Preview.  The system has the ability to tailor the way you communicate the message that your survey is available to respondents by a) generic link that can be used anywhere or b) a tool that lets you upload tailored respondent lists (students, faculty, staff).

 

Sampling Concepts

  • N = the entire population under study
  • n = the sample of respondents that is representative of N
  • Random Sampling = method for choosing n to ensure that n is truly representative of N

UGA always selects 25% of population each year to help avoid “survey fatigue.”  The TechQual+ web site has a downloadable tool that will do a random sampling for you based on data you provide (lname, fname, email, field1, field2, field3, etc.).
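
The downloadable tool aside, the sampling step itself is simple.  A sketch of drawing a 25% sample from an invented population file (the columns mimic the upload format above):

```python
# Hypothetical 25% random sample of a survey population.
import csv
import random

with open("population.csv", newline="") as f:
    population = list(csv.DictReader(f))    # N: the entire population

n_size = round(0.25 * len(population))      # n: the sample size
sample = random.sample(population, n_size)  # random sampling keeps n representative of N

with open("sample.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=population[0].keys())
    writer.writeheader()
    writer.writerows(sample)
```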

 

To Get Good Response Rates

Tim reviewed some of the e-mail communication tools built into TechQual+:

  • Four e-mail notifications are the sweet spot (pre-survey, survey, plus two reminders)
  • Messages should be personalized (salutation and signature)
  • The ReplyTo address should belong to a high-ranking person in the organization
  • Link needs to be “above the fold”

 

Peer Comparison Functions

Tim demonstrated the ability of the system to compare institutional surveys.  A survey needs 50 valid responses in order to be added to the peer comparison data.

 

ETS Planning and Continuous Improvement Cycle

Data drives decision-making; this data goes into a “road show” presentation that helps tell the IT story.  Monthly Status and Activity Reports are extremely important!

 

 
