
A Global University Experience: How the Minerva Schools at KGI's Technology Allows Its Students to Become Global Citizens

Presenter: Eric Bonabeau

With Minerva, we have a chance to start from scratch, with first principles…we know how to maximize learning, based on a set of metrics. Let’s build a delivery mechanism that will maximize learning.

Putting 300 students in a big auditorium and lecturing to them should be illegal! Well, let me walk that back…the issue is that classes of that size don't allow students to meaningfully interact with the (likely world-class) professor. In a city like Los Angeles, Berlin, or San Francisco, the city is your campus; in smaller towns, the campus is your city.

93% of employers said a demonstrated capacity to think critically, communicate clearly and solve complex problems is more important than a job candidate’s undergraduate degree. This is the reason Minerva exists.

What is the most common job in America today? What will it be in ten years? Self-driving cars were not something that would have been predicted ten years ago…what’s next?

“A future proof education”

  • Practical knowledge
  • Engaging Classrooms
  • Global Immersion
  • Accessible Admissions

We get students involved in lots of co-curricular activities.

Tuition is $12,000/year, which is inexpensive by American standards; for much of the rest of the world, however, it is quite expensive. Out of 12,000 applicants, we accepted only 300 students. We want innovators and global citizens.

Specific Aspects

  • Thinking critically
  • Thinking creatively
  • Communicating effectively
  • Interacting effectively

Habits of mind/automatic cognitive reflexes

Use plausibility checks to determine whether claims are reasonable; apply principles of effective debating; identify your audience and tailor oral and written work accordingly; and more.

It’s a journey toward mastery; active learning is key

Eric then showed a video that sampled class instruction and interaction with students.

 

 


Teaching with Tools of Engagement: Polls, Gamification, Badges, Leaderboards, Ohmage & Participatory Sensing

Presenters:

  • Rose Rocchio, Director of IT, UCLA
  • Rob Gould, Professor, UCLA

Web resources:

The classroom landscape is changing!  Technology can be leveraged in many different ways…about 80% of students have smartphones.  Engagement tools are permeating the marketplace thanks to the ubiquity of mobile devices.  Today we're going to look at ways some technological tools are impacting classroom engagement and provide a couple of demonstrations of projects being done at UCLA and UC Berkeley.

When Analytics Meet Gamification:  The Pedagogy

Gamification provides reciprocal validation.  There is a content “gallery” used to share a collection of images for a course.  A points configuration tool lets instructors assign points for contributing to this gallery (e.g., give a comment = 5 points, get a comment = 3 points, give a +1 = 1 point, etc.).  These points are aggregated into a leaderboard.  Professors provide weekly “missions” (essentially lesson plans) for students to complete.
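A minimal sketch of how a points configuration like the one above might be aggregated into a leaderboard (the event names, point values, and data structures below are hypothetical illustrations, not UCLA's actual implementation):

```python
from collections import Counter

# Hypothetical point values mirroring the example configuration above
POINTS = {
    "give_comment": 5,
    "get_comment": 3,
    "give_plus_one": 1,
}

def build_leaderboard(events):
    """Aggregate (student, event_type) activity records into total points per student."""
    totals = Counter()
    for student, event_type in events:
        totals[student] += POINTS.get(event_type, 0)
    # Highest score first, ready for display on the leaderboard
    return totals.most_common()

# Example usage with made-up activity records
events = [("ana", "give_comment"), ("ben", "get_comment"), ("ana", "give_plus_one")]
print(build_leaderboard(events))  # [('ana', 6), ('ben', 3)]
```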

Results:

  • No correlation between total Engagement Index and final exam score
  • No correlation between mission points and final exam score
  • Strong correlation between mission completion and final exam score
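As a rough illustration of how one might check these relationships, a simple correlation computation over a hypothetical gradebook export could look like this (column names and values are invented for the sketch, not the presenters' actual data):

```python
import pandas as pd

# Hypothetical gradebook export; columns and values are illustrative only
df = pd.DataFrame({
    "engagement_index":   [120, 85, 200, 40, 150],
    "mission_points":     [30, 22, 45, 10, 38],
    "missions_completed": [8, 6, 10, 3, 9],
    "final_exam":         [78, 71, 92, 55, 88],
})

# Pearson correlation of each engagement measure against the final exam score
print(df.corr()["final_exam"].drop("final_exam"))
```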

Rob Gould:  Our Collaboration with LAUSD

  • Using mobile app for engagement in the LAUSD
  • Part of NSF umbrella project called “Mobilize”
  • Partnership between several UCLA departments (Statistics, CS, Center X, the Graduate School of Education and Information Studies) and LAUSD
  • Create and implement data science curricula in high school to enhance STEM learning

Curricula

  • Exploring CS (3-week unit)
  • Algebra 1 (three 2-week units)
  • Biology (3-week unit)
  • Introduction to Data Science:  year-long course

The Introduction to Data Science created an alternative pathway through high school mathematics.

  • Traditional:  Algebra 1 > Geometry > Algebra 2 > PreCalculus
  • Alternative:  Algebra 1 > CS/Geometry > IDS > Statistics

Introduction to Data Science

  • Professional Data
  • “Big data” (I prefer “everyday data”)

This creates a bridge of “participatory sensing” leading to statistical and computational thinking.  This idea is gradually gaining traction, because students can now collect data everywhere they go with their mobile devices.

Some PS Campaigns

  • Measuring snacks:  record what you’re eating (cost, when, who you were with, how you felt while eating it, etc.)
  • Stress / chill moments:  measure how you feel at certain points of the day
  • Students design their own campaigns

A dashboard view gives students additional visibility into the data they’re collecting.
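As a sketch of what a single self-reported record in a campaign like “measuring snacks” might look like (the field names here are hypothetical, not the actual ohmage campaign schema):

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class SnackObservation:
    """One self-reported data point in a hypothetical 'measuring snacks' campaign."""
    timestamp: datetime
    item: str
    cost_usd: float
    companions: str   # who you were with
    feeling: str      # how you felt while eating it

obs = SnackObservation(datetime(2013, 10, 17, 15, 30), "granola bar", 1.25, "alone", "rushed")

# Serialize for upload to a campaign server or display on a class dashboard
print(json.dumps(asdict(obs), default=str, indent=2))
```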

 


My EDUCAUSE 2013 Mega Post

One of the things I try to do when I attend conferences is make a detailed record of all the sessions I attend, with the exception of keynotes, which tend to get really good coverage from other folks.  I live-blog the events as I attend them, which hopefully helps those who committed to other sessions, and then I do one of these “mega posts,” which summarizes all the sessions I attended.  Based on my itinerary, 2013 seems to be the year of big data and analytics.  I’m willing to bet a lot of my fellow attendees will agree 🙂

I’ve been in higher education for just over seven years now, and somewhat amazingly, this was the very first EDUCAUSE event I’ve ever attended.  Why didn’t anyone tell me about this conference?  It was an extremely worthwhile event, at least for me…one of the meetings I had will likely save my division close to $50,000 each year!  That savings will go a long way toward providing students at CSUN with more and/or better services.  There were lots of great sessions to attend, with lots of smart folks sharing what they’re doing with IT on their campuses.  I’ll definitely be back next year.

Without any further ado, here’s my EDUCAUSE 2013 mega-post…please drop me a line and let me know if this helps you!

 

Friday, October 18 (last day of EDUCAUSE was a half day)

 

Thursday, October 17 (my busiest day)

 

Wednesday, October 16 (spent a few hours prowling the vendor floor and visiting with my accessibility colleagues)

 

Tuesday, October 15 (each session was a half-day long)

 


How Good are your IT Services?

Title:  How Good are your IT Services?

Presenter:  Timothy M. Chester, VP for IT, University of Georgia (@accidentalcio)

Tim is passionate about CIO and organizational performance.  “Credibility ultimately comes from how well IT services are perceived by students, faculty and staff.”  Tim will talk about his TechQual+ tool.  He came to the University of Georgia as an AVP and was promoted to VP.  Day-to-day relationships are what drive success, not the position.

Web site for this project:  TechQual.org

Goals and Outcomes

  • Review the context for creation of the TechQual+ Survey and Tools
  • Demonstrate the linkage between customer satisfaction and IT organization credibility
  • Understand how assessment of IT outcomes can drive continuous improvement processes
  • How to design and create a TechQual+ survey to assess IT service quality
  • …etc (slides went too fast)

TechQual+ grew out of LibQUAL+ (a library assessment tool).  Tim was also involved in an accreditation evaluation of a research university, where a number of focus groups and surveys were conducted regarding the delivery of IT services.  People who responded ran the gamut:

  • Good perception
  • Meh attitude
  • Angry with services (led by one furious person)

Do you want your leadership’s perception of IT services to be driven by random individuals in each of the above groups?  No, you want to own this narrative.  Despite Tim’s engineering background (which is why he got hired into his current position), his conversations tend to be around aspirations, dreams, and gaps.

 

The Power of Analytics

Tim showed us radar graphs of survey questions that visualize IT strengths and weaknesses.  The radar graphs specifically address the following areas:

  1. Connectivity & Access
  2. Technology & Technology Services
  3. The End User Experience

A series of questions identifies a) user expectations and b) user perceptions.  Colors indicate differences between the two.  Items closer to the “hub” indicate lower priorities.  This method can be broken down by respondent constituent group, and each group brings its own set of priorities.  Faculty are always a standard deviation below every other respondent group.
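A small sketch of the kind of expectation-versus-perception gap analysis such a radar chart could be built from, broken down by constituent group (the survey data and column names below are hypothetical):

```python
import pandas as pd

# Hypothetical responses: each row is one respondent's rating of one service item (1-9 scale)
responses = pd.DataFrame({
    "group":     ["faculty", "faculty", "staff", "student", "student"],
    "item":      ["wifi", "helpdesk", "wifi", "wifi", "helpdesk"],
    "expected":  [8, 7, 7, 8, 6],
    "perceived": [5, 6, 7, 7, 6],
})

# Mean perception-minus-expectation gap per group and item;
# negative values mark where perceptions fall short of expectations
responses["gap"] = responses["perceived"] - responses["expected"]
print(responses.groupby(["group", "item"])["gap"].mean().unstack())
```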

 

Tableau Visualization

Tim gave a demonstration of the power of visualization of data using a tool named Tableau.

 

IT Services that deliver value

IT these days is often simply administering services, not actually running them.

 

The Credibility Cycle

Ellen Kitzis’s book: “The New CIO Leader”

Initial Credibility > Resources & Expectations > Outcomes > Results > Back to Credibility

Alternate path:  Poor results > Reduced Credibility > (Cycle of Overcommitment and Underperformance) > Diminished Authority

 

The IT Delivery Ecosystem

  • Strategic Leadership
  • Assessment & Planning
  • Operational Best Practices
  • Foundations

 

SWOT Analysis

The room was divided into four groups, and each identified its own organization’s Strengths, Weaknesses, Opportunities, and Threats.  The top 5 from each were highlighted.  This was a pretty powerful exercise.

 

TechQual+ Project Outcomes

Measures that conceptualize the effective delivery and use of technology, or effective IT service outcomes, from the perspective of those outside the IT organization.

A set of easy-to-use Web-based tools that allows institutions to create surveys based on the TechQual+ core instrument, communicate with respondents, and analyze survey results.

A peer database that allows institutions to make comparisons of IT service outcomes on the campus against the performance of other institutions, aggregated by Carnegie basic classification.

 

3 Core Commitments

  1. Connectivity and Access
  2. Technology and Collaboration Services
  3. Support and Training

 

Naturalistic Inquiry

 

What is a Positive Outcome?

People tend to make a positive evaluation when the IT service:

  1. Is delivered consistently
  2. Is communicated in a timely, relevant, and easy-to-understand form
  3. Increases collaboration opportunities with others

 

Navigating a TechQual+ Survey

Tim showed us what a respondent sees.

 

Higher Education TechQual+ Major Influences

SERVQUAL, an approach that conceives of service quality as a range of expectations that can be assessed by measuring three different dimensions of service

  • Minimum Expectations
  • Desired Expectations
  • Perceived Performance
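In SERVQUAL-style scoring, the usual arithmetic is an adequacy gap (perceived minus minimum) and a superiority gap (perceived minus desired); a perceived score that lands between the minimum and desired levels sits in the “zone of tolerance.”  A minimal sketch, assuming 1-9 ratings:

```python
def servqual_gaps(minimum, desired, perceived):
    """Return adequacy gap, superiority gap, and zone-of-tolerance flag for one item (1-9 ratings)."""
    adequacy_gap = perceived - minimum     # positive: service exceeds the minimum expectation
    superiority_gap = perceived - desired  # usually negative: distance below the desired level
    in_zone_of_tolerance = minimum <= perceived <= desired
    return adequacy_gap, superiority_gap, in_zone_of_tolerance

# Example: minimum 6, desired 8, perceived 7 -> (1, -1, True)
print(servqual_gaps(6, 8, 7))
```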

 

Survey Design & Setup

Tim reviewed the layout of the TechQual+ web site and the tools available to a university TechQual+ survey administrator.  These included Options, Core Items, Custom Items, Other Questions, Instructions, and Preview.  The system lets you tailor the way you communicate that your survey is available to respondents via a) a generic link that can be used anywhere or b) a tool that lets you upload tailored respondent lists (students, faculty, staff).

 

Sampling Concepts

  • N = the entire population under study
  • n = the sample of respondents that is representative of N
  • Random Sampling = method for choosing n to ensure that n is truly representative of N

UGA always selects 25% of the population each year to help avoid “survey fatigue.”  The TechQual+ web site has a downloadable tool that will do a random sampling for you based on data you provide (lname, fname, email, field1, field2, field3, etc.).
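A minimal sketch of drawing a 25% random sample from a roster CSV with columns like those just mentioned (this illustrates the idea only; it is not the downloadable TechQual+ tool itself):

```python
import csv
import random

def sample_roster(path, fraction=0.25, seed=None):
    """Randomly select a fraction of a roster CSV (e.g., lname, fname, email columns)."""
    with open(path, newline="") as f:
        roster = list(csv.DictReader(f))          # the full population, N
    n = round(len(roster) * fraction)             # the sample size, n
    return random.Random(seed).sample(roster, n)  # n chosen uniformly at random from N

# Example usage with a hypothetical file name
# selected = sample_roster("students.csv", fraction=0.25, seed=42)
```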

 

To Get Good Response Rates

Tim reviewed some of the e-mail communication tools built into TechQual+:

  • Four e-mail notifications are the sweet spot (pre-survey, survey, plus two reminders)
  • Messages should be personalized (salutation and signature)
  • The Reply-To address should be a high-ranking person in the organization
  • The survey link needs to be “above the fold”

 

Peer Comparison Functions

Tim demonstrated the system’s ability to compare institutional surveys.  A survey needs 50 valid responses in order to be added to the peer comparison data.

 

ETS Planning and Continuous Improvement Cycle

Data drives decision-making; this data goes into a presentation “road show” that helps tell the IT story.  Monthly status and activity reports are extremely important!

 

 


Change Management in Higher Education

October 15, 2013

Speaker:  Jim Russell, City University of New York

Topic:  Change Management in Higher Education

 

DISCLAIMER:  this session is not about revision control; it’s more about how we handle new products, services, hardware, etc.  Sometimes this is called “adoption and learning.”  The assumption is that some of the attendees are not beginners…we’re going to talk about the constructs that they found useful at City University.

 

Agenda-on-the-fly:

  • Formalizing
  • Commitment
  • “Breaking through”
  • Change capacity (“heat map”)
  • Communication
  • Buy-in

 

Structure of Seminar:

  • Theory:  Change Models
  • Practice:  change plans, players, and potholes
  • Practice:  measuring change readiness

 

We went around the room and asked “who’s doing change management?”  People gave some “elevator speeches” about how you “sell” change management to campus constituents.  With any upgrade or change, ROI cannot be realized unless we manage the people component of the equation.  We then looked at Prosci’s 5 tenets of change management:

  1. We change for a reason
  2. Organizational change requires individual change
  3. Organizational outcomes are the collective result of individual change
  4. Change management is an enabling framework for managing the people side of change
  5. We apply change management to realize the benefits and desired outcomes of change

ADKAR

Awareness, Desire, Knowledge, Ability, Reinforcement

ADKAR is based on two premises, which are sometimes overlooked: it is people who change, not organizations, and successful change occurs when individual change matches the stages of organizational change.

There is a problem with ADKAR, though:  how do you define “Desire” in higher ed?  This is one of the most difficult elements to develop, because change comes from the outside, or change is “not part of my job.”  In higher education, we have a problem not with leadership but with “followship.”  How do we get past this problem of followship?

Connect to your people on the cognitive and emotional levels.  Connect to the heart via the student experience.  Understand what matters to people, including their fears, hopes, and anxieties.  Focus groups and meetings before and after help a lot.  Allow time for feedback and venting, and acknowledge the change and its difficulty.  Lead consistently toward desired objectives.  Be sure to leverage opinion leaders and use feedback loops; respond to concerns to ensure people feel heard and valued.  Reach out to the faculty members who complain the most (“the loyal opposition”).  When communicating, use specifics!  Also appeal to the entire brain with storytelling, imagery, personal accounts, and real-world analogies.  Avoid numbers, charts, and graphs.  Keep change saturation in mind.

 

Kotter’s 8 Steps of Leading Change

  1. Establishing a Sense of Urgency
  2. Creating the Guiding Coalition
  3. Developing a Change Vision
  4. Communicating the Vision for Buy-in
  5. Empowering Broad-based Action
  6. Generating Short-term Wins
  7. Never Letting Up
  8. Incorporating Changes into the Culture

 

Basic Steps of Change Plans

  1. Identify Business Process Owner and Governance
  2. Ensure the Future State is Clearly Defined
  3. Identify Stakeholders / Identify Key Changes
  4. Assess the Requirements to Support the Changes
  5. Establish Action Plan, Communication Plan and Training Plan
  6. Execute those Plans
  7. Monitor and be willing to change
  8. Shift to Continuous Process Improvement

 

Group work was a case study on a policy change related to social media.  What elements should be in a change plan for a campus social media policy?

Engagement needs to occur earlier, using shared governance groups to get the word out.  What is the objective in developing such a policy?  That’s often a missing piece in such discussions.  What about enforcement of such policies?  We need a better idea of how this works in higher education.  What’s missing from the conversations that need to happen to make these things happen?

 

CUNY CASE STUDY

Loose federation of colleges

  • While leadership could be assessed, “followship” had a mixed history
  • Diversity of mission led to diversity of procedures
  • Some colleges believe that their processes are integral to their unique identities

Each unit needs to be guided toward “ownership” through engagement

CUNY adopted a model of “change liaisons” to work with resisters:

  • Recruit influential leaders with local followings
  • Don’t be afraid of the loyal opposition; you’ll need them eventually anyway
  • Don’t be afraid to reject liaisons
  • Have a dev plan for the liaisons

 

Communication Assessment in Higher Ed

Assess your history on large-scale projects and communication

  • Who is the audience?
  • How do you get to them?
  • What do you want to say?
  • When?
  • How can students help?

 

The Group Voted to Discuss this:  Assessment of Change Readiness
