
Optimizing Business Intelligence at Lehman College/CUNY: A Road to Change

Presenters:

  • Ronald Bergmann, VP-CIO, Lehman College/CUNY
  • Richard Finger, Director, Graduate Studies, Lehman College/CUNY
  • Lei Millman, Oracle DBA, OBIEE Admin, Lehman College/CUNY

Ron Bergmann introduced himself and touted the Frye leadership program, encouraging involvement in the program.

Lehman College

  • A CUNY school, located in the Bronx
  • About 12,000 students, with 90 graduate and undergraduate programs

BI Solutions in Higher Education

  • Big data has changed the way higher education makes decisions and takes action
  • Dynamic, easy-to-use tools have given higher education leaders more power for decision making than ever.
  • What is your road map?

Questions

  • What are your key BI needs and goals?
  • Where does data reside?   How is data shared, aggregated and analyzed?
  • What tools are you using/what’s the best fit?
  • What data is important to a key customer?
  • How would you describe your data “culture?”
  • What factors support/resist changes in the use of data?

Lehman College Dashboard (LCD) – BEFORE

  • Data unconnected
  • Users did not have a reporting tool
  • Devs had limited options
  • Report development took too much time

Lehman College Dashboard (LCD) – NOW

  • BI solutions using OBIEE
  • A common ecosystem for producing/delivering enterprise reports:  enrollment, graduation, faculty workload, etc.
  • Easy access to LCD in a comprehensive format
  • Actionable data drives more informed decisions

How Did We Get Here?

  • Culture Change / Buy-in
  • Data Governance
  • Stakeholder expertise and input
  • Access to multiple existing data sources and reports (enrollments, budget/expenditures, admissions, analysis, dashboards, data sharing with other CUNY schools, etc.)

User Experience at Lehman: Preparing end-users for the BI Tool

Adoption of BI has led to big changes in culture and expectations.  It has also led to significant changes in campus processes.  This section of the presentation included many screen shots of the reports that they run.  While the reports look pretty simple, they’re really helping us with enrollment management and understanding the effectiveness of our interventions.

  • Understand the difference between official and unofficial data
  • Develop a “data dictionary” to avoid confusion
  • Articulate report parameters
  • Understand that report writing is a collaborative effort
  • Be prepared to test and modify reports in draft form

LCD Example:  One Dashboard, Multiple Data Integrations

LCD provides actionable data.  The information LCD provided allowed us to hit our enrollment targets this past fall semester.

  • Current Semester Enrollment
  • Student Retention
  • Cohort Overall Analysis
  • Student Financial Aid Information
  • Student List by Advisors, Individual Student Detailed Academic Information Dashboard

Questions / Comments

  • Can you tell us about the data governance group?  We have it, but not every office is represented.  If I had a choice, I’d get everyone on board earlier into a formal governance group.
  • Can you talk about your data dictionary?  How big is it, how do you share it?  We created two documents, one for users and the other for IT staff.  We share the users’ document with the campus.
  • Do you track interventions in the system?  Not yet, but we do use it to manage our communications outreach and advisor appointments.

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies

Presenter:

Andrew McAfee, Author, MIT, @amcafee

Erik Brynjolfsson and I wrote our book because we were confused about technology.  It’s doing things now that it’s not supposed to do…it’s affecting the real world in ways we don’t really understand.

State of Understanding a Decade Ago

  • Book:  The New Division of Labor.
  • Dealt with the question: “what are humans good at, and what are computers good at?”
  • Give all the rote work to computers, and leave the pattern-matching and complex communication to humans.  Example:  driving a car in traffic.
  • Andrew then related his experience of riding in a Google self-driving car through 3 phases of personal experience:  raw abject terror (first 10 minutes);  passionate interest in what was going on (next 20 minutes); mild boredom (rest of the ride).  My own thought at this point:  “I’ve seen the future, and it’s really boring”
  • Andrew then went through the example of IBM’s Watson computer competing on the game show Jeopardy!  Watson’s performance against human players in 2006 was terrible; today it is as good as – or better than – the best human champions.  Andrew included a photo of Ken Jennings’s funny parenthetical comment on his last question against Watson:  “I for one welcome our computer overlords.”  Indeed!

Minds and Machines

  • We need to rethink this combination…machine abilities are growing to match those of humans.
  • How did this happen?  Andrew alluded to Hemingway’s quote (regarding going broke):  first it happened gradually, then it happened suddenly.
  • A rough calculation of the “tipping point,” using Kurzweil’s first half/second half of the chessboard square-doubling analogy:  starting from 1958 (the BEA’s baseline year) and doubling computing power every 1.5 years, 32 doublings put the tipping point at 1958 + 32 × 1.5 = 2006.  (A quick back-of-the-envelope sketch of this calculation follows.)
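
To make the chessboard arithmetic concrete, here is a minimal Python sketch of the same back-of-the-envelope calculation.  The 1958 start and 1.5-year doubling period come from the talk; everything else is purely illustrative:

    # Back-of-the-envelope "second half of the chessboard" calculation.
    # Assumptions from the talk: computing power doubles every 1.5 years, starting in 1958.
    START_YEAR = 1958
    DOUBLING_PERIOD_YEARS = 1.5
    FIRST_HALF_SQUARES = 32  # squares 1-32 of the 64-square chessboard

    # Year at which the "second half of the chessboard" begins.
    tipping_point = START_YEAR + FIRST_HALF_SQUARES * DOUBLING_PERIOD_YEARS
    print(f"Tipping point: {tipping_point:.0f}")  # -> 2006

    # By square 32, cumulative growth is a factor of 2**32 (~4.3 billion) over 1958.
    print(f"Growth after 32 doublings: {2 ** FIRST_HALF_SQUARES:,}x")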

A Change in Approach

  • A rules-based approach is inferior and doesn’t work very well (e.g., learning a language as an adult using verb conjugation books).  There are too many rules to learn!
  • Kids learn language through listening and absorbing inductively what’s going on.  Humans are pre-wired for language.
  • “We know more than we can tell” – Michael Polanyi
  • The game of Go is way more complex than chess, and to date computers have not been able to beat the best human players.  However, this will likely change before the end of this year.  How?  Because we’re going to give computers the goal of maximizing the score in a game, via trial and error.  An example of this was shown with the game Breakout.  (A toy sketch of the trial-and-error idea appears below.)
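
Here is a minimal, hypothetical Python sketch of “maximize the score via trial and error” – a toy bandit-style learner, not DeepMind’s actual Breakout agent; the actions and reward numbers are made up for illustration:

    # Toy illustration of learning by trial and error to maximize a score.
    import random

    ACTIONS = ["left", "stay", "right"]
    # Expected reward of each action in our toy "game" (hidden from the agent).
    TRUE_REWARD = {"left": 0.2, "stay": 0.5, "right": 0.8}

    value = {a: 0.0 for a in ACTIONS}   # the agent's learned estimate per action
    counts = {a: 0 for a in ACTIONS}
    epsilon = 0.1                        # how often to explore a random action

    for step in range(10_000):
        # Explore occasionally; otherwise exploit the best-known action.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: value[a])
        # The game only reports a noisy score for the chosen action.
        reward = TRUE_REWARD[action] + random.gauss(0, 0.1)
        counts[action] += 1
        value[action] += (reward - value[action]) / counts[action]  # running mean

    print(value)  # "right" should emerge as the highest-value action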

Self Assessment

  • Let’s do a self-assessment.  For each statement below, score yourself from 1 to 100 on “compared to the people around me, I’m…”
  • I have good intuition; I make good predictions; I’m a good judge of character
  • Now average the 3 values
  • We’re bad at self-judgement and are predictably irrational

Geeks Versus HIPPOS

  • Geeks (who are evidence- and data-driven) versus HiPPOs (Highest Paid Person’s Opinion)
  • Robert Parker is the HIPPO of the wine world
  • Orley Ashenfelter (a wine geek) came up with a remarkably accurate algorithm that made wine HIPPOS largely irrelevant

What Do Humans Still Bring to the Table?

  • We have advanced social skills
  • We have good intuition
  • We have creativity

Web Portals

Presenters:

  • Jody Couch, Program Director, Student Administration Systems, University of Texas at Austin
  • This is a Constituent Group Discussion

Jody:  our campus is currently on a mainframe-based ERP and portal.  She then went around the room and asked some “campus demographics” questions, e.g., what kinds of schools are you from, what kinds of systems are you using, etc.

What is the Most Useful Thing About Your Portal?

  • Single Sign On
  • Access to registration, payment, viewing grades.  Students want more purposeful uses for the portal – a platform for actionable tasks.

Q&A

  • When students are involved in the design of the portal, adoption rates go through the roof; we find that they often “live in there.”
  • How many groups are using student input?  Many accept student input via UserVoice and focus groups.
  • Killer features?  Adding other constituent groups, a “My Checklist” feature, parent portal.
  • Anyone else grown beyond students?  Most said yes, and many said they support alumni too.
  • One or multiple portals?  Seems like a mixed bag, although role-based views make this question a bit superfluous.
  • Who is doing customizable portals?  A couple are, but there are concerns about overhead.  Some provide a single customizable tab, others allow for movement of pagelets.

How many are allowing for customizing content within the portal?

  • CA Community Colleges is using uPortal with an Angular front-end, “recommendation engine”-style approach.
  • Others are using their content management system to pull content in.
  • Some put all their stuff that would normally live on the portal on their public sites.

What are positive / negative learnings about the usage of your portal?

  • Many are using it as a bookmark farm so they can get to email and other services.

What are people using for SSO?

  • Mostly CAS and Shibboleth.
  • For some, it’s a “carrot-and-stick” approach:  SSO access is the carrot for getting groups to put their stuff on the portal.  SSO seems to be the “killer app.”

What do you hate about your portal / wish was better?

  • Self-serve password resets are needed!  For those that don’t have them, password resets make up the bulk of some orgs’ ticket requests.
  • The person in charge of network security makes changes hard.
  • Mobile-friendliness

What’s Your Mobile Strategy?

  • Responsive web design
  • Put some functionality into a mobile app
  • Our mobile app functions as a portal
  • Some doing both an independent mobile app AND portal

What About Accessibility?

  • We learn via the school of hard knocks!
  • Use students with disabilities for testing
  • We do accessibility testing like security testing

Support Models?

  • Some do not have full-time dedicated staff to support their portal
  • I have two full-time Luminis developers who deploy on a 6-week release cycle
  • We have a Liferay intranet to engage employees and we’re looking for an organizational home for it, but are having trouble.  Does anyone have a blog from the provost?

Vendors

  • Are people moving to cloud yet?
  • Very few are hosted off-site; CA Comm Colleges is working with Unicon on hosting uPortal at AWS.
  • rSmart is another option.
  • CampusEAI is not as great a vendor as we were led to believe when we first contracted with them.
  • Folks on Datatel sound like they’re not receiving the same level of support they used to.
  • Luminis 4 to 5 migration question:  did anyone else look at other options?  There doesn’t seem to be a clear path forward.  Luminis 5 only handles 500 concurrent users, so you’re gonna need lotsa servers!

The 2015 Campus Computing Survey

Presenter:

Casey Green, Founding Director, The Campus Computing Project

This is the 26th year of the National Survey of Computing, eLearning, and Information Technology in US Higher Education.  It’s the largest survey of its kind in the US.  This is a survey I’m aware of but don’t think I’ve ever actually read (although I might have attended this session last year – I can’t remember).  I figure this is the year that I change that.

Intent of the project has been to provide insight for IT planning and policy.  There are 35 corporate sponsors of this project – none of which have ever seen individual campus stats.  Here are some top-level details about the survey’s data collection:

  • 417 campuses
  • Web-based data collection
  • Survey period:  9/17 – 10/21
  • 75% of participants also participated last year

Highlights

  • Priorities focus on instruction, staffing, user support, advancing the campus completion agenda, and IT security
  • Big diff in CIO assessments of the things we do/provide vs. the things we buy
  • Great faith in adaptive learning & digital curricular resources
  • Transition to cloud

Challenges

  • Talent retention
  • Digital curricular resources make learning more efficient & effective for students
  • 3rd party cloud services are an important part of campus plan to offer high performance computing services

Top Priorities

  1. Assisting faculty with integrating tech into instruction
  2. Hiring / retaining qualified IT staff
  3. User support
  4. Upgrading / enhancing network security
  5. Leveraging IT resources for student success

Some High-Level Details

  • Among the range of priorities that we all have, there are lots of service items, and not nearly as many related to the things we buy.
  • CIOs Have Great Faith in the Benefits of Digital Tech for Instruction.
  • Rating the IT Infrastructure:  lowest rankings are services, highest are hardware.
  • CIO Assessments of Digital Resources and Services for Disabled Users:  only 50% have a strategy for ADA/Section 508 compliance.  This is litigation waiting to happen.
  • Mobile technologies over laptops!
  • CIOs rate the effectiveness of campus investments in IT.  Most scores are rather low.
  • Challenge of Effective IT User Support:  we think we’re doing better than our users think we are.
  • Budget cuts are still pervasive and affect us deeply.  Cuts versus gains across investments are interesting (refer to the report).
  • Disaster Plans:  most campuses have plans and even update them regularly.  22% DO NOT have a strategic plan for network and data security (this is an amazing stat to me).
  • Declining Confidence in MOOCs.  Completion rates are atrocious (although enrollment is voluntary).  Infrastructure could be a problem here.
  • We’re experiencing major cost over-runs / unexpected costs in our ERP deployment activities.
  • Two Views of the Cloud:  things may happen faster than we expect, but fewer than 25% think we’ll have mission-critical systems in the cloud within 5 years.  IT pros affirm the strategic importance of cloud computing.  There’s still significant concern over the security of the cloud.  Migration to the cloud is slow due to perceived risk, trust, control, and limited options.  Interestingly, the LMS has largely moved to the cloud.  No mass movement to the cloud in 5 years.
  • Growing use of video lecture
  • Encouraging Faculty to Use Open Source / OER Content for Courses
  • Institutional demography of LMS providers:  decline in Blackboard, Canvas growing fast.  Market is volatile.  The LMS largely does not affect learning outcomes, but is used as a material delivery service.
  • Mobile apps are huge and an expected service.

Wonderful quote by Casey on his experience:  “In my 25 years of doing this survey, IT appears to be driven by epiphany and opinion, not evidence.”

Vendors: What You Need to Know

  • Partner is Not a Verb
  • Trust is the coin of the realm
  • No “logo buddies”
  • You are not your client
  • Your price is not your client’s cost
  • It’s a neural network

Opening Up Learning Analytics: Addressing a Strategic Imperative

Presenters:

  • Josh Baron, Assistant Vice President, Information Technology for Digital Education, Marist College
  • Lou Harrison, Director of Educational Technology Services, NC State University
  • Donna Petherbridge, Associate Vice Provost, DELTA, NC State University
  • Kenny Wilson, Division Chair-Health Occupation Programs, Jefferson College

This is actually a follow-up to one of my recent posts about a webinar I attended by Unicon on learning analytics.  We have representatives from three different LMSes:  Moodle, Sakai, and Blackboard.  Looks like Lou and Josh from that webinar are here…I’m looking forward to learning more about this effort!  Word of warning:  they moved fast, so I missed some detail, particularly around the workflow and data-heavy slides.  My Student Affairs colleagues will want to tune into the question I asked at the end…

Open Learning Analytics:  Context & Background

OAAI, or the Open Academic Analytics Initiative, grew out of the EDUCAUSE Next Generation Learning Challenges (NGLC) program, funded by the Bill & Melinda Gates Foundation:  $250,000 over a 15-month period.  Goal:  leverage big data concepts to create an open-source academic early-alert system and research “scaling factors”

LMS & SIS data is fed into a predictive scoring model, whose output feeds an academic alert report.  From there, an intervention is deployed (“awareness” messaging or the Online Academic Support Environment – OASE).  A rough sketch of this kind of scoring pipeline appears below.
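
Here is a minimal, hypothetical Python sketch of the early-alert scoring step.  The OAAI itself used open-source tools like Weka, Kettle, and Pentaho; the feature names, model choice, and 50% threshold below are illustrative assumptions, not the project’s actual model:

    # Hypothetical early-alert sketch: LMS + SIS features -> risk score -> alert list.
    # Feature names, threshold, and model choice are illustrative, not the OAAI's model.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Combined LMS/SIS extract: one row per student per course.
    students = pd.DataFrame({
        "logins_per_week":   [5, 0, 3, 1, 7, 2],
        "assignments_late":  [0, 4, 1, 3, 0, 2],
        "gpa_prior_term":    [3.4, 2.1, 3.0, 2.4, 3.8, 2.7],
        "at_risk_last_term": [0, 1, 0, 1, 0, 1],  # historical label used for training
    })

    features = ["logins_per_week", "assignments_late", "gpa_prior_term"]
    model = LogisticRegression().fit(students[features], students["at_risk_last_term"])

    # Score the current population and flag anyone above an assumed 50% risk threshold.
    students["risk_score"] = model.predict_proba(students[features])[:, 1]
    alert_report = students[students["risk_score"] > 0.5]
    print(alert_report[features + ["risk_score"]])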

Research design:  rolled out to 2,200 students in 4 institutions:  2 community colleges, and 2 historically black colleges and universities.  More detail on the approach and results here.

Strategic Lessons Learned

Openness will play a critical role in the future of learning analytics.

  • Used all open source tools:  Weka, Kettle, Pentaho, R, Python, etc.
  • Open standards and APIs:  Experience API (xAPI), IMS Caliper/Sensor API
  • Open Models:  predictive models, knowledge maps, PMML, etc.
  • Open Content/Access:  journals, whitepapers, policy documents
  • Openness or Transparency with regard to ethics/privacy
  • NOT anti-commercial, commercial ecosystems help sustain OSS

Software silos limit usefulness

  • Platform approach makes everything more useful

NC State Project

  • Getting everyone moving in the same direction is a challenge.
  • The number one priority we have at NC is student success, and we know that data is going to help us get there.  However, we have different vendors approaching us independently, each with their own selling points on what they could do to help us.
  • Lunch-and-learn sessions bring people up to speed on what questions to ask and get them thinking about who can generate answers.  It took us 10 months to get everyone together.
  • Division of Academic & Student Affairs has purchased EAB; concurrently, we’re working on LAP.  Continued conversations with campus partners will have to happen.

From Proof to Production:  Toward Learning Analytics for the Enterprise

  • Initial steps:  small sample sizes, predictions at 1/4, 1/2, 3/4 points in course, multi-step manual process
  • Goal 1: make it more enterprise-y.  Use large sample sizes (all student enrollments), frequent early runs (maybe daily), and an automated process requiring no more than one click (a rough sketch of such a daily run follows this list).
  • Currently in progress:  rebuild infrastructure for scale; daily snapshots of fall semester data; after fall semester ends look for the sweet spot.
  • Future goals:  refine model even more; segment model by population; balance between models and accuracy; refine and improve models over time; explore ways to track efficacy over time; once we intervene we can never go back to virgin state
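
Below is a minimal, hypothetical sketch of what the “daily snapshot plus one-click scoring” goal could look like; the file layout, placeholder data, and function names are assumptions for illustration, not NC State’s actual infrastructure:

    # Hypothetical daily snapshot-and-score job; paths, columns, and the scoring
    # step are placeholders, not the actual LAP/NC State implementation.
    import datetime
    import pathlib
    import pandas as pd

    SNAPSHOT_DIR = pathlib.Path("snapshots")  # assumed location for daily extracts

    def take_snapshot(today: datetime.date) -> pathlib.Path:
        """Pull the day's LMS/SIS extract and save it as an immutable snapshot."""
        SNAPSHOT_DIR.mkdir(exist_ok=True)
        extract = pd.DataFrame({"student_id": [], "logins_per_week": []})  # stand-in for the real query
        path = SNAPSHOT_DIR / f"enrollments_{today:%Y%m%d}.csv"
        extract.to_csv(path, index=False)
        return path

    def score_snapshot(path: pathlib.Path) -> pd.DataFrame:
        """Run the predictive model over a snapshot and return the alert report."""
        data = pd.read_csv(path)
        data["risk_score"] = 0.0  # stand-in for the real model's predictions
        return data

    if __name__ == "__main__":
        # Meant to run unattended once a day (e.g., from cron), collapsing the
        # proof-of-concept's multi-step manual process into a single command.
        snapshot = take_snapshot(datetime.date.today())
        report = score_snapshot(snapshot)
        print(f"Scored {len(report)} enrollments from {snapshot.name}")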

Jefferson Project

  • Why is JC seeking LAP implementation?  The first-time pass rate for Anatomy and Physiology is 54%.  Only 27% re-take it.  There is a 37% non-persistence rate (DFW).  We need to find ways to help students succeed.
  • How is it going?  We have a four-year grant.  The compliance letter came in May of 2015.  We implement the PREP program in October 2015, with LAP roll-out on 10/1/2016 and one year to test.  We use Student Participation System data and feed it into the system.
  • Why use SPS data?  It’s readily available; part of HLC Quality Initiative; less politically charged; shown to correlate with student success; clear map of data schema; data is very robust, more data there than we are presently using; data is “complete” (better than Bb data; less complete than original LAP design).
  • Each instructor will receive an Academic Alert Report.

My question:  have you considered integration of co-curricular data into your models?  YES!  We’re very interested in integration of co-curricular data, because it’s often a better indicator for student success than LMS data.  Vincent Tinto’s research clearly indicates this, but our implementation of this is probably a phase 3 or phase 4 thing.
