UC San Diego’s Process Palooza: Leaning Toward Success

This is the first session of the last day of the conference at 8:00 AM, so if anything below is unclear or muddled, it was NOT the presenter, but me!

Presenter

  • Mojgan Amini, Director Process Management & Continuous Improvement, IT Services, University of California, San Diego

Resources

Process Improvement Movement Engulfs UCSD

Backstory: Goal #5 of UCSD’s strategic plan is: “Creating an agile, sustainable, and supportive infrastructure by ensuring a dedication to service, people, and financial stewardship.” Various forms of the word “efficient” are used 11 times to further explain the goal.

  • Translation: be nimble, do more with less.
  • That means any savings in time or money gets reinvested directly into the university’s core mission.

Continuous Improvement Mindset

  • Lean Six Sigma (LSS) is all about eliminating waste and reducing defects.
  • A process-driven, data-centric, and people-oriented approach to achieving business goals.
  • Over 1,600 LSS practitioners at UCSD have earned Yellow, Green, and Black belts.

Every new employee in IT gets registered for LSS training.

Momentum was Building

  • Community to bring together change management, continuous improvement and project management.
  • Major ERP effort with process optimization at its core.
  • Investment in LSS training was increasing, and so were the significant outcomes of LSS projects.

Harness the Collective Knowledge

We had the skillset, the interest, and the need. All that was left was a way to bring LSS practitioners together. So we created an event called “Process Palooza,” a three-in-one event that included:

  • Process improvement competition – aka The Great LSS Race
  • LSS Learning tracks for everyone
  • Networking and community building

We weren’t quite sure what to expect, but we ended up with 500 people showing up to this event! Whiteboards and collaboration, no laptops. Opportunities for networking and making connections included networking lounges and booths where departments and workgroups could showcase their projects, provide info, or ask for help.

Real Processes, Real Results

The competition element of Process Palooza realized amazing results:

  • AS travel request process: $96K savings, 41% reduction in processing time, 50% reduction of rework, 50% reduction of advising time, increased customer satisfaction by 80%, increased process understanding by 100%.
  • Transportation Services new employee commute process: by implementing only a small number of improvements, achieved $105,000 in labor savings and a 30% reduction in in-person processing. Full implementation of improvements is slated for Fall 2018.

Business Excellence Community of Practice

A vehicle to carry the movement forward and continue growing both a continuous improvement mindset and our professionals. Benefits to the university include: CoPs can help you scale and create a support network for members, uniting change management, project management, continuous improvement, and more.

  • Started in Fall 2017
  • Executive Board and Committees
  • Over 170 Members
  • Monthly events
  • Mailing list and Slack channel
  • http://becop.ucsd.edu

Enterprise System Renewal

Our LSS process is reshaping the way UCSD does business. This includes: space management, financials, document management, admissions, identity management, HR/payroll, faculty review, data & analytics, research admin, student information, degree progress.

Lean Bench: software matters, but processes take center stage. We crowdsourced process improvement talent: a select set of expert campus practitioners of the LSS methodology who engage in and lead crucial efforts around process improvement, operational excellence, and business efficiency.

We track and measure all implementations and store this information in one place.

A Collaborative IT-IR Model to Enhance Data-Informed Decision Making

I attended this session because I knew it would include a lot of useful and practical information, and the presenters are really good at what they do 🙂 As I type this, people are filtering into the session…it doesn’t look like it’ll be a standing room only session, but it will be well-attended. The speakers moved through their materials quickly, and conversation was pretty lively…any errors or omissions are entirely my own!

Presenters

  • Timothy Chester, VP for IT, University of Georgia
  • Jonathan Gagliardi, Associate Director, American Council on Education (ACE)
  • Gina Johnson, Assistant Executive Director for Partnerships & Membership, Association for Institutional Research (AIR)

Timothy Chester, Jonathan Gagliardi, and Gina Johnson

Gina framed the session from the perspective of collaborations between IT and IR departments and breaking down silos between different units. She also challenged us to stick around and not bolt for the door, because the session would include collaborating with neighbors 😉 She then talked a bit about collaborations between AIR and EDUCAUSE.

Agenda

  • Areas for collaboration
  • Tools for developing collaboration
  • Discussion & Q&A
  • Final Thoughts

The higher ed community needs to get on the same page about using data effectively! An AACRAO/ACE study of American college presidents showed that:

  • Very few college presidents identified the use of IR/evidence as an area of future importance
  • People from the same campus held different perspectives on data access, quality, and future use
  • Yet, areas considered to be important all require a strong research and evidence base

Jonathan Gagliardi on his role & perspectives about using data

Higher ed in the US is facing unprecedented challenges to improve access and attainment despite stagnant or diminished funding. We can use data to uncover and fix structural issues we have.

  • Things are getting more expensive, and students are footing more of the bill.
  • Not a lot of huge gains in graduation rates (based on Federal numbers)

These forces have eroded the public confidence in higher education.

Many campuses face challenges related to data use: data quality, accessibility, accuracy; data use not as widespread as it should be; data are siloed; lack of resources for training and professional development; resource limitations.

Spectrum of analytics

The following aspects of the spectrum of analytics should be done continuously and collaboratively:

Descriptive > Diagnostic > Real Time > Predictive > Prescriptive

Data need to be timely, accurate, integrated, and relevant regardless of the nature of the analytics being used. One of these dimensions may be prioritized over the others, depending on the insight needed.

“Analytics is a translation function”

Maturation of analytics functions requires an elevated cross-divisional model that takes into consideration many important factors:

  • Culture & politics
  • Technical infrastructure
  • Resources
  • The IR/IT nexus

We need a strong relationship with IT to make all this happen! IR and IT should strategically collaborate.

Timothy Chester on Areas of Collaboration

  • 30% is technical, 70% is about people!
  • IT vs. IR legacy silos and structures need to go

What do we need to do when we get a seat at the table?

From Build to Buy to Buy AND Build

IT’s focus historically (1960s – 1980s) was to build systems, but over time those systems became highly layered. Later (1990s – 2010s), our focus was to buy systems from vendors that we’d have to “bolt on to.” The latest phase (2010s – present) is composing systems, i.e. we buy the pieces we need from vendors and then spend our time integrating (building) them. IT and IR are in a great place now to coordinate.

Business Process Management: Discover, Document, Improve, Implement. BPM depends on using data effectively. Bringing units together and reducing redundancy will help your internal processes.

Data governance is a big piece of the puzzle as well. Discover > Define > Apply > Measure & Monitor. This process is an ongoing effort.

Having a seat at the table is having an opportunity to influence decision-making. Trust, credibility, respect and personal relationships are more important than hierarchical influence!

The Credibility Cycle

Rinse & repeat the following:

  1. Initial Credibility
  2. Resources & Expectations
  3. Outcomes
  4. Results

If you get poor results instead, you fall into:

  1. Reduced Credibility
  2. Cycle of Overcommitment & Underperformance
  3. Diminished Authority

The better you are at “blocking and tackling,” the more credibility you’ll have for building effective relationships and being successful.

The Best Teams are Diverse Teams

Every good team needs people who are good at the following:

  • Generate ideas
  • Promote ideas
  • Fulfill ideas
  • Validate ideas

IT/IR are becoming business organizations…we need to grow beyond technical expertise alone and become strategic advocates for change. Your job is more than just dropping data off at the table; you need to provide candid advice on the issues of the day. Senior leaders want advocacy based on analysis.

Questions for Discussion

  1. Using the duties & functions of IR to orient yourself to the focus of IR – what are some areas of crossover in work between IT and IR (common KSAs)?
  2. What are some areas of clear delineation between the work of IT and IR (unique KSAs)?

The session broke up into groups to discuss these questions and then shared with the larger group.

The Bias Truth of AI Models

Arrived about 20 minutes early to Thursday’s keynote. The Bellco Theater is filling up more slowly than yesterday…maybe folk were out late last night? It’s 7:59 as I type these words, one minute before the presentation, so I guess it’s just gonna be fewer people in the hall, which will make leaving a little easier. I’m keen to hear this presentation, as the technology we use is NOT a neutral force in the world.

Near-empty Bellco Theater

Presenter

  • Teddy Benson, Director of Data Integration, Walt Disney World, Parks and Resorts

Started with a quick informal poll of audience members who are studying or working in the field of AI (it was not a huge number of folk). Question: How many think that bias, in general, is bad? A mix of half-asleep hands goes into the air, with most attendees abstaining.

What is AI?

To Teddy (as a youngster), AI was exemplified by C-3PO and R2-D2 from the Star Wars movies. AI borrows characteristics from human intelligence and applies them as algorithms in a computer-friendly way. Machine learning is a subset of AI, and deep learning is a subset of machine learning. The neural-net thread of AI goes back to the 1950s, when the Navy funded Frank Rosenblatt’s work to build a neural-net computer. Neural nets are layered, which allows for nuanced, creative problem-solving. AI has become popular in the last 10 years because once-theoretical algorithms became cheap to run on cloud platforms such as Azure and AWS.
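
To see what “layered” means in practice, here’s a minimal sketch (my own illustration, not the presenter’s) of a two-layer neural net forward pass: each layer applies a linear map plus a nonlinearity, and stacking layers is what lets the network capture nuance a single layer can’t.

```python
# A toy two-layer network forward pass; weights are random and untrained.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)

X = rng.normal(size=(4, 3))                      # 4 samples, 3 features each

W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)    # layer 1: 3 inputs -> 5 hidden units
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)    # layer 2: 5 hidden -> 1 output

hidden = relu(X @ W1 + b1)                       # layer 1 builds nonlinear features
output = hidden @ W2 + b2                        # layer 2 combines those features

print(output.shape)                              # (4, 1): one prediction per sample
```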

Today what we use is known as narrow or weak AI – AI designed to perform a narrow task (e.g. facial recognition, internet searches, driving a car).

In the future we may see the creation of AGI – Artificial General Intelligence – designed to successfully perform any intellectual task that a human can do.

Passive AI: search auto-fill, delivery of targeted ads, entertainment recommendations, product recommendations, etc.

Active AI: IBM’s Watson info retrieval, knowledge representation, automated reasoning; Salesforce’s Einstein smart CRM; Amazon’s Alexa virtual assistant; Microsoft’s Cortana personal digital assistant; etc.

AI Types

Symbolic AI: rules and knowledge have to be hand-coded and human-readable.

Non-Symbolic AI: lets a pre-generated model perform calculations on its own. Downside: it’s a black box…hard to understand what’s happening inside the AI and/or the model.

Where are we in the evolution?

  • Type 1: Reactive machines (chess playing)
  • Type 2: Limited memory (self driving cars)
  • Type 3: Theory of mind (understanding the world)
  • Type 4: Self-awareness (understanding consciousness)

What is Bias?

Attaching positive or negative meaning to elements in our environment, based on personal or societal influences that shape our thinking.

Conscious bias means being aware of the bias: intentional and responsive.

Unconscious bias, on the other hand, means being unaware of the bias being introduced into the system.

What is AI Bias?

When a machine is biased, it is unable or less able to adapt to various training models, preferring one route as a primary mechanism.

This makes the developed AI algorithm rigid and inflexible, unable to adjust when a variation appears in the data at hand. It is also unable to pick up on the subtle complexities that define a particular data set.

As our use of AI continues to grow, the problem of bias in AI will remain pervasive – and its full extent is not yet realized.

A Google image search (circa 2012) for “CEO” suggested that only 11% of company CEOs were women, when the actual number was about 27%. For a similar search on “telemarketer,” the opposite effect held (50% versus an actual 64%).

  • Was it the model causing the bias?
  • Was it the data that the model was given?

In 2016, Microsoft released Tay.ai to the world, and within 24 hours it was reduced to spouting racist and other awful statements. What caused this chatbot to become a xenophobic Nazi-lover? LEARNED BEHAVIOR. Tay learned its behavior from the people who were tweeting at it – taking its conversational cues from the WWW. Given that the internet can be (and often is) a massive verbal garbage firehose of the worst parts of humanity, it is no surprise that Tay began to take on those characteristics.

Word2Vec finds associations between words by representing them as vectors in a shared space and tying related words together. Applied to language translation, the theory supposes you can use these cascading vectors to translate between languages. Google’s translation of online articles fails at this at times due to vector bias (data influence/inference): nuances in language are not weighted appropriately in vector space.
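
To make the vector idea concrete, here’s a minimal sketch using hypothetical toy vectors (hand-picked for illustration, not a trained model; real Word2Vec embeddings have hundreds of dimensions) of the arithmetic these embeddings enable, and of how bias rides along with it:

```python
import numpy as np

# Hypothetical 3-d "embeddings" chosen by hand for illustration only.
vecs = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "apple": np.array([0.5, 0.5, 0.1]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# The classic analogy: king - man + woman should land nearest to queen.
target = vecs["king"] - vecs["man"] + vecs["woman"]
best = max((w for w in vecs if w not in ("king", "man", "woman")),
           key=lambda w: cosine(vecs[w], target))
print(best)  # queen

# The same arithmetic reproduces bias when the training text is skewed:
# with embeddings trained on biased corpora, "doctor" - "man" + "woman"
# has famously landed near "nurse."
```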

Types of Bias

  • Data-driven: a missing or one-sided data set (see the sketch after this list)
  • Bias through interaction: learned behavior
  • Emergent bias: providing matching data to requested data
  • Similarity bias: matching like for like
  • Conflicting goal bias: stereotyping results
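
Here’s a minimal sketch (my illustration with made-up data, not the presenter’s) of the data-driven case: train a classifier on a one-sided sample and it under-predicts the minority class even on a balanced test set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# One-sided training data: 95 examples of class 0, only 5 of class 1.
X = np.vstack([rng.normal(loc=0.0, size=(95, 2)),
               rng.normal(loc=1.0, size=(5, 2))])
y = np.array([0] * 95 + [1] * 5)

model = LogisticRegression().fit(X, y)

# A balanced test set reveals the skew: far fewer than half of the
# predictions are class 1, even though half the test examples are.
test = np.vstack([rng.normal(loc=0.0, size=(50, 2)),
                  rng.normal(loc=1.0, size=(50, 2))])
print((model.predict(test) == 1).mean())  # well below 0.5
```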

How to Prevent AI Bias?

  • Know your environment
  • Know your data and where it came from
  • Verify your model logic – code reviews
  • Spot check during training of models

Building Your Digital Transformation Ecosystem with LTI Advantage

This session moved pretty fast (and included some very dense slides which were impossible to capture in text), so any omissions or mistakes in my notes are entirely my fault!

Presenters

  • Rob Abel, CEO, IMS Global Learning Consortium
  • Michael Berman, Chief Innovation Officer and Deputy CIO, California State University, Office of the Chancellor
  • Vince Kellen, Chief Information Officer, University of California San Diego
  • Jennifer Sparrow, Senior Director of Teaching and Learning Technology, The Pennsylvania State University

Resources

What is LTI Advantage and IMS Global?

LTI Advantage (and Insights, for analytics) is a strategy as much as an interoperability standard. It’s an integration standard for LMSs and the tools that connect to them. It’s based on OAuth 2.0 and JSON web tokens, plus extensions for names & roles provisioning, assignment and grade services, deep linking, and custom extensions.
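
As a rough sketch of what that looks like on the tool side (not a full LTI implementation; the token, key, and client ID below are hypothetical placeholders), the platform POSTs a signed JWT to the tool, which verifies the signature and reads the standard LTI claims:

```python
import jwt  # PyJWT

def read_launch(id_token: str, platform_public_key: str, client_id: str) -> dict:
    # Verify the RS256 signature and the audience, then pull out two
    # of the standard LTI 1.3 claims.
    claims = jwt.decode(
        id_token,
        platform_public_key,
        algorithms=["RS256"],
        audience=client_id,
    )
    return {
        "message_type": claims["https://purl.imsglobal.org/spec/lti/claim/message_type"],
        "roles": claims["https://purl.imsglobal.org/spec/lti/claim/roles"],
    }
```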

There are 25 LTI Advantage early adopters, which include the usual suspects like D2L, Canvas, etc.

LTI Insights

Which LTI-enabled tools are being launched?

  • How frequently and when?
  • For which courses?
  • Are the tools actually being used? By how many unique users?
  • What are the usage trends?
  • What types of devices? Mobile?
  • Which LTI-enabled tools received PII, and what information is shared, exactly?
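
As a rough sketch of how you might answer questions like these from a log of launch events (a hypothetical log format of my own invention; this is not the LTI Insights API):

```python
import pandas as pd

# Hypothetical launch-event log: one row per LTI tool launch.
launches = pd.DataFrame([
    {"tool": "Gradescope", "course": "CSE 101", "user": "u1", "device": "desktop"},
    {"tool": "Gradescope", "course": "CSE 101", "user": "u2", "device": "mobile"},
    {"tool": "Piazza",     "course": "HIST 20", "user": "u1", "device": "desktop"},
])

# Which tools are being launched, and by how many unique users?
print(launches.groupby("tool")["user"].nunique())

# What share of launches come from mobile devices?
print((launches["device"] == "mobile").mean())
```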

Why is this important?

LTI addresses 5 of the EDUCAUSE 2018 Top 10 IT Issues. Our orgs are often working with hundreds of suppliers, and integration is a BIG challenge.

JS: If a tool is IMS-compliant, it’s much easier for us to fast-track tools into our ecosystem.

MB: in our case, our system is a lot more decentralized so we’re trying to explain the value that LTI brings to our campuses.

VK: we want to make sure that our entire edtech ecosystem is LTI-compliant. It’s complicated and it’s not owned by any one entity. Standards of integration will help us to deliver a better teaching and learning environment.

JS: having the data streams come out in a way that does NOT require a lot of manipulation is a huge benefit for us; it allows us to be more precise with our predictive analytics and helps us get our students to graduation.

RA: integration and analytics together – which LTI provides – allow us to do our jobs more effectively. Any supplier or institution can participate, which is probably unique to higher ed.

VK: data integration is a real rate limiter.

Question: what about extending LTI beyond the LMS, say, to the SIS? We’re working on that via the IMS EduAPI, a set of industry-standard, extensible APIs to support user provisioning, a common source ID, and administrative data exchange.

Service Boot Camp: From Service Ownership to Product Management

My intent in coming to this session is to get a better handle on the IT Service Management wave that’s (seemingly) washing over higher education IT. It’s definitely a hot topic in 2018, and I know these presenters know what they’re talking about and are good at what they do. In my mind, I’ve distilled ITSM down to “categorize and define the things you do for the people you serve.” From this comes a common language by which you can talk about, plan for and quantify that work. Well, at least that’s what it looks like to me from the outside…

Presenters

  • Todd Jensen, IT Service Management, University of Nebraska – Lincoln
  • Luke Tracy, Enterprise Architect, University of Michigan – Ann Arbor
  • Chas Grundy, Manager, Product Services, University of Notre Dame

Resources

Morning Section

Question to the group: How many services does your organization offer? The service document calls out 8 service categories, 52 services, and a bunch of service offerings.

Service Catalog: Service Categories > Services > Service Offerings

Analogy: you can think about a service offering as a box sitting on a services shelf
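
To make the hierarchy concrete, here’s a minimal sketch (my own modeling, with hypothetical example names) of the three levels, using the SO and SOM roles defined next:

```python
from dataclasses import dataclass, field

@dataclass
class ServiceOffering:
    name: str    # e.g. "Zoom" -- the "box" sitting on the shelf
    som: str     # the Service Offering Manager

@dataclass
class Service:
    name: str    # e.g. "Video Conferencing" -- the "shelf"
    owner: str   # the Service Owner
    offerings: list[ServiceOffering] = field(default_factory=list)

@dataclass
class ServiceCategory:
    name: str    # e.g. "Communication & Collaboration"
    services: list[Service] = field(default_factory=list)
```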

Service Owner (SO) is accountable for the delivery of an IT service and the service offerings within.

  • Ensures that the service receives strategic attention and appropriate resources
  • Is responsible for the service as a whole through its entire lifecycle and is accountable to the person in charge of overall IT service delivery
  • SO skills include communication, strategic thinking, leadership, service-level understanding, resource allocation, listening, storytelling, business analysis, financial planning & budgeting, metrics, public speaking, relationship management, service assessment, etc.

Service Offering Manager (SOM) defined: responsible for the delivery of an IT service offering.

  • Purpose of this role is to ensure comprehensive, efficient, and transparent management of and communication about the IT service offering.
  • Accountable to the SO for the design, implementation, and ongoing maintenance and support of the offering.
  • SOM Skills: everything associated with running a service offering, more tactical than the SO.

Where do SOs and SOMs sit in the org chart? This is not in the ECAR org chart, and it really depends on your organization.

Service Offering Manager (SOM) = Product Manager (for the purposes of this presentation).

A service owner generally thinks at a higher level than a product manager. They manage the product mix (ecosystem) for the university and customers. They plan the service strategy, roadmap, and business plan. They also consider which product is right for which customer/use case, and determine whether the products are appropriately managed – including metrics.

Service Reviews 101

A service review is an annual look at what we have, where it’s going, why we think so, and what we should do. At Michigan, they call this a “MESA,” or Michigan Enterprise Strategic Assessment. A hard copy of a document was provided to help in conducting this assessment.

Structure: SOs build the service review and present it to Standards & Architecture each year. S&A will set a schedule for service reviews so you have time to prepare. A review should take a few hours to complete, depending on the SO’s knowledge of the service, and 15-20 minutes to present.

  • Identify/define the service owner name
  • Make a list of the service offering ecosystem, which includes details about support model, user adoption, incidents, annual spend, FTE to deliver offering
  • Service roadmap lays out the life cycle from Evaluate > Ramp up > Fully Available > Ramp Down > Retire
  • Statuses: Evaluating, Recommended, Not Recommended, Retired
  • Product Mix: MESA. Places each product into one of the categories based on where it is in the service or product lifecycle.

Market Analysis

  • Industry: what is happening in general industry, what are the trends and expectations for where it’s heading. Who are the leaders and innovators? Where do our offerings fit?
  • Higher ed: what offerings do our peers use? Use the AAU Private Universities standard benchmark group if possible. What are the trends in higher ed?
  • What other projects do we have going on; what dependencies do we have?
  • Benchmarking: look at what our peers are doing and reach out to them when possible.

Service Checklist

  • Information security review
    • Privileged accounts
    • Cloud security assessment (HECVAT)
    • Data handling
  • Service catalog is up to date
    • ID appropriate backups
    • Business applications
    • Dependencies

Customer Value Model

Treacy and Wiersema model for strategic differentiation.

Customer Intimacy / Product Leadership / Operational Excellence

Organizations that focus on ONE of the above traits performed best.

  • Customer Intimacy: Total solution, tailored to the customer’s needs. “They are the experts in my business,” “I got exactly what I needed,” “They do everything with me in mind.”
  • Product Leadership: Continually redefine state-of-the-art. “Premium priced, but worth it,” “Can you believe their new product?” “I would never settle for anything less.”
  • Operational Excellence: Reliable, best price, hassle-free service. “Great price and quality,” “Their products last and last,” “So convenient, in and out in a flash,” “Consistency is their middle name.”

Four rules of the Customer Value Model

  1. Provide best offering in the marketplace by excelling in a specific dimension of value
  2. Maintain threshold standards on other dimensions of value
  3. Dominate your market by improving value year after year
  4. Build a well-tuned operating model dedicated to delivering unmatched value

Elements of the Operational Machine

Each of the following elements includes the three traits of the Customer Value Model described above.

  • Customer Expectations
  • Operating Structure
  • Core Processes
  • Culture
  • Mantra
  • Formula

Service Strategy

Strongly encourage using a template; consistency makes it easier for readers to digest.

  • What is it?
  • What are the needs?
  • What is the strategy?
  • What are the initiatives?
  • What does success look like?

Product Decision Framework

  1. Clearly define the service: audience, use cases, etc.
  2. ID the products you want to promote, excluding any you don’t want to encourage
  3. Pick one “preferred” or default product that people should use unless they meet one of the other cases/requirements
  4. ID differentiating use cases that result in a different product choice
  5. Build a feature chart/matrix of all recommended solutions
  6. Design the decision tree so it always results in a single product recommendation (if needed) – see the sketch after this list
  7. Produce a one-page guide for training and distribution (if needed)
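
Here’s a minimal sketch (hypothetical products and use cases, not any campus’s actual framework) of steps 3-6: differentiating use cases are checked first, then the preferred default, so the tree always lands on exactly one recommendation.

```python
def recommend_storage(has_regulated_data: bool, shares_externally: bool) -> str:
    # Differentiating use cases (step 4) are checked before the default (step 3).
    if has_regulated_data:
        return "Secure Research Storage"   # compliance trumps convenience
    if shares_externally:
        return "Cloud Share"               # external collaborators need access
    return "Campus Box"                    # the "preferred" default for everyone else

print(recommend_storage(has_regulated_data=False, shares_externally=True))
```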

Afternoon Section

This section is more about the service offerings management component; what does it mean to deliver services? We’re talking about the component of the Service Strategy that is…

Operation: Strategy > Concept > Deployment > Manage > Retire

Reviewed the Product Management Value Proposition Lifecycle

“The product manager is the CEO of the product.”

Product Management is the intersection of Users, Tech and Business

Most of the product lifecycle is taken up by the manage phase, because a product may be in use for many years.

The group then went through a product scorecard, provided as a hard-copy handout.

  • For each category, rate how well it meets the need
  • Fill in notes to explain the ratings
  • Create a plan for scores you want to change

Manage

  • Use your own product!
  • How do users feel about my product?
  • How do people learn how to use it?
  • What features do they want?
  • Is this still the right product?
  • Use listening posts: blogs, news sources, and more.
  • Build the vendor relationship: provide feedback or feature requests, stay in contact, join customer advisory boards, etc.
  • Change management matters!
  • Use metrics
  • Use chocolate bars

Tactics for Retiring a Service Offering

When the product can no longer meet the needs of the users, the technology, and the business, it is time to retire the product.

Language matters!

  • Retirement vs. End of Life
  • Sense of opportunity vs. Sense of loss
  • Etc.

We reviewed a worksheet handout on retiring a service offering. Consider customer use:

  • Use case or core functionality
  • Hard requirement or nice-to-have?
  • Alternative solutions or approaches

Change Management

  1. All at Once: hard cutover. Best when compatibility is an issue or support is untenable.
  2. Golden Path: promote better alternative, organic transition; seek critical mass. Best when end users strongly prefer the new solution.
  3. Adoption Curve: soft launch – opt-in first, then replace for all users. Best when you need to learn or adjust over time.
  4. Phased: Live in both worlds, roll out by group or location. Best when solutions can coexist and support is biggest concern.
  5. Ferry Boat: Pass costs along to remaining users; costs increase as users abandon ship. Best when a few users are clinging to the old solution.

Listen for, Pivot around, Help solve…THE PROBLEM.

…and then we played the Product Management Game, and I got to play the role of a CIO!

The Product Management Game

Me as a CIO

Debrief: was this a realistic experience? Did it include experiences you could relate to? Yes!

Wrap Up, Key Takeaways

  • Define terms
  • ID service owners
  • Create tools for service ownership
  • Provide training for service owners
  • Establish processes to sustain the change
  • Measure, evaluate, and improve

Continuing Adventures in Higher Ed & Technology