Categories
Education Technology, Uncategorized

EDUCAUSE 2019 mega-post

My annual EDUCAUSE “post of posts.” Any errors, omissions, or misrepresentations are completely on me; please let me know if you find anything truly awful that needs editing. Enjoy!

Monday, October 14 (Pre-Conference)

Tuesday, October 15

Wednesday, October 16


Where to Start with Student Success Initiatives

Presenters

  • D. Christopher Brooks, Director of Research, EDUCAUSE
  • Leah Lang, Director of Analytics Services, EDUCAUSE
  • Kathe Pelletier, Director of Student Success Community Programs, EDUCAUSE

Agenda

  • Buzz about student success initiatives?
  • Brief tour of data and definitions
  • Student success maturity and tech deployment mashup
  • End users, students & advisors

Are Student Success Initiatives Worth It?

Presenters tossed this question to the audience via a poll:

  • Recruiting & retention tool
  • Surface implementation of StarFish
  • Fragmented, not everything “fits into” student success

Some headlines say guided pathways are effective, while others say early-warning systems are a mixed bag…sometimes they work and sometimes they don’t. The jury is still out on nudging efforts. Implementation is NOT just technology work; it’s human work. And implementation is not the same as adoption! Adoption takes significantly more effort.

DCB shared a slide about some of his initial research on IPAS. Central IT units were among the first groups able to put these kinds of systems into practice (i.e. they manage the course systems and other enterprise systems). Strong leadership was a big contributing factor in successful projects; one example is presidential support of efforts at Colorado State. The end users (faculty and students) were comparatively left out of the process in these initial rollouts. As IPAS has evolved over the last few years, working with NASPA and other higher education organizations has helped take concerns about (and limitations of) the growing use of student success tools into account. There are significant differences among Student Affairs, IR, and IT professionals in how strongly they agree with statements about data and analytics.

Today’s Mashup: CDS and eTrack

  • CDS gathers data on IT service maturity, tech deployment, etc.
  • eTrack gathers data on student impressions and use of technology.

LL shared the kinds of CDS survey questions that measure student success technology maturity and technology deployments, both for students and faculty. We’re fairly well established on student success maturity (leadership support, collaboration & culture, process and decision-making), and there is a rubric for each dimension of maturity. Degree auditing is the most mature of the deployed technologies; it’s “mainstream” at 61%–80% adoption.

There are four key end-user groups for these services: advising and student affairs, faculty, students, and IR/leadership. Areas at the higher end of the maturity scale are those that have deployed more of the technology.

Technologies for Students

  • Degree audit
  • Credit transfer & articulation systems
  • Education plan and tracking (popping up in the last two years)

DCB talked about the focus of initiatives in support of student success. Groups researched include 1st year students, sophomores, transfer-in students, student athletes, students of color, LGBTQIA students, nontraditional students, and 1st generation students. These populations were measured against student pipeline, academic progress & success, efficiency of degree completion, career pathways & postgrad outcomes, and student ability to afford higher education. Longitudinally speaking (2017–2019), students have positive ratings of the usefulness of online student success tools. Interestingly, I noticed that the highest-rated items are those that address traditional “checklist” items that are administratively important to graduation.

Technologies for Advising and Student Affairs

  • Student advising and case management (big gaps in maturity here)
  • Degree auditing

Staff across the institution have access to student-level data from institutions’ early-alert systems. The impact falls mostly on academic advisors, faculty advisors, counseling services, FYE staff, tutoring services, and senior leaders, and to a lesser extent res life staff and recreation services. Interventions in use at institutions to improve student success include academic advising, referrals to student services, counseling, intrusive advising, program development, roommate mediation/placement, and “nudge” campaigns. We may need to rethink how we do advising.

Takeaways and Lessons from the Field

  • If you don’t have degree audit, you should do that!
  • Deployment and adoption are not the same thing.
  • A continuous-improvement culture will really help you determine where you need to build, pivot, or trash altogether.
  • Diverse and visible leadership is important to success.
  • This is human work.

Using the AIR-EDUCAUSE-NACUBO Analytics Statement to Inspire Urgency

Presenters

  • Keith McIntosh, CIO, University of Richmond, @Keith_McIntosh
  • Elizabeth Clune-Kneuer, Director of Analytics and Reporting, Prince George’s Community College, @eliza_c_k
  • Robinson Neidhardt, Strategy Director of Technology, Association for Institutional Research (AIR), @rneidhardt
  • Betsy Reinitz, Director of Enterprise IT Programs, EDUCAUSE, @btippens
  • Lindsay Wayt, Director, Analytics, NACUBO, @WaytNoMore

Resources

“Analytics is the use of data, statistical analysis, and explanatory and predictive models to gain insight and act on complex issues.” Why is analytics important? It enables leaders like you to be data-informed.

BR provided an introduction to the statement, which has six principles:

  1. Make an institutional commitment to analytics
  2. Prepare for detours on the road to success
  3. Analytics is a team sport: build your dream team
  4. Invest what you can
  5. Analytics has a real impact on real people: avoid the pitfalls (ethics, data security, privacy)
  6. The time to act is now

RN: Keith, you’ve improved use of analytics on your campus, how did you get started?

KM: I’ve been there since 2016. We had no internal capacity, so our admissions leadership outsourced this aggressively. Once we got the process rolling, our IT and IR teams initiated our data warehouse project.

LW: can you talk about analytics journey, Elizabeth?

ECK: we have a blended responsibility for our campus data management and BI processes. Training was about three hours and very intensive, and it actually caused more harm than good (“it’s too complicated,” “I don’t know where to begin”). I took a risk by saying “we’re not doing this right now.” We instead provided open office hours for folk who needed assistance with data, offering consulting for people who had questions they needed answered. We set up some working groups, anchored by professionals who are custodians of particular data sets. This helped people feel more comfortable and less isolated when working with a range of analytics data. It also generated additional work for us, which is actually a good thing, because we’re able to help them move the conversation forward in more advanced ways.

RN: how did you build capacity for campus to have more productive conversations about data?

ECK: we asked what do people want to know? What will help you understand this better? We created a series of data discussions about things people should know more about, i.e. enrollment perceptions versus reality. The president attended our first meeting, which demonstrated the value of this effort to the campus. Our second meeting coincided with our KPI/planning efforts. Instead of doing two topics, we decided to do “KPI Cafe” meetings to address things at a pace we could handle. This had the side effect of being a good way to promote our efforts and raise our profile as leaders in this space. This was a great professional development opportunity for our younger staff. We worked with our marketing team to help us make our messaging more digestible (and branded) via infographics. This has allowed us to make better products.

LW: Mac, what’s been your approach to addressing analytics on your campus?

KM: in my role as a leader, I need to listen to what’s happening in cabinet and what my colleagues need to do their job well. Strategically, what does the campus need with technology? I need to be a regular participant and provide the logistics needed to make sure the toolset works. We also have to provide training and support. In my first seven cabinet meetings, we did not discuss data once. I had to convince folk that data IS a part of your job. We’re looking at having more conversations with our advancement team, similar to what we did for enrollment management before. We needed to create governance around our data, because there are still ownership concerns (it’s institutional data, not your personal data). Just in the last week, we created a data governance committee, and we now have a monthly meeting for our IR, IT and planning office.

Question: Mac, can you tell the story of your 18-month journey of getting folk to understand the data governance and stewardship needs on your campus? KM: We hired a couple of consultants to help us wrap our minds around this; it would have been hard to do this ourselves. We needed to understand exactly how we’re doing business today, and the gaps preventing us from getting to where we want to be in the future. I gave a presentation on the results of this effort to cabinet, and they blessed our way forward. From there, we were able to establish data governance committees. We’re still working through data siloing, but there is at least an understanding that this is a long-term process.

ECK: we needed to make institutional changes that took time for our community to understand. The efforts need to be teachable, especially where we run into problems of implementation.

Question: building your dream team and investing both take money. How did you convince people to spend money on these things?

KM: stewardship is a pillar of our campus’ strategic plan. We want to improve things without being “additive,” can we do things more efficiently? In order to do this, we need to have people who are honest data brokers who can handle that.

ECK: we have two teams looking at analysis and effectiveness. Lines between our groups have blurred. Each team needs to be able to support the other. We’ve done focus groups to help us understand the assessment team’s analytics data. Asking for help is…helpful! I’ve moved analysts from one team to the opposite side in certain cases (leading versus interpreting data analysis efforts).

Question: talk about your efforts around transparency.

ECK: we talk about what we’re going to look at, it’s not a black box. Once we unpack that, people are generally more comfortable with talking about our work. We talk about these things in our data stewardship meetings, which also helps.


Advising Office of the Future

Presenters

  • Ed Venit, PhD, Managing Director, Strategic Research, EAB
  • Jammie Wilbanks, AVP for Academic Success, Wiregrass Georgia Technical College

EAB provides consulting services and products around enrollment, student success, and student services. To kick things off, Ed shared a quote from a letter Rutherford B. Hayes wrote to his family about his advisor, which introduced the topic of advising pretty well.

Millennials are now adults; we’re now working with Gen Z and “post-traditional” students. Some broad generalizations about these groups were shared: Gen Z students are more independent, are financially savvy, have never known a world without wifi, etc., while post-traditional students are older, work, attend part-time, etc. The nature of the advisor has changed, too. Traditionally advisors helped with registration and some specialization; today they’re more likely to also help with career prep, academic performance, and financial well-being (aka overall success and holistic guidance).

Today, advising is a network of centrally managed success advisors committed to driving measurable student outcomes. Proactive case management: approach student first, assigned caseloads, access to data and technology. Five steps in comprehensive and continuous approach to student support:

  1. Advisor identifies risk factors
  2. Identifies critical times for outreach
  3. Executes outreach
  4. Advises students in person and refers as necessary
  5. Closes the loop and monitors whether students have improved
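The five steps above are organizational rather than technical, but as a rough illustration they can be sketched as a case-management loop in code. This is a hypothetical sketch, not EAB’s actual Navigate logic; the risk thresholds and field names are all assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Student:
    name: str
    gpa: float
    credits_attempted: int
    credits_earned: int
    flags: list = field(default_factory=list)

def identify_risk_factors(student):
    """Step 1: flag students whose indicators suggest they may need support.
    (Thresholds here are illustrative assumptions.)"""
    flags = []
    if student.gpa < 2.0:
        flags.append("low GPA")
    if student.credits_attempted and \
            student.credits_earned / student.credits_attempted < 0.67:
        flags.append("low completion ratio")
    return flags

def case_management_cycle(caseload):
    """Steps 2-5: outreach, advising/referral, and closing the loop
    for an advisor's assigned caseload."""
    for student in caseload:
        student.flags = identify_risk_factors(student)  # 1. identify risk factors
        if student.flags:
            # 2-3. this is a critical time for outreach; execute it
            print(f"Outreach to {student.name}: {', '.join(student.flags)}")
            # 4. advise in person and refer as necessary (stubbed out here)
        # 5. close the loop: flags are recomputed on the next cycle,
        #    which shows whether the student has improved
```

Re-running the cycle after grades update is what “closes the loop” in step 5: a student whose flags disappear has measurably improved.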

Jammie talked about Wiregrass Technical College; they’ve been an EAB campus since 2016 and a Navigate customer since 2017. Jammie shared how some students were making multiple unnecessary visits to various campus offices. My thinking is that perhaps their internal processes are bureaucratic and students aren’t getting the answers they need? But I digress…

Wiregrass’ journey to coordinated care: in 2014, they had a decentralized faculty advising model. In 2015, they moved to a centralized, professional advising model. In 2017, they added Navigate to improve the student experience in advisement sessions (recording notes about each encounter). In 2019, they added coordinated care to add workflow and student support.

They also modified their Change-of-Major and Dual-Major requests. The original paper-based process took up to two weeks to complete. Old workflow: advising > financial aid > VA office > registrar > admissions > advising. New workflow: advising creates a case and the exact same chain of approvals occurs, only electronically. This eliminated 5,904 student office visits. A similar situation existed for SAP appeals: the old process had at least six steps and could take days or months to complete, while the new process takes just days, with much less student “hustle.”
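Since the approval chain itself did not change, the electronic version can be pictured as the same chain run as a routed case. A minimal sketch, assuming a generic case object (the office names come from the session; everything else here is hypothetical, not Navigate’s actual API):

```python
# The change-of-major approval chain from the session; only the transport
# changed (a routed electronic case instead of a walked-around paper form).
CHANGE_OF_MAJOR_CHAIN = [
    "advising", "financial aid", "VA office",
    "registrar", "admissions", "advising",
]

def route_case(case, chain=CHANGE_OF_MAJOR_CHAIN):
    """Send the case to each office in order; each office signs off
    electronically instead of requiring a student visit."""
    for office in chain:
        case["approvals"].append(office)  # stand-in for an office's sign-off
    case["status"] = "complete"
    return case

request = {"student": "J. Doe", "request": "change of major",
           "approvals": [], "status": "open"}
route_case(request)
```

Each sign-off that happens inside `route_case` is one student office visit that no longer needs to happen, which is where the eliminated-visits number comes from.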

How does advising cross over to other departments and how can technology break down existing silos? One answer provided by an audience member: better integration of systems can help. Sharing information between advisors was also suggested as a means of avoiding student delays because of wrong advising.

Retention improved from 61% to 72% in two years.

Next priority is working toward fully coordinated care using Navigate: advisors scheduling students with any services to provide case management and more holistic support.

Lessons learned: every interaction should improve the student experience; flagging isn’t enough; reduce the number of steps in student processes (remember, the processes themselves did not change).


Digital Redlining

Presenter

  • Chris Gilliard, Professor, Macomb Community College, @hypervisible (a great follow)

Chris started by talking about patents, specifically Facebook’s “Customer Recognition System.” This product supposedly establishes a “trust level” and provides feedback to businesses about whether a customer can reasonably afford a particular product. Chris then provided a narrative about the electronic locking of car doors when threatening-looking black men walk near cars at a red light. Systems and situations like the ones noted above are closely tied to ideas of inclusion and who gets access to what. If systems can’t recognize you (i.e. cases of racial or gender identity misidentification), then systems put that on you.

Another narrative: a facial recognition research project on a university campus took photos of student faces for several months. The images were stored in a database and used for further analysis. This data was then later shared with a company in China for use by their government. Yikes.

A definition of redlining was shared, along with a map of Detroit’s Black neighborhoods in the 1940s. Maps like this tell you a lot about where things are: sewers, dumps, and so on. There’s a site called Mapping Inequality that provides downloadable maps like this. The Birwood Wall (1951) is a six-foot-tall wall in Detroit that delineated Black versus white neighborhoods. The wall doesn’t actually stop anyone, which is kinda the point.

Digital redlining: the practice of creating and perpetuating inequities between racial, cultural, and class groups specifically through the use of digital technologies, digital content, and the internet. … It can also be used to refer to inequities caused by the policies and practices of digital technologies.

Facebook used to provide an “ethnic affinity” category, which was part of the way they identified you (note this is not how you identify yourself). This was used for targeting advertisements. It’s also illegal in some cases (e.g. housing, as described above). FB says they no longer allow this, but research from Northeastern and USC shows that it still occurs because of the very nature of the way the system works.

I became aware of content filtering on my campus by working with my students on a project researching revenge porn (now called “nonconsensual intimate imagery” – exercising the old euphemism treadmill, haha). The filter thought my students were looking for porn, but that isn’t at all what they were looking for. The tool filtered things in a very deterministic way. The more resources a campus has, generally speaking the more access to information they have…ALL information.

Chris then shared a fiction short story which read like a Black Mirror pitch (it was good). “There are many parallels between prison tech and ed tech” (lots of laughter at that). Chris collects examples of absurd or extractive practices in technology. One example was a kid-tracking app that is essentially the same as tech used to track ex-cons; another was Jeff Bezos describing a school where “…the child is the customer.” Another app tracks face metrics while a child is reading on a tablet, and a company called BrainCo monitors kids’ EEG activity and reports to teachers to gauge student attention levels. Then he shared a slide with a recent EDUCAUSE article titled “Analytics Can Save Higher Education. Really.” Chris also shared a few articles about how college admissions offices use data to track and rank student prospects before they apply. Many learning analytics tools take on a paternalistic tone in their operation; having opt-in and opt-out policies is really important to think about.

Algorithms predict, but some tasks are not or should not be focused on prediction…and I don’t think that education is a predictive task.
