
NASPA Needs a Technology Core Data Service, and Why This Matters to You

Who You Gonna Call?

Who do you call when you have a burning question about technology? Chances are good you have a picture of “that one techie” in your mind right now. You know their name, and you probably have their extension memorized. Beyond that, your knowledge of who does what with technology on your campus likely gets hazy. If you’re part of a system of universities, you may rely on “birds of a feather” colleagues at other campuses you meet with on a regular basis. No doubt you have colleagues who use the same software as you to administer departmental programming, who can quote chapter and verse about the hoops you have to jump through to get the data you need, who know how your staff deals with social media, and so on. If you’re lucky, you get to go to conferences and have an informal network of professionals to lean on. Wouldn’t it be nice if there were an unbiased resource you could rely on for benchmarking information about technology-related topics germane to higher education? Something like this actually exists…sort of.

What’s a Core Data Service (CDS), Anyway?

The idea for a multi-organizational technology assessment in higher education is not new or original, nor did it materialize out of thin air. Since 2002, EDUCAUSE – the world’s largest community of IT leaders and professionals in higher education – has conducted an annual assessment of hundreds of campuses. The activities around this assessment culminate in a product they call the Core Data Service, or CDS. What’s in it? Benchmarking data on staffing, financials and a variety of technology services. It’s a fantastic reference for higher education technology professionals, especially leaders who need to know where they stand with respect to their peers. The problem with the EDUCAUSE CDS is that it does not collect data or provide insights that are particularly useful to student affairs professionals.

Why NASPA Needs Its Own Version of a CDS

Members of the Technology Knowledge Community (TKC) recognized the importance of technology to the profession many years ago. They believed it was such an important part of our work that they successfully added it as a NASPA Professional Competency Area in 2010 (https://www.naspa.org/images/uploads/main/ACPA_NASPA_Professional_Competencies_FINAL.pdf). Unlike EDUCAUSE, NASPA has no benchmarking tool focused on technology that we are aware of. We believe that a NASPA CDS would be a valuable resource for any NASPA member who needs to make decisions about the use of technology in their programs. A Core Data Service is a natural extension of the assessment culture that has been built in our profession; we think it should be a core product of the organization.

You might be asking yourself, “why don’t we just ask EDUCAUSE to adapt their instrument so it can collect this data for us?” First, the overlap between NASPA and EDUCAUSE membership is rather small…the connection between the organizations is probably not where it needs to be to make this happen (yet). Second, the vast majority of the technology we use in student services – particularly software-based – is not universally important to everyone in our organizations. Third, technology staffing models vary drastically from campus to campus. Hopefully, EDUCAUSE will continue to evolve and the data needs of student affairs will be more fully included. Until then, however, adapting the concept for our own needs makes a lot of sense.

Enterprise Versus Niche Software

You may have heard the term “enterprise” invoked in hushed tones during campus meetings with IT and wondered what it meant. The way the word is used implies great importance. Generally speaking, “enterprise” refers to a product or service that everyone (or nearly everyone) in an organization depends on to do their job. When enterprise services go down, everyone panics. In the higher education software world, enterprise usually means the SIS (Student Information System), HR/Finance, portals, and email/calendaring tools. Enterprise software is expensive and complex, and requires a significant investment in professional IT resources. For many campuses, the responsibility for managing these systems lies with a centralized IT department. As a general rule, enterprise software feeds, stores, and works on data that is considered to be the “source of truth” for an organization. These are critical systems by definition.

Doesn’t every operational area in student affairs also depend on software? And isn’t that software just as important to what we do? In terms of complexity and usage, some of our systems rival enterprise software. Do you lead a Career Services department? There are software systems for you. How about Student Housing? You have multiple software options to choose from for managing residential life. Health Services? Check. Judicial Affairs/Student Conduct? Check. Clubs & Organizations? Disability Resources? Assessment? Check, check, check. Our software is important to us, but it isn’t universally important to everyone on campus. That’s what makes student services software niche software.

The bottom line here is that you probably want to know which software packages your peers use most often. It’s a reasonable question you’ve probably asked more than once.

Student Services Technology Support Varies Widely

Despite the fact that technology is enshrined as a NASPA professional competency, there’s little consistency around how we fund and staff it. Support models used by campuses to deliver student services technology vary widely (and wildly). Some campuses have a highly centralized IT division that coordinates services for every functional area on campus. Other campuses have multiple, decentralized technology units. Student affairs divisions may have a large or small technology department – or none at all – depending on the services needed. It’s fair to say that there are as many technology delivery models as there are members in the TKC!

We Have an Instrument That Just Might Work

In 2017, David Sweeney of the Texas A&M University system published the results of a system-wide student affairs software survey. This assessment provided TAMU’s Senior Student Affairs Officers with information about “…the distribution of ‘student affairs’ typical software packages and platforms…” and “…contract data with the aim of finding opportunities to share software across multiple units if indicated and desired.”* David’s survey spurred interest among several of us in the TKC in developing a similar but more expansive survey, with the intention of incorporating other pertinent details. After much discussion, we decided to measure the following:

  1. Institution (size, basic demographics)
  2. Student Affairs organization (services offered)
  3. Student Affairs IT (staffing level, type of support)
  4. Applications and Services

As a group, we felt that all four of these components would be useful for SSAOs (Senior Student Affairs Officers). We also felt that they would present a host of emergent benefits, such as improved collaboration between universities, leveraging our combined voices when communicating with vendors, providing hard data for NASPA’s assessment team, and so on. To that end, we developed a Qualtrics survey, currently hosted by the University of Pittsburgh. The survey is accessed by a link on the SAIT Pros web site at www.saitpros.org. SAIT Pros is a free “non-denominational” association for people who do technology work in student affairs. You don’t have to be an IT geek to join, membership is free, and we host a Slack team where people can share what they know about products, services and processes, all without having to worry about vendors listening in. In our first year of running this assessment, we had 27 participating campuses, which indicates to us that our idea has merit. We asked for TKC sponsorship for a session to talk about this project at the national conference in Los Angeles, which the TKC granted. Thank you, TKC!

Our hope is that the TKC and the broader NASPA community also see value in a “NASPA Technology CDS.” Next steps include reaching out to the Assessment, Evaluation and Research Knowledge Community (AERKC) to identify potential improvements for version 2 of the survey and possible areas of collaboration with the TKC.

Paul Schantz is Director of Web & Technology Services for the Division of Student Affairs at California State University, Northridge. He currently serves as the EdTech representative to the TKC (NASPA), chairs the Student Affairs IT Community Group (EDUCAUSE), and is a co-founder of SAIT Pros.

A version of this post was originally published on the NASPA Technology Knowledge Community blog. This project was discussed during a technology session at the 2019 NASPA national conference in Los Angeles.

Telling the Student Affairs Story: Answering Big Questions with Big Data

  • Adam Cebulski, Assistant Vice President and Chief of Staff, Southern Methodist University
  • Sara Ousby, Director, Student Affairs Assessment, University of North Texas

Goals

  • To discuss trends in big data and the implications for higher ed
  • ID strategies for building data warehouses & analyzing data sets
  • Share successes and challenges
  • Storytelling
  • Strategies

Landscape of Big Data

3Vs: variety (lots of kinds of data); volume (more info than we know what to do with); velocity (collecting data at a higher rate than ever before).

There are tons of software packages that “do” big data, but buying software by itself is not going to solve your problem! Big data translates into decision making through different processes, and that’s what we’re going to talk about.

Storytelling

Stories are far better at conveying what your data says than just the data itself. NASPA’s analytics study from 2017 identifies the following entry points for big data for predictive analytics: pre-enrollment > academics > motivation & self-efficacy > use of support services > student engagement

Stories are just data with soul. Stories cross the barriers of time (past, present, and future) and allow us to experience the similarities between ourselves and others, real and imagined.

Create a data story

Data + Narrative + Visuals

Case Study: SMU

We have no centralized data system, and we’re a PeopleSoft campus. We centralized OIT and brought on a new CIO from the University of Illinois. We have a large Greek population, and we experienced a 315% increase in AOD (alcohol and other drug) offenses in one academic year. We introduced a number of programs and interventions to address this challenge.

  • Why the large increase?
  • Who is most at risk?
  • How and when to intervene?
  • Campus partners: IR, OIT
  • Data identification

We’re a Maxient campus, so we did a lot of ETL (extract, transform, and load) work to make this happen from a technical perspective; Maxient offers no APIs.
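
Since the session didn’t walk through the technical details, here is a minimal sketch of what one step of such an ETL pass might look like: loading a manually exported conduct-case file, cleaning a few fields, and staging the result for a warehouse. The file name, column names, and cleaning rules are illustrative assumptions, not SMU’s actual implementation.

```python
# Hypothetical ETL step: load a conduct-case export (e.g., a CSV pulled
# manually from the conduct system, since no API is available), normalize
# a few fields, and stage the result for loading into a warehouse table.
# File name and column names are illustrative assumptions only.
import pandas as pd


def transform_conduct_export(path: str) -> pd.DataFrame:
    cases = pd.read_csv(path, parse_dates=["incident_date"])

    # Normalize categorical fields so they join cleanly to dimension tables.
    cases["charge"] = cases["charge"].str.strip().str.title()
    cases["class_level"] = cases["class_level"].fillna("Unknown")

    # Derive a coarse time bucket (calendar quarter here, standing in for
    # academic term) that downstream reports can group on.
    cases["term_bucket"] = cases["incident_date"].dt.to_period("Q").astype(str)
    return cases


if __name__ == "__main__":
    staged = transform_conduct_export("conduct_export.csv")
    staged.to_csv("staging_conduct_cases.csv", index=False)
```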

We built a BEAM model: Business Event Analysis & Modeling

  • Customer focused
  • Flexible design
  • Report neutral
  • Prevents rework
  • Saves time

Goal was to build a data warehouse to assist with our analysis and reporting. We started in 2017 and plan to launch in the next week with a dashboard as part of phase one. We needed to hire a data architect and data visualizer: these were university hires that “live” in OIT. At $125K each, these are not cheap resources (but they are an excellent investment).

A BEAM table starts with events; we then think about other events (e.g., a sports game, finals) that could be related. On top of that, we consider a range of other items associated with the charge/sanction, e.g., the weather, whether we won the game, and the student’s class level. We even pull in data about the students, such as whether their parents are wealthy donors. This allows us to create a “star schema,” which provides a comprehensive picture of the issue. Some of the criteria allow us to set a ranking for each of the events, which in turn allows us to prioritize items. One of the data points is which offices are responsible for addressing the issues. We started with 100 unique variables and grew to 279 that could be associated with a particular conduct case.
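
To make the star-schema idea concrete, here is a minimal sketch of a central fact table (one row per conduct case) surrounded by dimension tables for the student and the event context. Table names, columns, and the roll-up query are hypothetical illustrations, not SMU’s actual warehouse schema.

```python
# Minimal star-schema sketch: a central fact table (one row per conduct case)
# joined to dimension tables for the student and the campus event context.
# Table and column names are illustrative assumptions, not SMU's schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_student (
    student_key   INTEGER PRIMARY KEY,
    class_level   TEXT,      -- e.g., 'First Year', 'Sophomore'
    greek_member  INTEGER    -- 1 if affiliated with a Greek organization
);

CREATE TABLE dim_event_context (
    context_key   INTEGER PRIMARY KEY,
    related_event TEXT,      -- e.g., 'Football game', 'Finals week'
    weather       TEXT,
    home_team_won INTEGER
);

-- The fact table references each dimension, so a dashboard can slice
-- case counts by any combination of contextual variables.
CREATE TABLE fact_conduct_case (
    case_key     INTEGER PRIMARY KEY,
    student_key  INTEGER REFERENCES dim_student(student_key),
    context_key  INTEGER REFERENCES dim_event_context(context_key),
    charge       TEXT,
    sanction     TEXT,
    case_date    TEXT
);
""")

# Example roll-up: case counts by class level and related campus event.
query = """
SELECT s.class_level, c.related_event, COUNT(*) AS cases
FROM fact_conduct_case f
JOIN dim_student s       ON f.student_key = s.student_key
JOIN dim_event_context c ON f.context_key = c.context_key
GROUP BY s.class_level, c.related_event;
"""
print(conn.execute(query).fetchall())
```

A real warehouse would carry far more dimensions (the presenters mentioned 279 variables), but the shape is the same: one wide fact table in the middle, with each contextual attribute living in its own dimension.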

These variables allow us to build dashboards that rationalize the data for our staff (intervention or otherwise). The vast majority of people in the system were actually recruits. It’s mostly 1st and 2nd years that get caught up in our system. We were able to change policy immediately based on the insights our system provides.

Case Study: University of North Texas

We have 38,000 students in the DFW metroplex. We are a minority-majority, public Tier One institution with a first-year residential requirement. The majority of our students live in Denton County.

Our Questions

  • What are the differences in retention for students who are engaged on campus?
  • What are the differences in GPA for students who are engaged on campus?
  • Campus partners: Data analytics & IR, IT shared services
  • Data Collection

We are going to pull card swipe data into our system soon! We’re going through the data dictionary of card swipes now, primarily using Excel and lots of pivot tables. We’re looking right now at correlation information with respect to retention.

We’ve had a lot of growth in card swipe usage. We have 220,000 card swipes into our student recreation center, and we plan to pull in the Career Center’s info next. There does appear to be a difference in retention between card swipers and non-swipers (81.18% vs. 64.02%).
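
As a rough illustration of the comparison described above (not UNT’s actual analysis), once each student record carries a swipe flag and a retention flag, the group comparison is a simple aggregation; the column names and sample data below are assumptions for illustration only.

```python
# Hypothetical sketch of the retention comparison: given one row per student
# with a 'swiped_rec_center' flag and a 'retained' flag, compute the retention
# rate for swipers versus non-swipers. Column names and data are made up.
import pandas as pd

students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "swiped_rec_center": [True, True, True, False, False, True],
    "retained": [True, True, False, False, True, True],
})

retention_by_group = (
    students.groupby("swiped_rec_center")["retained"]
    .mean()          # share of each group that was retained
    .mul(100)
    .round(2)
)
print(retention_by_group)  # retention rate (%) for swipers vs. non-swipers
```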

Telling our story and making decisions

  • Focus on completion
  • 1st year students are those leaving at the greatest rates
  • Most impact on FTIC (first-time in college) students
  • Higher impact on men

Q: Are you planning an ROI analysis?

AC: We quantified every action with a dollar value. Our interventions have already saved over a million dollars so far. We swipe for EVERYTHING (we use CampusLabs).

Q: What does your data cleaning process look like?

AC: It’s awful! And it’s ongoing. We’ve had to create many transformation tables, and we had a lot of siloed data that needed work.

SA: Your data dictionary will go a long way toward solving this challenge.

Q: are card swipes weighted equally?

SA: yes (for now). But we’re looking at this. Card swiping is now universal across the campus.

AC: We tie in our NSSE data and use ID Link to tie our data together.

Collegiate Esports: The Biggest World You’ve Never Heard Of

Presenters

  • Eugene Frier, Texas Wesleyan University
  • Kathy Chiang, Arena Coordinator, University of California, Irvine

Esports have grown WAY beyond “kids playing video games in the basement while drinking Mountain Dew and eating Cheetos.” The events are getting bigger and bigger; the industry is becoming more professional; and audience and revenue are growing fast. Brands are involved in a big way: sports teams, broadcast media, mainstream brands, and esports organizations all field competitive esports teams. What’s still “kind of weird” to some of us is normal for our students.

Timeline: student organizations > independent leagues > developer leagues > varsity programs

Origins – Community

  • Student clubs: independent, grassroots; unofficial support from developers
  • Collegiate Starleague (CSL): founded in 2009; modeled after South Korea’s StarCraft ProLeague

Origins – Developers

  • Tespa: founded in 2012, acquired by Blizzard in 2013; Heroes of the Dorm in 2015
  • College League of Legends: NACC in 2015, uLOL in 2016, College League of Legends in 2017.

Origins – Colleges & Universities

  • Creation of varsity programs: started in 2014; three active in 2015, six active in 2016, 27 active in 2017

Current-Day Collegiate Esports

It’s a Multi-Layered Ecosystem

  • Student orgs: focused on member development; some of the biggest and most well-funded orgs on campus
  • Varsity programs: 100+ programs in 2018, often built like athletic teams. Scholarships range from a couple thousand per year to full-ride.
  • Twitch Student: huge advocate for student voices; pathways to create and monetize streams.
  • Collegiate leagues: 3rd party and developer/publisher leagues; advocates for students, varsity programs; competitive league + support infrastructure.
  • Scholarships and prize winnings.

Student-led vs. Varsity

UCI case study (2011-2019): LoL pushed the boundaries of what we could do at a university. Top two accomplishments: a Worlds viewing party in 2013 (a highly social event – over 800 attendees at our first event, then 1,800 at our next one!), and expanding into an umbrella organization that ended up being the largest group on campus. We currently give scholarships for League of Legends and Overwatch.

Texas Wesleyan University case study (2017-2019): What can we do to stay relevant and competitive and engage with students? I spoke with over a dozen universities and companies to figure out what we could do. After speaking with Athletics, we realized it would fit better into Student Affairs programming. We based it on the three pillars of Competition, Creation, and Community. What made it land with senior administration was how this program connects with “the murky middle,” or students who don’t fit into traditional modes of student engagement. This program can be the “hook” for students who aren’t connecting in other areas. Most of our students do a lot more than gaming!

Benefits to Current Generation

  • Old expectations: local, segmented (age, gender, region), static, trusting in authority.
  • New expectations: global, segmented by ability, dynamic, democratic.

More Opportunities

  • Involvement beyond competition: production, content creation, shoutcasting, management, digital marketing, event planning
  • Soft skill development
  • Meaningful involvement for students who don’t always see their interests represented on campus

Conclusion

  • College esports are here
  • Meet expectations of a digital age
  • Opportunity to engage with students
  • Growth

Applying the Technology Competency on Your Campus

Presenters

Resources

  • There’s a Google Drive link coming that contains all the information
  • #ApplyTechComp

Pretty good turnout for this session, considering it’s at 8:30 and down in the convention center’s basement 🙂 Got an opportunity to finally meet Lisa Endersby in person and catch up with some #SATech friends. Let’s see what Jeremiah has in store for us…

Lisa introduced Jeremiah and made a few shameless plugs for other sessions at the conference.

Agenda

  • Competency Background
  • Michigan Tech Background
  • Our Process
  • You and Your Campus

Competency Background

  • Provides a game plan and establishes what we should be doing
  • Tech was incorporated into many different areas in bits and pieces, and talk about a standalone technology competency began in earnest in 2010
  • Special thanks to: Matthew Brinton, Joe Sabado, Josie Ahlquist, Lisa Endersby
  • Established rubric in October 2016! This is a tool that will help members of the student affairs profession to utilize and engage with the competency areas on their campuses.

Michigan Tech Background

  • 7,000 students, founded 1885
  • Our Student Affairs division contains advancement, which is a bit unique.
  • January 2012: a charge from Dr. Les Cook to form a committee to address the 2035 vision of “High Tech, High Touch.” The central idea behind the group was to consider how we embrace and push the technology agenda.
  • Technology Advance Committee: multi-member group from all areas of SA and Advancement; research & present seminars/trends and work with professional development committee and leadership team to provide recommendations.
  • Challenge: small surveys work great, redundancy of seminars, needed a plan

Our Process

  • Large doc; how to apply, how to inform, how to standardize?
  • Break down the competency
  • Assess the areas: technical hardware/software; professional development (networking); technology like social media (SoMe) and collaboration tools.
  • We let our IT division know we were planning to do this assessment. Bring them into the conversation!
  • Use your professional networks!
  • Every department has its own SoMe accounts; we needed to figure out what was going on and who was in charge of things. Transition was a concern.
  • How to evaluate? Create a baseline evaluation and rubric survey for all staff members. NASPA HAS DONE THIS FOR YOU!
  • Our survey: questions a user can self-rate; comfort levels; open questions; 50 questions in total including department identification.
  • Our VP helped to hype the survey, including how we planned to use the information to inform increased resources/training.
  • CampusLabs is the backbone of our survey.
  • 39.75% response rate; largely mid-ranged responses; additional areas of professional development needed
  • Wanted to figure out where our people were uncomfortable. It turned out that a lot of our people didn’t know where to turn for help.
  • You can use our assessment for your own campuses, and we encourage you to use it!
  • Next Steps: present findings to SA and Advancement directors; meet with professional dev committee for recommendations; assist in professional development; reassess one year from initial survey.
  • We’re right in the middle of this process…we hope to see improvement next year!

You & Your Campus

  • This is very accessible, and we think the model is useful for a campus of any size
  • Join TKC
  • Self-assessment: Figuring out what you’re comfortable with is important
  • Training resources: YouTube, knowledge base, ticket database
  • Reach out and ask! People out there have had the exact same problem as you in the past.
  • What to do at the campus level? Join the TKC; create a committee (it does not have to meet on a regular basis); talk to others; use the rubric/create an assessment; gather training resources; reach out and ask.
  • To get people to complete your assessment, tell them what you’re doing and what they’re going to get out of it.
  • The main thing is to TRY SOMETHING! Now is the time to jump on this!

Questions

  • How were the survey results shared with your IT division? How were they received, and did they result in changes in service/collaboration between divisions? Our IT department gets 250 tickets a day; they’ve been able to use our assessment to help streamline some processes and develop training materials to improve services.
  • Did you have others in your division who were interested in participating in the competency area? Yes, but we were able to use this assessment and model as a starting point.

Student Mobile Takeover: Announcing the Winners of the Great Mobile Appathon

Presenters

  • Mark Albert, Director, University Web & Identity Services, The George Washington University
  • Andrew Yu, Founder and CTO, Modo Labs, Inc.
  • Matthew Willmore, mobileND Program Manager, University of Notre Dame

The goal was to get the tools for managing web apps into the hands of non-technical people at universities, so that they could make amazing apps themselves.

Schools participating in this event iteration included:

  • George Washington
  • Harvard
  • Florida State University
  • Notre Dame
  • Arizona State University

FSU

  • 14 teams, 56 students competing in total
  • Students and university benefited from this competition
  • We like the fact that through this competition, we can see exactly what students want
  • Students enjoyed the experience
  • “NutitioNOLE” was the winner at FSU
  • Eat, move, learn

George Washington

  • Great way to raise awareness of the platform
  • Better understand how students wish to use their mobile devices
  • Better understand the gap between the app and student needs
  • To get the word out, we did posters, postcards, email blast, reminders to students in class
  • 80+ students; 12 teams competed
  • Outstanding ideas from our students
  • Modo’s support was great
  • 2nd place: parking app
  • 1st place: GWorld – campus ID card: dining/retail, printing, load $$, places to study

ASU

  • Fun and competitive environment to find out what our students want
  • Marketed via web site, My ASU banner ads, email
  • 10 teams, great wide-ranging ideas
  • Each of our judges picked a different winner
  • 2nd place: travel on campus
  • 1st place: ASUFit – targets fitness culture and social engagement

Harvard

  • Driven by student interest; strong culture of hackathons; event that allowed non-programmers to participate
  • Marketed via Student IT interest groups, student houses, SoMe, school CIOs
  • Intense, collaborative, inspiring
  • 2nd place: a dining app that includes nutritional information so students can choose the right foods
  • 1st place: Bliss, a resource for maintaining mental health

Notre Dame

  • Always seeking opportunities to engage students in real-world development and design
  • Equal interest in students with and without technical chops
  • First opportunity for us to see how well students could use Publisher
  • Proved to us that we can use students more to manage our mobile app material
  • Marketed via: campus flyers, table tents, email, banner and home screen icon, co-promotion with other like events
  • 7 teams
  • 2nd place: Rate My Plate – allows students to provide feedback about dining services.
  • 1st place: Mary’s View – highly visual way to find events of interest around campus; incorporates maps so students can find events near their location.

Judges & Judging Criteria

  • Chris Barrows, NYU
  • Jenny Gluck, Syracuse
  • Julia Zaga, Uber
  • Santhana Naidu, Indiana State
  • Sarah Hoch, GE Power
  • Eric Kim, Modo Labs
  • Judging Criteria: address challenge of improving campus life; creativity and innovation; design/user experience; completeness

Harvard’s “Bliss” App is the winner!