
A Researcher, an Advisor, and a Marketer Walk Into a Predictive Analytics Tool

Presenters

  • Philip Needles, VP of Student Services, Montgomery County Community College
  • Celeste Schwartz, VP for Information Technology and Chief Digital Officer, Montgomery County Community College
  • Stefanie Crouse, Academic Advisor/Assistant Professor, Montgomery County Community College
  • Angela Polec, Executive Director of Marketing & Communications, Montgomery County Community College
  • David Kowalski, Executive Director of Institutional Research, Montgomery County Community College

FUN FACT: this session was interrupted by an evacuation signal, which effectively demonstrated the ability of technology to move lots of people back and forth 🙂

We learned a lot along the way with predictive analytics; we’re here to share the good and the bad we’ve experienced.

DK: Once we got the tool, we were like: what do we do with it now?

CS: the decision to purchase was made at a conference between the campus president, myself, and a vendor. It was a “risk investment,” and we weren’t sure if it would be successful. It was less about the tool and more about the people. The initial expectation was very exploratory…we believed that PA would help the institution, but we weren’t exactly sure to what extent. We wanted to be able to take (short-term) actions based on what the tool would tell us, and it has done that for us. It took us 18–24 months to get the data mashed up and the data models loaded into the vendor’s system. We were the first campus to use some of the features of the vendor’s product, which was an interesting experience. We needed our IR executive director to play a critical communications role that would allow our campus to make the leap to implement actions based on our findings. Being inclusive of the various areas of the campus was an objective.

Step 1: Be prepared to invest time & resources without an immediate return. MC3’s time to initial return: about two years.

Step 2: Get the right people at the table. MC3’s team: about 12 people.

DK: having the latitude to get the right people at the table from the colleges was hugely beneficial. Data the tool provides is around associations and trends. We used an iterative process that helped us make progress as we worked with different campus constituents.

Question: how much time did each team member contribute to the team? DK: it varies depending on the area of inquiry. SC: we have a very active agenda, and everyone at the table is empowered to take action. The more people who have access to the raw data, the better…it helped us work more effectively.

Step 3: Identify Your Research Questions

AP: we had to gain a level of familiarity with the tool to interpret what it was telling us. PA tools can be giant rabbit holes that take you into some scary places. We pulled back a bit to think about hypotheses, and then go into the tool with that intent in a more thoughtful way.

SC: we have different sessions: a 15-week session and two 7-week sessions. The second 7-week session had way more drop-outs; some students were using this session to “beef up their units.” What was happening was that students were simply forgetting to go to their new class! We implemented a text message communication plan to remind them about their class…some of the responses were hysterical: “I TOTALLY FORGOT ABOUT THIS CLASS, THANK YOU!!”
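The mechanics weren’t covered in the session, but a reminder campaign like this can be quite small. Here is a minimal sketch in Python, assuming an enrollment export from the SIS and a send_sms helper for whatever SMS gateway the college actually uses (both hypothetical):

```python
# Hypothetical sketch of a "don't forget your second 7-week class" reminder.
# The enrollment records and send_sms() stand in for the real SIS export and
# SMS gateway, which the session did not describe.
from datetime import date

def send_class_reminders(enrollments, today, send_sms):
    """Text every student whose second 7-week class starts within 3 days."""
    sent = 0
    for e in enrollments:  # e.g., dicts from an SIS export
        days_until_start = (e["start_date"] - today).days
        if 0 <= days_until_start <= 3:
            send_sms(
                e["phone"],
                f"Reminder: your class {e['course']} starts {e['start_date']:%b %d}. "
                "See you there!",
            )
            sent += 1
    return sent
```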

Step 4: Empower the Team to Take Action

In 2016-2017, about 10 interventions were enacted. If your institution is committed to making data-driven decisions, this just helps to drive further assessment and action. For us, it just “fits in” to the way we do our work.

Step 5: Evaluate Your Impact

DK: we do a few things to evaluate impact (the tool has propensity score matching). Some were more successful than others. For example, students who participate more in class tend to persist better; supplemental instruction courses also help a lot.
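The session didn’t go into the tool’s internals, but as a rough illustration of how propensity score matching can be used to compare participating and non-participating students, here is a minimal sketch in Python (column names such as attended_si and persisted are hypothetical):

```python
# Minimal propensity-score-matching sketch with hypothetical column names.
# Each "treated" student (e.g., one who attended supplemental instruction)
# is matched to the untreated student with the closest propensity score,
# then persistence rates are compared between the matched groups.
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def matched_persistence_gap(df, covariates=("gpa", "credits_attempted", "age"),
                            treatment="attended_si", outcome="persisted"):
    # df is a pandas DataFrame with one row per student.
    X = df[list(covariates)].to_numpy()
    # 1. Estimate each student's propensity score: P(treatment | covariates).
    scores = LogisticRegression(max_iter=1000).fit(X, df[treatment]).predict_proba(X)[:, 1]
    df = df.assign(pscore=scores)

    treated = df[df[treatment] == 1]
    control = df[df[treatment] == 0]

    # 2. Match each treated student to the nearest control on the score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_control = control.iloc[idx.ravel()]

    # 3. Difference in persistence between treated and matched controls.
    return treated[outcome].mean() - matched_control[outcome].mean()
```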

AP: we always want to “move the needle,” but we also want to influence student behavior. For example, we want them to register early.

Step 6: Review, Revise, Repeat

AP: we’re constantly looking to see how we can improve, bringing new faculty into the fold. How do we put what we’ve learned into the hands of students to help influence positive behaviors? The data has to get out of IR and into the hands of advisors, communications folk, and other appropriate folk. What can we do to assist “at risk” students BEFORE we get their GPA?


Digital Literacy: The State of the Art

Presenter:

  • Bryan Alexander, President, Bryan Alexander Consulting, LLC

Resources

Bryan began by talking about the NMC’s Horizon Project Strategic brief…

In 2016, we surveyed the NMC membership for quantitative and qualitative responses.

Definitions of literacy based on A/B testing:

  • Cultural
  • Cognitive
  • Constructive
  • Communicative
  • Civic
  • Critical
  • Creative
  • Confident

The 7 Elements of Digital Literacies

  • Media literacy
  • Information literacy
  • Digital Scholarship
  • Learning Skills
  • ICT literacy
  • Career & Identity management
  • Communications and Collaboration

Mozilla Web Literacy

  • Problem-Solving
  • Communication
  • Creativity
  • Collaboration

UNESCO Laws of Media & Information Literacy

Much longer, more political definitions that I can’t quite capture here, as they’re so long. They highlight the difference between production and creation.

Common Elements

  1. Media literacy > Info literacy > digital literacy
  2. Technical, social and personal capacities
  3. Learners as social, participatory makers

What skill is most important to you? CRITICAL THINKING! Creativity and Problem-solving are right up there, too. Very few digital literacy programs actually exist. Typically, responsibility for this is embodied in a single individual on campus. Conclusion: a university-wide approach is needed.

Literacy Types We Derived from Survey

  1. Universal literacy: basic familiarity with using basic digital tools
  2. Creative literacy: universal literacy, infused with more challenging technical skills (coding, animation, etc.), along with digital citizenship and copyright knowledge.
  3. Literacy across disciplines: different disciplines use different types of tech that are specific to those disciplines.

What Else Should We Add?

  • Infographics
  • Data literacy
  • Include information and media literacy in universal literacy
  • Cyberbullying
  • Privacy
  • Awareness of data used by marketers

Key Findings: Conclusions

  • Institutions should be the ones providing this kind of information; tech runs at a rapid pace.
  • Implementations should be strategic, i.e. include the whole campus.
  • Students are much more present as makers – make this a part of the pedagogy.
  • Industry partners should be leveraged to prepare students for life outside of school
  • Collaboration across domains is a must
  • Critical social-political engagement
  • Curricula for different areas require different frameworks for implementation of technical literacy; there are also striking differences between regions.
    • Europe: capacities; national & continental frameworks
    • Africa: job skills, economic growth
    • Middle East: media literacy
    • US: decentralized learning

Where Do We Go From Here?

  • Dealing with fear of fake news
  • Social crisis mode
  • We need to take automation seriously
  • Don’t forget the old stuff!
  • Models of trust

Responsible Use of Student Data in the Digital Era

Presenters

  • Martin Kurzweil, Director, Educational Transformation Program, ITHAKA
  • Mitchell Stevens, Associate Professor, Stanford University

Mitchell gave an introduction describing who he is and what he does at Stanford (MOOCs were mentioned…), and reviewed the agenda:

  1. Why this session?
  2. Past and possible futures of student records
  3. Applying and evolving principles of responsible use

Mitchell also shared the Draft Principles for Responsible Use of Student Data, a hard copy of which was provided to all session attendees.  Attendee introductions followed…

Anticipated Takeaways

  • Overview of the current landscape of data-ethics discussions in postsecondary education
  • Recognition of how these discussions are “living” questions on our campuses
  • Tractable principles and policies of potential use in your own institution / organization

MOOCs offered via Stanford’s Coursera product had an opportunity to surface the question: “What is the course taker’s status?” Are they a) a student, b) a customer, or c) a human subject? Depending on the course taker’s status, the laws governing data usage are vastly different. The Asilomar Convention for Learning Research in Higher Education discussed language describing human beings from an ethical perspective. This convention came up with the concept of a “learner,” which is distinct from a “student.” The contractual language surrounding these terms is vastly different.

ru.stanford.edu

Fact 1: There Now is No Default Platform

The classroom:

  • The classroom is a physically and temporally bounded location
  • Exists in nested jurisdictions (college, district/system, US state, nation) with relations negotiated over generations
  • Implies special sovereignty over content and evaluation for instructors

The web:

  • Does not entail physical or temporal boundaries
  • Commingles multiple jurisdictions whose relations are now being negotiated
  • Implies no particular sovereignty over content and evaluation

Fact 2: The Academic Record is Being Remade

Yesterday:

  • Each person had one official college record
  • Schools held records exclusively, in trust, in perpetuity
  • Available data for compiling records were thin, controlled by instructor-sovereigns and their designates, and difficult to integrate with other data

Today:

  • There is a rapid proliferation of academic providers and mechanisms for recording accomplishment
  • Schools have lost their cartel on records generally but retain their fiduciary obligations over their own students’ records
  • Available data for compiling records are rich, varied, jointly held, and easy to integrate with other data

Some points we’re going to cover

  • Institutional Practices to Improve Student Learning & Support
  • Data that are granular, collected in larger sets, are longitudinal, or are linked across systems
  • Application for educational improvement
  • Research to build basic knowledge
  • Representation of learning and achievement

Applications

  • Enrollment management
  • Institutional programs and policies
  • Early alert
  • Adaptive courseware

Great Diversity in Data Use

  • 2016 KPMG survey: 41% of respondents use student data for predictive analytics; 29% have internal capacity to analyze own data
  • 2016 Campus Computing Survey: <20% rated their institutions’ data-analytics investments as “very effective”
  • Ithaka S+R Faculty Surveys: a minority are using any form of technology in instruction, although 63% want to

Concerns

  • Privacy
  • Consent
  • Algorithmic bias
  • Opacity
  • Self-fulfilling prophecies
  • Institutional interest != student interest

Five Questions

  1. What data goes in the record, what does not, and who decides?
  2. Do educators/researchers have a responsibility to use student data in some ways?
  3. Do educators/researchers have a responsibility to not use data in some ways?
  4. Whose data (and records) are they?
  5. Do we have adequate language for talking about these things?

Principles of Responsible Use

  • Shared understanding: Data themselves are “joint ventures” between students, faculty, campus systems (LMS, SIS, etc.).
  • Transparency: credits are evaluative in nature; students can rightfully expect to understand how data about them is generated (and have that explained to them)
  • Informed improvement: institutions have an obligation to constantly improve themselves based on the data they collect and use.
  • Open futures: education should create opportunity, not foreclose on it. Data used for predictive purposes should be used to expand student opportunities.

We then read the DRAFT Principles for Responsible Use of Student Data document and reviewed it at each table. A few of the things we discussed as a group:

  • Contractual language and understanding with 3rd party vendors who aren’t researchers
  • “Ownership” of data
  • Data has a life cycle
  • Principles of data use are not the same as data privacy
  • Needs to be a theory-driven, principled reason for collection of every piece of data
  • Who gets access to this student data?
  • Shared responsibility between students and administration (higher ed is held to a higher standard than other organizations; why can’t we have a EULA-like standard?)
  • Reasonable security standard

The session then reviewed three scenarios and discussed as a group.

Governing Responsible Use

  1. Who should be involved in interpreting and adjudicating principles of responsible use? Who should NOT participate in the process?
  2. What challenges do you anticipate to implementing principles of responsible use?
  3. What kind of cross-institutional coordination, support, or resources would be valuable?

Student Privacy Boot Camp

Presenters

  • Michael Hawes, Director of Student Privacy Policy, US Department of Education
  • Amelia Vance, Education Policy Counsel, Future of Privacy Forum
  • Rachel Rudnick, Privacy Officer / Assistant Director, University of Connecticut

Resources

What is your top privacy concern?

Attendees have many reasons for being here (several centered on GDPR, the European Union’s privacy law, which international students will care about). I’m specifically here to learn more about the use of student data within web applications. For example, how do we let students know how we’re using their data, beyond ToS (Terms of Service) or EULA (End User License Agreement)?

Types of Risk

Keep in mind the “front page of the newspaper” kinds of risks, because that’s a significant driver on the perception side of things.

  1. An actual security or privacy risk
  2. Risk of not being in compliance
  3. Perception Risk

Michael Hawes’ Segment of the Session

By the end of this session, you’ll know a lot more about PTAC – Privacy Technical Assistance Center. This provides loads of guidance and tools you can use in your work.

ED’s role in protecting student privacy

  • We administer & enforce federal laws governing the privacy of student information (FERPA)
  • Raise awareness of privacy challenges
  • Provide tech assistance to schools, districts, states, colleges and universities
  • Promote privacy and security best practices

What is Privacy?

Privacy and security are related, but not the same thing.

Privacy: “the state of being free from intrusion or disturbance in one’s private life or affairs.” Components include:

  • Info
  • Bodily
  • Territorial
  • Communications

Privacy Principles (from NIST):

  • Authority and purpose
  • Accountability
  • Data Quality and Integrity
  • Data Minimization and Retention
  • Individual Participation and Redress
  • Security
  • Transparency
  • Use Limitation

IT Security is focused on:

  • Confidentiality
  • Integrity
  • Availability

Privacy and Security overlap at Confidentiality & Integrity, plus Accountability, Audit and Risk Management

FERPA 101

  • 43 years old, passed in 1974
  • Applies to all institutions receiving federal funds under any program administered by the Secretary of Education
  • Gives eligible students the right to access and seek to amend their education records
  • Protects personally identifiable information (PII) from education records from unauthorized disclosure
  • Requires written consent before sharing PII – unless an exception applies

FERPA definitions

PII is info that, alone or in combination, is linked or linkable to a specific student and that would allow a reasonable person in the school community, who does not have personal knowledge of the relevant circumstances, to identify the student with reasonable certainty.

Education records are any records directly related to the student that are maintained by, or on behalf of, an educational agency or institution.

The Netflix Prize from a few years ago is a good case in point (a contest to improve their movie recommendation algorithm). Researchers were able to re-identify the de-identified data based on movie preferences! Favorite movies became highly identifying information.
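As a toy illustration of that linkage risk, the sketch below (Python, with made-up data) matches an “anonymous” ratings table back to named users simply by looking for heavy overlap in the movies they rated, which is essentially the kind of attack the researchers carried out:

```python
# Toy re-identification sketch: link "anonymous" records back to names using
# only quasi-identifiers (here, a handful of movie titles). Data are made up.
anonymous = {
    "user_0042": {"Amelie", "Brazil", "Primer"},
    "user_0043": {"Titanic", "Avatar"},
}
public_reviews = {  # e.g., scraped from a public review site with real names
    "Jane Doe": {"Amelie", "Brazil", "Primer", "Memento"},
    "John Roe": {"Avatar", "Frozen"},
}

def reidentify(anon, public, min_overlap=3):
    """Flag anonymous users whose movie set overlaps heavily with a named profile."""
    matches = {}
    for anon_id, movies in anon.items():
        for name, public_movies in public.items():
            if len(movies & public_movies) >= min_overlap:
                matches[anon_id] = name
    return matches

print(reidentify(anonymous, public_reviews))  # -> {'user_0042': 'Jane Doe'}
```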

  • Directory information exception
  • Students don’t attend school anonymously
  • Allows schools to release certain information without consent. A few examples:
    • name, address, telephone, electronic mail address
    • date and place of birth
    • photographs
    • weight & height of athletes
  • Schools/Districts must designate data elements they consider to be directory information. Common uses: yearbooks, concert programs, telephone directories.
  • Students have a right to opt out of disclosures under the directory information exception (see the filtering sketch after this list).
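As a hedged illustration of what honoring that opt-out might look like in a campus system, here is a minimal sketch in Python; the field names and the opted_out flag are hypothetical:

```python
# Hypothetical sketch: build a directory-information export (e.g., for a
# concert program) while honoring each student's FERPA opt-out flag.
DIRECTORY_FIELDS = ["name", "email", "photo_url"]  # elements designated by the school

def directory_export(students):
    """Return only designated directory fields, excluding opted-out students."""
    return [
        {field: s[field] for field in DIRECTORY_FIELDS}
        for s in students
        if not s.get("opted_out", False)
    ]

students = [
    {"name": "A. Lee", "email": "alee@example.edu", "photo_url": "a_lee.jpg", "opted_out": False},
    {"name": "B. Kim", "email": "bkim@example.edu", "photo_url": "b_kim.jpg", "opted_out": True},
]
print(directory_export(students))  # only A. Lee appears
```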

School Official Exception: schools or LEAs can use the school official exception to disclose education records without consent to a third party if the 3rd party:

  • performs a service / function the school would otherwise do itself
  • is under the direct control of the school / district
  • uses education data in a manner consistent with the purpose of the disclosure

Health or Safety Emergencies Exception

  • Disclosure necessary to protect health & safety of the student or others
  • Articulable threat to health or safety
  • Typically law enforcement

Parents of Dependent Students

  • A school may choose to disclose, without the student’s consent, a student’s ed record to that student’s parent if the student is a dependent for IRS tax purposes.

Judicial Orders & Subpoenas Exception

  • School may disclose PII from ed records necessary to comply with a judicial order or lawfully issued subpoena
  • Reasonable effort to notify eligible student of the order before complying with it
  • Some judicial orders and subpoenas are exempt from FERPA’s notification requirement

Financial Aid Exception

  • Ed records may be disclosed in connection with financial aid

Studies Exception

  • Permits disclosure of PII for studies conducted for or on behalf of the school for developing, validating, or administering predictive tests
  • Administering student aid programs
  • Improving instruction
  • Must specify purpose, scope, duration

Attendee question: what counts as consent?

  • Must be written (electronic must be authenticated).
  • Has to specify PII that will be disclosed
  • Has to specify category of people it’s going to
  • Has to specify purpose
  • Has to be voluntary (for example, it cannot be waived in a “blanket ToS” at the beginning of the term); see the sketch after this list
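As a rough way to see how those requirements might be operationalized in a system, here is a minimal sketch (Python; the ConsentRecord fields are hypothetical) that treats a consent record as valid only if it is written, names the PII and the recipients, states the purpose, and was given voluntarily:

```python
# Hypothetical sketch: check that a stored consent record meets the FERPA
# consent requirements listed above before any disclosure is made.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ConsentRecord:
    signed_in_writing: bool                 # written (or authenticated electronic) signature
    pii_disclosed: List[str] = field(default_factory=list)  # which PII elements
    recipient_category: str = ""            # who the data is going to
    purpose: str = ""                       # why it is being disclosed
    voluntary: bool = False                 # not buried in a blanket ToS

def is_valid_consent(c: ConsentRecord) -> bool:
    return (c.signed_in_writing
            and bool(c.pii_disclosed)
            and bool(c.recipient_category)
            and bool(c.purpose)
            and c.voluntary)
```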

Data Governance, Online Services, and Predictive Analytics

  • Increase in data silos at IHEs and the importance of Data Governance
  • Guidance on Protecting Student Privacy while Using Online Educational Services (2014) and Model Terms of Service (2015)
  • Be mindful of privacy and ethics when using predictive analytics in higher education

HIPAA

  • If an institution keeps student medical records, HIPAA (generally, but not always) applies, not FERPA
  • Student and treatment records can be very complex! Engage counsel when working with this data

As recipients of federal student aid, universities are financial institutions under the Gramm-Leach-Bliley Act.

Audience question: is there a NIST standard for transmitting FERPA data? Yes! When in doubt, ask the school about their requirements for PII.

CASE STUDY 1: DATA BREACH

Knowing how to respond when you’ve had a data breach can be really helpful. Think about each of the roles needed in your org. The full extent or impact of a data breach is rarely known up front. Don’t get ahead of yourself.

We broke up into groups and discussed the following:

  • Public & Internal communications/Messaging
  • Response Plan

Things to consider:

  • How can you prevent this in the future?
  • Policies & Procedures
  • Central # to call should they have questions
  • FERPA training implemented in any way? Whoever would respond to such breaches should definitely be trained.
  • Have reporting obligations changed?

Federal Laws and Actions

  • FERPA rewrite
    • Potential rollback of 2008/2011 updates
  • Several student data privacy bills were introduced in Congress in 2015, and a FERPA rewrite may pass in 2018; one bill has been re-introduced in 2017 so far.
  • 40 states have passed 126 laws since 2013
  • Over last 5 years, states have enacted over 100 laws governing how schools and their service providers collect, use, and protect student data

Unintended Consequences

  • Words matter: definitions and vague language; governance needed
  • Fear-based policies
  • Privacy problems with privacy legislation
  • Need for input
  • Penalties

Big case of unintended consequences: LifeTouch (a billion-dollar photo vendor) is impacted and engaged politically because photos can be classified as PII. What do they sell? Yearbooks.

Interesting Trends

  • Governance
  • Transparency
  • Contracts
  • Opt-in or Out Requirements
  • Device and social media privacy
  • Audits
  • Training
  • Penalties (financial & criminal)

State Laws

  • Of 106 state laws passed on student privacy since 2013, only 26 are applicable to higher education.
  • Most laws discussing higher ed either do not differentiate between private and public institutions, or only apply to state schools.
  • Reflects a perceived inability by state legislators to govern private institutions of higher education.

Lack of laws

  • 75% of data breaches occur in higher ed, so it’s surprising that there aren’t MORE laws governing data breaches in higher ed.
  • In total, 19 states since 2014 have passed laws that included at least one provision targeted at researchers. Most of these are governance-focused, but some are far more restrictive.

What is Driving These Laws?

Typical comments that encapsulate what’s driving these laws:

  • “What is ed research, and why do I care about it?”
  • “Researchers are able to get access to student data and use it for whatever they want”
  • “Parents should always be allowed to opt their child out of research that will not directly improve their child’s ed or help their child in some direct way”
  • Beyond IRBs

Rachel Rudnick, University of Connecticut Privacy Officer

I think of my role as mostly a compliance function. How many campuses have a privacy office and officer? It differs from campus to campus; there’s no one way to manage it.

Do you have a designated Privacy Officer?

  • What is a privacy officer?
  • Privacy vs. Information Security
  • Privacy Office
  • Centralized function vs. embedded?
  • Just part of someone’s job?

Where Should Privacy Report?

  • Compliance (good place to start, should have buy-in of C-suite)
  • Legal
  • IT
  • Audit
  • Provost
  • Registrar
  • President/Board
  • Nowhere? Everywhere?

Models to Consider

  • Compliance/regulatory function vs. Program
  • Centralized vs. distributed (embedded)
  • Big picture comprehensive program vs. regulation-by-regulation
  • Reactive vs. Proactive approach

What is Privacy?

This is a gross oversimplification, but this helps folk understand privacy a little better, especially when they need to call someone for help:

  • Privacy is the WHAT
  • Security is the HOW

What does a Privacy Officer Do?

  • Does not mean I have a “Do Not Disturb” sign on my door!
  • Knowledge of ever-evolving rules
  • Oversee program
  • Serve as privacy resource/Subject Matter Expert
  • Write and possibly enforce policies
  • Review/draft contract language
  • Assist/provide guidance to faculty, staff, students, constituents
  • Investigate concerns/complaints
  • Educate/conduct training
  • Breach mgmt
  • Internal/external communication
  • Create and maintain relationships/partnerships
  • Work hand-in-hand with the ISO
  • Be a team player > committees, committees, committees…

To manage privacy properly on a campus, you need great partnerships!

Partnerships & Collaboration with Stakeholders

  • ISO
  • Legal
  • Audit
  • Risk Mgmt
  • Senior Mgmt (buy-in, elevator speeches)
  • Functional Offices (registrar, bursar/financial aid, research compliance/sponsored programs, HR/Payroll, Health-related units, etc.)
  • Compliance Cowboys: liaisons to support your efforts; train the trainer

Tools

  • Data inventories
  • Records retention & Info Mgmt strategies
  • Privacy Impact Assessments (PIA)
  • Maturity Modeling
  • Nymity’s comprehensive approach
  • Beg, borrow and steal from colleagues

External resources

  • HE-CPO group (supported by EDUCAUSE)
  • IAPP
  • Law firms
  • Vendors (webinars, free tools)
  • NACUA/AACRAO
  • FERPA|Sherpa
  • PTAC

Want to Be a Privacy Officer?

EDUCAUSE has resources; search for the Higher Ed CPO Primer, Parts 1 & 2, on their website.

 
