
Ethics and Analytics: Limits of Knowledge and a Horizon of Opportunity

Title:  Ethics and Analytics –  Limits of Knowledge and a Horizon of Opportunity

Presenters:

  • James E. Willis III, Ph.D., Purdue University
  • Matthew D. Pistilli, Ph.D., Purdue University

 

(See my related post, “The Edward Snowden Affair: A Teachable Moment for Student Affairs and Higher Ed”)

 

This was a highly interactive session; we sat in groups of five and discussed among ourselves.

SLIDE:  Lack of Discussion of Ethics in Data and Technological Innovation

Why?

  • Cutting-edge technologies are being developed every day
  • Cost/profit margins are determined by the speed of development
  • Typically, educational lines are split between science and humanities, and between technology and ethics
  • Difference between what can be done and what should be done

 

SLIDE:  Where to Go from Here?

  • Begin the discussion now
  • Have the difficult conversations
  • Bring together the stakeholders:  data scientists, engineers, managers, ethicists
  • Establish a framework to adapt quickly to questions of whether or not an action or research agenda should occur

 

SLIDE:  Ethical Discussions = Innovation

Logic of negation

  • Why shouldn’t we do this?
  • What should we do instead?

Different paths emerge from divergent conversations

  • Stakeholders have different voices and understandings of potential paths of development

 

SLIDE:  Potter’s Box

Definition

  • Identify values
  • Appeal to ethical principles
  • Choose loyalties
  • Responsibilities and recommendations

How Potter’s Box is useful for ongoing discussions

If you know a student is at risk, what do you do?  What is your responsibility?
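
To make the four steps concrete, here is a minimal sketch (Python, purely illustrative) of how one pass through Potter’s Box might be recorded for the at-risk student question. The field names, example values, and recommendation are my own assumptions, not anything the presenters proposed.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PotterBoxAnalysis:
        """One pass through the four Potter's Box steps for a single decision."""
        situation: str                                        # define the situation
        values: List[str] = field(default_factory=list)       # identify values
        principles: List[str] = field(default_factory=list)   # appeal to ethical principles
        loyalties: List[str] = field(default_factory=list)    # choose loyalties
        recommendation: str = ""                              # responsibilities and recommendations

    # Hypothetical example: the "at-risk student" question from the session.
    at_risk = PotterBoxAnalysis(
        situation="A predictive model flags a student as likely to fail the course.",
        values=["student autonomy", "institutional duty of care", "privacy"],
        principles=["do no harm", "informed consent", "fairness"],
        loyalties=["the student", "the instructor", "the institution"],
        recommendation="Notify the instructor and offer, not impose, support resources.",
    )

    print(at_risk.recommendation)

Working through the same situation with different loyalties ranked first is exactly the kind of divergent conversation the earlier slide describes.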

 

SLIDE:  Research Questions

  • What do the ethical models look like?
  • How are these models deployed rapidly – at the speed of technology?
  • How are these models refined with time?

 

SLIDE:  Group Discussion 1

What are the current projects going on in learning analytics today?  What are the potential ethical pitfalls that surround these developments?  Why are they potentially harmful?  Are these things always wrong or are they contextually wrong?

Some of the ideas generated by the room:

  • Type I / Type II error:  will the student pass this class?  What’s my prediction – will the student pass or fail the class?  How accurate is that prediction – did the model work?  (A minimal sketch follows this list.)
  • Is it right to get information from the students?  Where does the locus of control lie?  Does the institution need to take more responsibility for this?
  • University One (FYE equivalent in Canada) – informal collection and sharing with the next year.  Are we branding / labeling students appropriately?  Are we sharing this information appropriately?  Who should know what and at what point?  Will that affect their future studies?
  • Top-down approach using a product funded by the Gates Foundation (similar to the InBloom project).  The goal is to make a product that analyzes what courses students should be taking.  Pitfalls:  we don’t know the model, and we don’t know the raw data.
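
To make the Type I / Type II point from the first bullet concrete, here is a minimal Python sketch with invented pass/fail predictions for ten students. The numbers are fabricated for illustration, and which count gets labeled “Type I” depends on how the prediction question is framed; here a false “will pass” is treated as the Type I error.

    # Hypothetical pass/fail outcomes and predictions for ten students (1 = pass, 0 = fail).
    actual    = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
    predicted = [1, 0, 0, 1, 1, 1, 1, 0, 0, 0]

    pairs = list(zip(actual, predicted))
    true_pos  = sum(1 for a, p in pairs if a == 1 and p == 1)
    true_neg  = sum(1 for a, p in pairs if a == 0 and p == 0)
    false_pos = sum(1 for a, p in pairs if a == 0 and p == 1)  # predicted pass, actually failed (Type I here)
    false_neg = sum(1 for a, p in pairs if a == 1 and p == 0)  # predicted fail, actually passed (Type II here)

    accuracy = (true_pos + true_neg) / len(pairs)
    print(f"Type I errors:  {false_pos}")
    print(f"Type II errors: {false_neg}")
    print(f"Accuracy:       {accuracy:.0%}")

Accuracy alone hides which kind of error the model makes, and the two kinds carry different ethical weight for the student.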

 

SLIDE:  Group Discussion 2

What is the role of “knowing” in predictive analytics – once something is known, what are the ethical ramifications of action or inaction?  What are the roles of student autonomy, information confidentiality, and predictive modeling in terms of ethical development of new systems / software / analytics?

  • How much do we really know?  If we have a high level of confidence, what is our responsibility?
  • Could it be solved by giving students opt-in versus opt-out?  (See the sketch after this list.)
  • Discovery can lead down paths that raise ethical questions about students who may not succeed, e.g., the financial burdens incurred due to failure.
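
A small sketch of the opt-in versus opt-out question above (hypothetical policy names and logic, not any particular system): the only difference is what happens when a student takes no action at all.

    from typing import Optional

    def may_collect(student_choice: Optional[bool], policy: str) -> bool:
        """student_choice is None when the student has taken no action yet."""
        if student_choice is not None:
            return student_choice          # an explicit choice always wins
        return policy == "opt-out"         # opt-out collects by default; opt-in does not

    print(may_collect(None, "opt-in"))     # False: no action, no collection
    print(may_collect(None, "opt-out"))    # True: no action, collection proceeds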

 

SLIDE:  Group Discussion 3

How might we affect the development of future analytics systems by having ethical discussions?  What are possible inventions that could come from these discussions?

  • See more active participation from the students, with a better understanding of how their data is going to be used.
  • Obligation to allow students to fail; channeling people into things they can be successful with.
  • EU regulations:  disclosure, portability, and the right to be forgotten
  • Portability:  health care analog; your info can go from one hospital to another, and that data has real-world benefits to patients.
  • Information can also say a lot about faculty too.

 

SLIDE:  Group Discussion 4

What are the frontiers of ethical discussions and technology?  Are there new frameworks?  How can we affect the future by today’s discussions?

  • Maybe you can redefine what student success is?  Maybe it isn’t graduation rates…
  • How do we affect the future?  It’s everyone’s role to be a part of this conversation!  You don’t really have the option to say “I don’t have the time for this.”

 

 

By Paul Schantz

