
Digital Redlining

Presenter

  • Chris Gilliard, Professor, Macomb Community College, @hypervisible (a great follow)

Chris started by talking about patents, specifically Facebook’s “Customer Recognition System.” This product supposedly establishes a “trust level” and provides feedback to businesses about whether a customer can reasonably afford a particular product. Chris then provided a narrative about car doors being electronically locked when threatening-looking black men walk near cars at a red light. Systems and situations like these are closely tied to ideas of inclusion and who gets access to what. If systems can’t recognize you (e.g., cases of racial or gender misidentification), the systems put that burden on you.

Another narrative: a facial recognition research project on a university campus took photos of student faces for several months. The images were stored in a database and used for further analysis. This data was later shared with a company in China for use by its government. Yikes.

A definition of redlining was shared, along with a map of Detroit’s Black neighborhoods in the 1940s. Maps like this tell you a lot about where things are, like sewers, dumps, and so on. There’s a site called Mapping Inequality that provides downloadable maps like this. The Birwood Wall (1941) is a six-foot-tall wall in Detroit that delineated black versus white neighborhoods. The wall doesn’t actually stop anyone, which is kinda the point.

Digital redlining: the practice of creating and perpetuating inequities between racial, cultural, and class groups specifically through the use of digital technologies, digital content, and the internet. … It can also be used to refer to inequities caused by the policies and practices of digital technologies.

Facebook used to provide an “ethnic affinity,” which was part of the way they identified you (note: this is not how you identify yourself). This was used for targeting advertisements. It’s also illegal in some cases (e.g., housing, as described above). FB says they no longer allow this, but research from Northeastern and USC shows that it still occurs because of the very nature of the way ad delivery works.

I became aware of content filtering on my campus by working with my students on a project researching revenge porn (now called “nonconsensual intimate imagery” – exercising the old euphemism treadmill, haha). The filter thought my students were looking for porn, but that isn’t at all what they were looking for. The tool filtered things in a very deterministic way. Generally speaking, the more resources a campus has, the more access to information it has…ALL information.
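To make the “deterministic” point concrete, here is a minimal, hypothetical sketch (in Python) of how a naive blocklist filter behaves. This is not the campus product’s actual logic; the blocklist terms and function name are invented for illustration.

# Hypothetical sketch of a naive, deterministic keyword filter.
# Not the actual campus filter; blocklist terms and names are invented.
BLOCKLIST = {"porn", "pornography"}

def is_blocked(query: str) -> bool:
    """Flag a search query if any blocklisted word appears in it."""
    words = query.lower().split()
    return any(term in words for term in BLOCKLIST)

# A student researching "revenge porn" legislation gets blocked...
print(is_blocked("revenge porn state legislation"))       # True
# ...while the same topic under its newer name slips right past the filter.
print(is_blocked("nonconsensual intimate imagery laws"))  # False

Both queries are about the same legitimate research topic; only the surface keywords differ, which is exactly the kind of crude include/exclude decision the talk was pointing at.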

Chris then shared a short fiction story that read like a Black Mirror pitch (it was good). “There are many parallels between prison tech and ed tech” (lots of laughter at that). Chris collects examples of absurd or extractive practices in technology. One example was a kid-tracking app that is essentially the same as tech used to track ex-cons. Another was Jeff Bezos describing a school where “…the child is the customer.” Another app tracks facial metrics while a child reads on a tablet. Another company, BrainCo, monitors kids’ EEG activity and reports to teachers to gauge student attention levels. Then he shared a slide with the title of a recent EDUCAUSE article, “Analytics Can Save Higher Education. Really.” Chris then shared a few articles about how college admissions offices use data to track and rank prospective students before they apply. Many learning analytics tools take on a paternalistic tone in their operation; opt-in and opt-out policies are really important to think about.

Algorithms predict, but some tasks are not or should not be focused on prediction…and I don’t think that education is a predictive task.

By Paul Schantz

CSUN Director of Web & Technology Services, Student Affairs. husband, father, gamer, part time aviator, fitness enthusiast, Apple fan, and iguana wrangler.
