Trust: surveillance, truth and entanglement

These key readings, and reports of current events, shed light on one of the most urgent educational (and social) issues we face today – the relationship between trust and technology. In particular, they explore the extent to which technology is being given a greater role in recording, interpreting and reporting on human movement, social and educational activity, financial transactions, and more – often in the name of security, trustworthiness, and equity. Surveillance is ‘done’ by employers, the state and other powerful bodies, but also by individuals, and by commercial entities supplying us with goods and services. The implications of ‘surveillance culture’, as Lyon calls it, may be very significant for the future of learning. The promises and perils of an ‘information civilisation’ have implications for what we mean by truth (see, for example, the Solon article below); for the questions posed by the increasing entanglement of technology with our homes, workplaces, and even our bodies; and for the use of technology to govern many aspects of our educational experiences.

Lyon, D. (2017). Surveillance Culture: Engagement, Exposure, and Ethics in Digital Modernity. International Journal of Communication, 11, 824–842. http://ijoc.org/index.php/ijoc/article/view/5527 . Lyon is well known for his work on the surveillance society, but has recently turned his attention to what he calls “surveillance culture”, where “people actively participate in an attempt to regulate their own surveillance and the surveillance of others” (p.824). This active participation, he argues, is “formed through organizational dependence, political-economic power, security linkages, and social media engagement” (p.826). He discusses situations which go beyond the “general collusion with contemporary surveillance” (p.829) into a so-called “soft surveillance” context where populations “participate in, actively engage with, and initiate surveillance themselves”. The paper explores “surveillance imaginaries and practices”, including sharing, exposure, desire, and visibility.

Introna, L. D. (2016). Algorithms, Governance, and Governmentality: On Governing Academic Writing. Science, Technology, & Human Values, 41(1), 17–49. http://journals.sagepub.com/doi/abs/10.1177/0162243915587360 . Open Access version: http://eprints.lancs.ac.uk/76458/1/STHV_final_author_version_introna.pdf . Introna’s key argument is that the actions of algorithms – which are inscrutable and executable – are “not just in the world, they make worlds” (p.27). For this reason, he argues that we need to pay careful attention to them. He offers three perspectives on how the “problem of governing” can be thought about in relation to algorithms: that algorithms should be governed directly (by code being made more ‘open’ or ‘transparent’, for example); that algorithms enact governance themselves (for example, through facial recognition systems at airports that identify passengers); and that algorithms actually contribute to creating expertise, subjects and so on through a process of governmentality. Introna uses a governmentality perspective to analyse the workings of plagiarism detection in contemporary academic writing, arguing that:

the prevailing rationality, and the governing technology, has produced a very particular regime of practice when it comes to academic writing. The inheritance from these governing practices is complex and multiple—for example, they have enacted a particular understanding of what academic writing is, what plagiarism is, what students are, and what teachers are. (p.37)

This paper also includes a helpful walkthrough of what an algorithm actually does, and of its philosophical dimensions and implications.
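To make that walkthrough concrete, here is a deliberately toy sketch in Python of the kind of text-matching logic a detector might use. It is not Turnitin’s or any real system’s method: the n-gram size, the similarity measure and the flagging threshold are all invented assumptions – which is precisely the point, since each such choice quietly enacts a particular understanding of what plagiarism is.

    # A toy text-matching sketch. NOT how any real plagiarism detector works;
    # the n-gram size and threshold are invented assumptions, included to show
    # how judgements about 'plagiarism' get embedded in executable code.

    def ngrams(text, n=3):
        """Return the set of word n-grams in a lowercased text."""
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def similarity(submission, source, n=3):
        """Jaccard similarity between the two texts' n-gram sets."""
        a, b = ngrams(submission, n), ngrams(source, n)
        return len(a & b) / len(a | b) if a and b else 0.0

    THRESHOLD = 0.25  # arbitrary cut-off: above it, a submission becomes 'suspect'

    submission = "surveillance culture shapes how people make themselves visible"
    source = "surveillance culture shapes the ways people make themselves visible online"
    score = similarity(submission, source)
    print(f"overlap {score:.2f}:", "flagged" if score > THRESHOLD else "not flagged")

Even in this tiny example, what counts as ‘the same’ (the n-gram size), what counts as ‘suspect’ (the threshold) and what is compared against (the source corpus) are human judgements, executed invisibly and at scale.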

O’Neil, C. (2017). The era of blind faith in big data must end. TED Talk, 13:18 minutes. https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end . Taking a firm position on the issue of trust and technology, O’Neil (author of the recent book ‘Weapons of Math Destruction’) describes algorithms as “opinions embedded in code”, and urges her audience to look past the apparent objectivity of algorithms and to see them as tools that primarily work to replicate the ‘status quo’ – often with many biases attached. She argues that algorithms can be checked for fairness, and fixed.
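One way to read O’Neil’s closing claim – that algorithms can be checked for fairness, and fixed – is as an audit of error rates across groups. The sketch below illustrates one such check under stated assumptions: false positive rate parity is only one of several competing fairness definitions, and the records and tolerance figure are entirely invented.

    # A minimal fairness-audit sketch: compare false positive rates across groups.
    # False positive rate parity is one of several competing fairness measures;
    # the records and tolerance below are invented for illustration only.
    from collections import defaultdict

    def false_positive_rates(records):
        """records: (group, predicted_risky, actually_risky) triples."""
        fp, negatives = defaultdict(int), defaultdict(int)
        for group, predicted, actual in records:
            if not actual:          # the person was not actually 'risky'...
                negatives[group] += 1
                if predicted:       # ...but the algorithm flagged them anyway
                    fp[group] += 1
        return {g: fp[g] / negatives[g] for g in negatives}

    TOLERANCE = 0.05  # assumed maximum acceptable gap between groups

    records = [("A", True, False), ("A", False, False), ("A", False, False),
               ("B", True, False), ("B", True, False), ("B", False, False)]
    rates = false_positive_rates(records)
    gap = max(rates.values()) - min(rates.values())
    print(rates, "-- disparate" if gap > TOLERANCE else "-- within tolerance")

Note that even the ‘fix’ is value-laden: choosing which error rate to equalise, and at what tolerance, is itself an opinion embedded in code.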

In the news:

Quantified Life (Ajana, 2017): This short documentary by Btihaj Ajana draws out some of the possibilities and threats linked to the use of technology to continuously monitor aspects of our bodies, such as health, movement and diet. https://www.youtube.com/watch?v=qI75kMqctik&feature=youtu.be

I hacked my body for a future that never came (Robertson, 2017). Five years ago, Adi Robertson had a small magnet implanted in her index finger. As the magnet’s functionality fades, she reflects on the current state of implants and biohacking and considers why “it no longer feels like we’re in a boom period for human augmentation”. https://www.theverge.com/2017/7/21/15999544/biohacking-finger-magnet-human-augmentation-loss

Microchip Implants for Employees? One Company Says Yes (Astor, 2017). https://www.nytimes.com/2017/07/25/technology/microchips-wisconsin-company-employees.html

That company microchipping its employees is owned by a major prison vendor (Melendez, 2017). https://www.fastcompany.com/4044282/that-company-microchipping-its-employees-is-owned-by-a-major-prison-vendor

These two articles report on a recent development at a company in Wisconsin, USA, which has offered its employees an implantable RFID chip allowing them to access buildings and pay for items without needing to carry a staff card. Many employees have volunteered, raising questions for some observers about the security and privacy implications. The second report suggests that the ownership of the company in question should raise additional concerns.

Roomba’s Next Big Step Is Selling Maps of Your Home to the Highest Bidder (Jones, 2017). Roombas (automated vacuum cleaners) are not new, but even older technologies like these can be retrofitted, or their data used in new ways. https://gizmodo.com/roombas-next-big-step-is-selling-maps-of-your-home-to-t-1797187829

The future of fake news: don’t believe everything you read, see or hear (Solon, 2017). Solon’s piece covers the latest in video manipulation technologies, showing various techniques being used to digitally alter or create speech that can appear to come from any person, including political figures. She argues that different kinds of media literacy will be required in the future to distinguish real from manufactured video speech. https://www.theguardian.com/technology/2017/jul/26/fake-news-obama-video-trump-face2face-doctored-content

Additional readings:

Baym, N. K. (2015). Personal Connections in the Digital Age. Polity Press. Retrieved from https://ebookcentral.proquest.com/lib/ed/reader.action?docID=4030041 . A number of chapters of Baym’s book might be of interest to those wishing to explore ideas of trust, relationships and connection further for their position paper or OER.

Crang, M. and Graham, S. (2007). Sentient cities: Ambient intelligence and the politics of urban space. Information, Communication & Society, 10(6), 789–817. http://www.tandfonline.com/doi/abs/10.1080/13691180701750991 . Open access version: http://dro.dur.ac.uk/5154/ . This paper argues that ubiquitous ICTs are changing urban space. To frame their wide-ranging overview, Crang and Graham split urban ubiquitous computing into three categories: commercial tracking, security surveillance and artistic appropriations of technology.

Zuboff, S. (2015). Big other: surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30, 75–89. https://link.springer.com/article/10.1057%2Fjit.2015.5 . (No official open access version, but searching should turn one up.) In this paper, Zuboff argues that big data serves as the foundation of a new “logic of accumulation” she has termed “surveillance capitalism”, which “aims to predict and modify human behavior as a means to produce revenue and market control” (p.75). She theorises and explores in depth why metaphors like ‘extraction’ and ‘data exhaust’ are problematic. Of particular interest to us this week is her discussion of the use of big data and surveillance practices to erase the very concept of trust by “emptying the contract of uncertainty” (p.81).

Discussion questions:

To what extent do you agree with Introna that plagiarism detection algorithms embed an idea of economic exchange into academic writing practices? Are there implications for trust here?

Lyon claims that ethical questions arise over “how we are made visible and how we make ourselves visible, or cloak our visibility” (p.834) – what sorts of questions about futures for learning might flow on from these?

What are the dimensions of digital citizenship that you see as becoming more important in the future?

O’Neil engages with Introna’s first perspective (that we should govern our algorithms). How might you respond to, or develop, her arguments by drawing on Introna’s ‘governmentality’ perspective?