Benign Surveillance – some rough notes

A chance meetup with a colleague, friend and mentor today led to a short conversation with him and another colleague about data, surveillance and caring for our students, and to some follow-on thoughts which I’m scribbling here for posterity and future thinking. I made my brain squeak.

We were discussing our way around issues of surveillance and being seen within the institution, whether there can be forms of benign surveillance, and how the relationship between students and academics has changed with increasing scale and digital mediation (not unconnected). In particular we touched on the extent to which:

  • Students might be comfortable with being seen by individual academic colleagues versus being seen by the institution.
  • Digital has changed who can see.
  • Students might increasingly start importing distrust into the institution based on prior digital surveillance experiences elsewhere.

We assume we start from a position of trust in Universities – but do we? Is trust an old-fashioned concept?

In my own experience I’ve encountered concerns from students about being visible to academics in ways that have surprised me – for example, wanting to be anonymous in discussion forums so that teachers can’t identify you if you are asking “stupid questions” or getting it wrong. For me it speaks directly to the breakdown in the academic relationship that we were talking about. That’s exactly the sort of scenario in which a teacher ought to be able to reach out and intervene in ways that are supportive and helpful, and which students would welcome.

We talked about the language of “cherishing” our students in our draft new Strategic Plan, and whether relationships with students should be based on closer human connection. Fundamentally we talked about education as a route to socio-economic mobility (arguably with little evidence to support its effectiveness) versus the wider definitions of wealth that investing in the education of a population might engender.

Edit: We also talked about the extent to which students always know what is best for them. I will come down strongly on the side of teachers as having useful knowledge and wisdom, but we also need to acknowledge that the education system now is not the one most of us were educated in.

Walking back up the road afterwards it got me thinking about how much being an effective part of a community involves seeing and being seen, and how we subject ourselves knowingly to certain forms of surveillance in order to achieve this. At a small scale – like the seminar I sat in today – I am totally comfortable with being seen, possibly even having my name ticked off on an attendance register. On a larger scale though – Facebook, Twitter etc. – my comfort level is very different.

“Individuals quickly came to depend upon the new information and communication tools as necessary resources in the increasingly stressful, competitive, and stratified struggle for effective life. The new tools, networks, apps, platforms, and media thus became requirements for social participation.” (Big other: surveillance capitalism and the prospects of an information civilization, Zuboff, S)

Liz McFall’s recent EFI talk about the extent to which our data-selves are really representative of our human selves took me back to the Data Drag project from MozFest last year, and more generally to the idea of data doubles:

“The theory of the Quantified Self is explored through a queer lens where data collection is considered as drag, that’s to say, the computational production of fictions of the self.” (John Philip Sage)

“The data double, however, goes beyond representation of our physical selves—it does not matter whether the double actually corresponds to the ‘real’ body. The data double constitutes an additional self, a ‘functional hybrid’…serving foremost the purpose of being useful to institutions, which allow or deny access to a multitude of domains (places, information, things) and discriminate between people.” (Bentham, Deleuze and Beyond: An Overview of Surveillance Theories from the Panopticon to Participation, Galič, Timan & Koops)

As a partial counterpoint to Zuboff I was also reminded of a paper I read on surveillance artworks – engaging the viewing subject directly in order to prompt more critique of surveillance. This feels like productive territory for exploring some of these issues, flipping “being seen” into something less psychically numbing.*

“Contemporary ways of being seen undoubtedly possess objectifying and controlling valences, but they may also afford new forms of connection and ethical responsibility among strangers.”

“By fostering ambiguity and decentring the viewing subject, critical surveillance art can capitalize on the anxiety of viewers to motivate questions that might lead to greater awareness of surveillance systems, protocols and power dynamics. Works that use participation to make viewers uncomfortable can guide moments of self-reflexivity about one’s relationship – and obligation – to others within surveillance networks.”

(Ways of being seen: surveillance art and the interpellation of viewing subjects, Monahan, T)

At the end of thinking about this I’m left feeling a bit bewildered, and now wondering whether in the rush to use data to see our students more clearly at scale, we’ve instead actively prompted being seen critically by our students.


* Which is a reminder to get back onto whether we could have a Glass Room exhibition at Edinburgh sometime.

8 thoughts on “Benign Surveillance – some rough notes”

  1. Thanks A-M. Valuable stimulus. A few quick responses.

    On trust, I don’t think that we can see it as old-fashioned. It is a fundamental psychological mechanism. We may be seeing a loss of trust around the place – in that we see less of our institutions as being worthy of trust – but the mechanisms of seeking out places and people upon whom trust can be bestowed go on. On this point about the loss of trust, I would highly recommend the recent Reith Lectures by Prof Onora O’Neill (OK – *I* think of 2002 as “recent”):
    https://www.bbc.co.uk/programmes/p00ghvd8
    We do well not to trust in the manifest absence of trustworthiness.

    The point about anonymity and “silly questions” is indeed indicative of a serious problem. And I would agree that this is about relationships. Perhaps there is a matter of symmetry here. In order to trust, we must make ourselves vulnerable to the other. Are we asking our students to trust us in that sense, in the absence of any evidence that we trust them?

    The “cherish” word is an interesting one. Key here is not what we (as teachers and supporters of learning) do, but how the students feel; about themselves, and their place in the institution. They need (and deserve) to believe that the University cares about them, is glad that they are here, and would suffer loss if they were not. Do our students feel that they are members of – that they belong to – the University? I happened to see a couple of people having their photos taken in Old College the other day complete with wands and Hogwarts gowns. It occurred to me then that I certainly see more Gryffindor scarves around the place than University colours. 🙂 Now that *is* old fashioned.

    There are important ideas here around terms like “gaze” and “regard”, and how motives are understood. We don’t like to feel that we are being watched. But it is comforting to feel that we are watched over by some trusted agent or agency. We want students to trust our institutions, but those institutions need to be (and to show themselves) worthy of that trust.

    1. Thanks for your thoughtful comments Hamish – very much appreciated.
      It seems trite to say I agree, but, well, I agree.

      Regarding trust as old-fashioned – that was a provocation, of course – it’s fundamental as you state, but I fear that in some areas of education we trust in data as some kind of objective higher truth and a short-cut to the delivery of care. Building and sustaining trust between humans is an ongoing effort. Whether we get it right all the time might be less important than making the effort.

      Your point about vulnerability is well made, and I’d immediately ask whether many of our colleagues are perhaps feeling an excess of vulnerability in the current climate? Precarity and aversion to risk go hand in hand, as we know. However, I do see hints, across a number of the areas that I work in, of a more pervasive lack of trust in students (widespread use of plagiarism detection; concerns about students misusing recordings / not turning up to lectures; concerns about writing rude things on the internet). I also see our students make direct correlations between things like how easy it is to find something in the VLE and how much we care – it feels like we haven’t considered them – that we haven’t made the effort.

      I was tempted at the end of the post above to go for a glib rhetorical flourish, and ask whether our institutional practices around the use of technologies are (unwittingly) functioning as participatory works of performance art, prompting a critique by students of institutional surveillance without care. That’s probably not helpful, but the paper on surveillance art did flip my thinking on its head. In considering issues of surveillance we’re not paying close enough attention to how we are seen by students.

      Or maybe we are. Arguably the NSS is another form of surveillance.*
      </glibRhetoricalFlourish>


      * and then I thought about the NSS as another data double and remembered that Ben W has already covered this: https://wonkhe.com/blogs/policy-in-numbers-what-counts-without-counting/
      “A university made out of numbers is not the same one that was measured — it’s a “data double”, or an aggregated representation put together from digital traces. What that data double tells others about the institution, however, shapes what others think about it, influencing their choices and decisions.”

  2. Thanks for this Anne Marie. I haven’t been teaching for 6 years now (except as a Volunteer IT Buddy at the Library, and that’s a different relationship). In becoming a lecturer in HE many years ago, some of my most significant reflections were around relationships with students. I soon came to the realisation that my first duty was to treat them as adults and kindly. As a personal tutor/programme leader I would listen and not tell without their permission. One of my first dilemmas was handling a phone call from an angry Greek father who was demanding, without success, to know his son’s grades as he was paying the fees. I dealt with it by talking to the son, who eventually told his father the harsh truth of what had been happening.
    The term “benign surveillance” has been rolling round my brain since I read your post earlier today and I still don’t like it. It’s as though the benignness is trying to soften the surveillance.
    Here’s something I wrote about surveillance a couple of months ago: https://francesbell.com/bellblog/epiphanies-thanks-to-shoshana-zuboff/. As my subject discipline was Information Systems, I was very keen on encouraging students to think about the ethical implications of the tech and systems they were implementing. One good approach was to examine University systems for which they, the students, were the users – it concentrated their minds on the assumptions made, and on what happened to their creative works and personal data.
    I could go on about this all day 🙂 but I will share something I was thinking about recently. I was wondering about the student registration process, legitimate interest and informed consent. I find it difficult to imagine a situation in which a brand-new student – in some cases an international student – could possibly give informed consent to the downstream use of their personal data on systems they have not yet used. Obviously, you brilliant folk may have come up with something – I hope so.
    What worries me most is that the combination of the complexity of systems and ToS and business in general makes informed consent virtually impossible. There’s a great example in Chapter 1 of the new Zuboff book (you might be able to read some of it here https://www.flickr.com/photos/francesbell/32213102407/) of the movement from the original concept of the Aware Home to where we are now. In some ways it’s a failure of Computer Science education but also a lack of the right sort of research. Just as the data is so available for further manipulation and interpretation, so the research too becomes about solutions looking for problems. I am just starting to read Zuboff’s book but I suspect that I’ll find she is good at identifying the problems while the solutions may be more elusive. I read Morozov’s 16,500-word book review before I started on Zuboff’s book.
    What I feel in my bones is that I hope that students can graduate with more questioning and resistance and less lethargy and shrugging their shoulders.

    1. Frances – many thanks for your (as ever) thoughtful comments.

      I want to pick up on the questions you raise about informed consent – you very rightly point out that it would be hard for new students to give informed consent as part of the registration process. The law would agree with you; GDPR is very clear that informed consent cannot exist in situations where there is a significant power imbalance:

      “Consent means giving people genuine choice and control over how you use their data. If the individual has no real choice, consent is not freely given and it will be invalid.” (ICO)

      If you look at most institutions you will find that they’re not asking for student consent for the use of data. GDPR demands that a lawful basis for processing data is identified and gives six choices. Typically you will find that the lawful basis for some of the core data held in the student record will be covered by Contract, or Legal Obligations (some of this might be dependent on the underpinning legal charter of the University). Beyond that, the most heavily used lawful basis is almost certainly Legitimate Interests. Students have no options about the use of their data under any of these. That it is happening should be transparent, but we’re back into who-reads-the-ToS territory; your point about trying to ensure our students are critically engaged is well made there. If you read Shoshana Zuboff’s 2015 paper (quoted above) then the concept of psychic numbing is important, I think:

      “Powerful felt needs for effective life vie against the inclination to resist the surveillance project. This conflict produces a kind of psychic numbing that inures people to the realities of being tracked, parsed, mined, and modified – or disposes them to rationalize the situation in resigned cynicism.”

      This probably isn’t what you wanted to hear!

      1. Thanks so much Anne Marie. I rather suspected that what you say about legitimate interest might be the case, having read Martin Hawksey’s slide show about GDPR a couple of months ago. I don’t know if you’ll be in Galway but I’d love to talk to you some time about this offline. I shouldn’t have troubled you with it here.

        1. I will be in Galway and I would love to talk to you more about all of this. No need to worry about offline versus online either. Happy to talk about this stuff wherever.

    1. Seen it twice now at MozFest – it’s really compelling.
      I was also carrying a copy of the Tactical Technology Collective’s Data Detox Kit when I was in Vancouver with you all. Meant to pass it on to Brian (some new cute little 10-minute security check cards were included) but I failed to remember.
