Image: black and white animation of moving cogs, with Charlie Chaplin (the silent film star) riding one of the cogs with a spanner in his hand.

Why we don’t have good edtech decision making frameworks: It’s a bigger problem than HE

You know sometimes you get that moment of delicious friction when two small ideas rub up against each other and you see something new that you realise should have been obvious all along….?

One of the benefits of my little self-imposed sabbatical is that I have the time and space to do work that interests me. Some folks know that I’ve been involved with various open source groups since around the mid-2000s, and for the last 5 years I’ve been on the board of the Apereo Foundation (and Board Chair for the last 3 years). Just a few months ago I was also delighted to be accepted onto the board of the Open Source Initiative. I’ve had more time to give to these organisations recently and it’s been fun and enlightening immersing myself in broader open source communities and discussions.

We’ve got a lot on at Apereo at the moment and so I’ve been picking up work in a couple of areas to support our Executive Director. I’ll write another blog post (or two, or three) about Open Source Programme Offices (OSPOs), but the other area we’ve been working on is the EU Cyber Resilience Act. I say “working on” – our activities are mostly focussed on awareness raising, rather late in the day. If you want to understand more about why the CRA could be a problem for open source (and remember that most commercial software is also built on open source, so this is everyone’s problem), this talk from Mike Milinkovich of the Eclipse Foundation is well worth your time (or this blog). The Eclipse Foundation has been taking a leadership role in work to try to ensure that EU lawmakers have all the information they need to make good, workable legislation that avoids some nasty unintended consequences.

I had a very useful call with a colleague yesterday who is working with Apereo to support our CRA awareness raising activities, and in our discussions they mentioned several times that the problems we are seeing with the formation of law around cyber security (and AI) are in no small part because software has never been regulated before. The standards bodies don’t have substantial prior experience, the consultation frameworks don’t exist, and there may simply not be enough digital expertise at all the levels where it’s needed.

This morning Tim Fawns dropped me a DM and asked me what thoughts I had about universities buying student licenses for AI tools to ensure equal access. I think he knew I’d share his instinctive unease that this was a complicated space that needed thinking through, and he wanted to pick my brain, as he knows I’ve been ranting about both AI and procurement.

And *click* – the pieces fell into place. OF COURSE universities don’t have good frameworks for assessing the issues around buying technology. Why would they? It’s been an entirely unregulated area, and literal nation states are struggling to take the first steps into this space. I don’t mean to say this to let us off the hook, btw. I still believe that we already have incredibly useful tools and frameworks within our walls, in our research ethics and ethical supply chain spaces, that we can choose to leverage. We also have models like the GDPR legitimate interests balancing test and the ALT FELT framework to inspire us. But the problem is bigger than us, which (duh!) should be obvious from Facebook, Twitter, and all the other dumpster fires.

I’m immensely grateful to have the time and space at the moment to think and see a bigger picture than just HE, and now I’m thinking about the ways that accessibility and data protection legislation have interacted with edtech procurement and practices in HE, and about how future regulation in areas like cyber security, AI, and more will continue to do so. Higher Education is largely a digital business these days (please forgive the language), and so our responses to this evolving legislative landscape can’t simply be reactive. There are places and spaces of policy (such as the CRA) where the effects on us will be material (problems sharing open source software produced as part of research, rising costs for commercial products, etc.) and where those who can lobby lawmakers won’t necessarily be acting in our interests (*cough* OpenAI, I’m looking at you right now). As always, I don’t have answers, just mounting piles of concerns and problems. At the very least, it is abundantly clear that existing consultation frameworks aren’t sufficient and that major stakeholders are being missed.

And do I need to rant again about how daft anglophone HE has been in stepping back from engaging with open source software and communities over the last 15 or more years? Another sweeping generalisation, I know, but it’s a trend and there’s no denying it.
