I’ve written this piece to revisit some older thinking, but also to state my position as part of my contribution to a presentation at the APT conference on 01 July at UCL. My colleagues Jenny Scoles and Timothy Drysdale will be there to present. Since I won’t be, the best I can do is give them something to point to if someone asks an awkward question. The presentation has quite a wide scope and I’m going to dig into one area of it here: namely, why we need to develop some of our own learning technologies, and therefore what kinds of learning technologists we might need for the future.
I wrote a post almost a year ago about learning technologists and some of the new challenges that we face. I’m returning to it in the light of recent conversations and papers about non-traditional practical work in a science education context:
“The issues around learning technology that we now need to be alert to require us to have a higher level of understanding about platforms, systems, code, data, and ethics. As learning technologists we listen to and learn from our academic colleagues about how they are using technology in their practice; and bring our specialist knowledge to advise on the use of various technologies in specific pedagogical contexts. We need to move beyond just thinking about learning technologies as tools (or efficiency mechanisms) and consider the generative ways in which technologies might act as co-agents alongside teachers.”
I was arguing that I saw early signs and portents that the kinds of things a learning technologist might need to engage with were changing. I was tentatively arguing for a more post-humanist frame for thinking about learning technology. Specifically, I thought we would need more learning technologists with data handling and programming skills to meet our new challenges.
Learning technology ecosystems
Anyone who knows me knows that I have no real interest in talking about VLEs. I think they’re necessary and I think we need to do better in terms of the design and usability of courses within them, but I believe that most mainstream VLEs are best thought of as commodity infrastructure, and that they have also become the epitome of the “platform university” (see recent announcements by Instructure for instance).
I also see significant dangers in rushing to outsource all our educational technology. At the very least we risk being hamstrung by what the market provides and restricted to a genericised view of learning technology delivered via a selection of black boxes. At worst I think we dumb down the potential for technology in education and don’t adequately prepare our students for the world around them.
I’m not arguing for a purist approach here; rather that we are thoughtful and intentional about our learning technology choices, recognising that the market cannot always provide, and retaining some capability to innovate as a sector for our specific needs. I’m not alone in thinking this. The recent Near Future Teaching project (“co-designing a values based vision for digital education”) run at Edinburgh by Prof Sian Bayne reached similar conclusions:
“Provide teaching staff and students with central access to programmers and developers for joint prototyping and trialling of new ways of doing digital education. Support associated pedagogic research through the Principal’s Teaching Award Scheme and other channels.” (Vision and Aims – Playful and Experimental: Near Future Teaching Final Report)
With that in mind then, we should be embracing as normal the notion of a wider ecosystem of learning technologies; a constellation of digital things, more or less loosely joined. Some might call that an NGDLE. I might not (or I might, but in another blog post I haven’t finished yet).
One recent example of where the market doesn’t provide, and where we need some capability to innovate for our specific needs, is the subject of our APT presentation, which describes the development of simulation activities for students as part of an engineering design project.
“A web-app was built that simulated a solar panel installed on a hill, with accurate sun movements. To develop student digital literacies as part of the activity, a facility for programmatic control of the solar panel orientation was included. There were threshold concepts (Meyer and Land, 2003) that needed to be understood, such as subtracting the signal from one sensor from another in order to work out the error in the orientation of the solar panel.”
This is a good example of developing a piece of learning technology that’s tightly bound to the course context and desired pedagogical outcomes. It’s also a good example of the kind of thing that’s only currently possible if academic colleagues have programming skills because there is no commercial supplier for this kind of learning technology and in many cases no local resource available to help develop it.
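The sensor-subtraction idea in the quote above can be sketched in a few lines. This is a hypothetical, simplified model, not the actual web-app: the function names, sensor geometry, and controller gain are my own illustration. Two light sensors mounted either side of the panel each read an intensity that falls off with misalignment; subtracting one reading from the other gives a signed error, which a simple proportional controller can use to steer the panel towards the sun.

```python
import math

def sensor_reading(sun_angle, sensor_angle):
    """Intensity seen by a sensor pointing at sensor_angle (radians)."""
    return max(0.0, math.cos(sun_angle - sensor_angle))

def orientation_error(sun_angle, panel_angle, offset=math.radians(15)):
    """Left-minus-right sensor signal: the threshold concept.

    The two sensors are mounted `offset` radians either side of the
    panel's facing direction; a positive result means the sun is to
    the left, so the panel should rotate that way.
    """
    left = sensor_reading(sun_angle, panel_angle + offset)
    right = sensor_reading(sun_angle, panel_angle - offset)
    return left - right

def track(sun_angle, panel_angle, gain=0.5, steps=50):
    """Proportional controller nudging the panel towards the sun."""
    for _ in range(steps):
        panel_angle += gain * orientation_error(sun_angle, panel_angle)
    return panel_angle
```

The pedagogically interesting step is `orientation_error`: students have to realise that neither sensor alone tells you which way to turn, but the difference between them does.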
My colleagues in Engineering would argue that in order to do more of this kind of thing in the future something analogous to the current technician role is required. There’s a long history of technicians fabricating physical things for in-person lab work, and so it’s not a huge conceptual leap to assume we might now need people to fabricate digital things for online virtual-lab work. I agree entirely, and I’d extend that thinking beyond STEM education as the Near Future Teaching vision advocates.
Since that blog post in June last year I’ve been talking with a number of colleagues about an expanded view of our learning technology ecosystem, and the attendant expansion of the role of the learning technologist that goes along with it. If we accept that the digital pervades education (it does), then I think there are very real and significant risks to institutions in not investing in and embracing this wider conception of our learning technology ecosystem and the expanded role of the learning technologist. Again, last year I said:
“It also needs to be possible for as many of our academic colleagues and our students as possible to engage meaningfully and critically with the increasing digitisation of education, and bring as diverse a set of perspectives as possible. Otherwise the risk is that their voices will be marginalised, and other interests and biases will come to the fore. Learning technologists have a vital role to play in this space I believe, but it requires institutions to invest in time, staff development and digital skills for all of us.”
I’ve argued in these conversations for us to have the capacity to develop learning technologies for ourselves and, where possible, to use or release them as open source.* This has converged with some recent rants on the blog, including the need for the sector to do more for itself generally, and some thoughts about sustainability for non-traditional practical work specifically. I think it is vitally important that we retain the capacity to develop and implement for ourselves as a sector, as commercial educational technology will by its nature tend towards market share and monopoly (and therefore inherent genericisation). We need to ensure that the voices and requirements of our diverse academic communities are reflected in our learning technologies, and that we drive innovations and shape our own future, not merely react to one handed to us. A few colleagues have made this argument more eloquently than I:
In her recent University World News article Laura Czerniewicz wrote:
“Emergent opportunities in the digital era are just that – emergent. Despite the confidence of futurists, no one knows for sure how the confluence of current trends will play out.
But in this setting, the fall of the innovation dice is increasingly weighted to serve vested economic interests. It is therefore essential that universities enable and enlarge innovation spaces to expand the possible, and to do so with an agenda that serves social needs.”
And in a recent paper Jeremy Knox also writes:
“Indeed, there is good reason to think that utopic visions of technology for education could be reclaimed where public, collective ideals are built-in to the digital systems used for educational activity, instead of allowing private enterprise to increasingly encroach on teaching and learning practices through corporate platform models.”
Finally Ben Williamson highlights the real dangers that he sees in the wholesale outsourcing of learning technology:
“At a time of budgetary austerity and myriad other pressures on HE, the platform university is a perversely pricy technical solution to complex structural, institutional, and political problems. Significant HE spending is now flowing from universities to platform providers, along with data they can use to their own advantage as market actors in an emerging sub-sector of platform capitalism. Unless universities act collectively in the sector’s own interests, they may find themselves positioned as educational product providers and data collection partners for the new HE platform industry.”
Challenges and possible solutions
So far, so utopian. I work in an institution that is well resourced, but even here we tend to favour outsourcing more often. I’m not going to shy away from the fact that this kind of learning technology development capacity needs money and resources. Fundamentally it needs us to see some kinds of learning technologies as more than commodity IT infrastructure, and to invest in them accordingly. IT departments with budgets under pressure and student expectations about service levels to meet will find this deeply uncomfortable territory. I get that. It’s still necessary. We can manage it, though, and IT people have some of the skills and thinking we need here. For example:
- Can we strike some balance around Infrastructure as a Service type options? We could at least aim to standardise the infrastructure and development tools that we use.
- Are there common development frameworks or components that we could adopt? For instance, our APT example used the Blockly library from Google.
- We can try to standardise around delivery frameworks. A neat example is the Tsugi project, which provides an LTI-enabled “app store” of tools that can be very easily plugged into the VLE. By adopting common standards like LTI and frameworks like Tsugi we can have visibility of the things we develop, be clear about what data is going where, and get maximum use out of them across our institutions.
- In a sector of diverse capacity and resource, we also need to think about solutions at a sector level. Where can we share? How can we support local remix to ensure that one institution’s learning technology doesn’t become another’s “black box”?
- Can we engage students in co-creation of some of these technologies? Combined with sharing at sector level and the potential for remixing / refactoring locally this offers a myriad of rich pedagogical opportunities.
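To make the LTI point above concrete, here is a minimal sketch of how an LTI 1.1 launch is signed and verified. Frameworks like Tsugi handle this for you, but seeing it spelled out shows what “adopting common standards” buys us: the VLE signs a form POST with OAuth 1.0a HMAC-SHA1, and the tool recomputes the signature to decide whether to trust the launch. The URL, key, and secret below are invented for illustration; the signing steps follow the OAuth 1.0a scheme that LTI 1.1 specifies.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def _enc(s):
    # RFC 3986 percent-encoding, as required by OAuth 1.0a
    return quote(str(s), safe="~")

def sign_launch(url, params, consumer_secret):
    """Return the oauth_signature for a POSTed LTI 1.1 launch."""
    # Normalise: encode, sort, and join all params except the signature
    pairs = sorted((_enc(k), _enc(v)) for k, v in params.items()
                   if k != "oauth_signature")
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    # Signature base string: METHOD & encoded-URL & encoded-params
    base = "&".join(["POST", _enc(url), _enc(param_str)])
    key = _enc(consumer_secret) + "&"  # no token secret in LTI 1.1
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

def verify_launch(url, params, consumer_secret):
    """True if the posted oauth_signature matches our recomputation."""
    expected = sign_launch(url, params, consumer_secret)
    return hmac.compare_digest(expected, params.get("oauth_signature", ""))
```

The appeal of standardising here is exactly that this logic is boring and shared: every tool we build or adopt can plug into the VLE the same way, and we can see which launch data is flowing where.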
Finally, I’m going to end with something I wrote for Educational Technology magazine on the use of cloud services in education, because I think it sums up the core of the argument about retaining capacity and outsourcing:
Q. What risks and challenges should education institutions be aware of when adopting cloud technology?
I think the main challenge to be managed when adopting more cloud technology is to ensure that a move to outsourcing doesn’t end up depleting your own skill-base. Cloud can help take the strain when it comes to commodity IT, and it can also open up access to computing resources that would otherwise be impossible to run locally (AI, machine learning, chatbots, etc).
The trick is to look beyond the efficiency and cost savings that come with doing the first, and realise the innovation potential that comes with the second. Not achieving this will mean, in the long term, a reduction in our capacity for innovation, and probably exposing ourselves to significant cost risks through increased external dependencies.
Cloud technologies are being used to drive forward change processes in all our institutions, but the other vital thing to remember is that change is fundamentally a ‘people’ process. Turning to the cloud will allow us to overcome technical barriers and problems, and open up new technologies beyond our current capabilities – but ultimately it won’t, in and of itself, deliver change. Continuing to attend to the cultural impact of change, and ensuring that people are well supported through investment in areas like digital skills development: these processes have to go hand-in-hand with the innovation.
* We have tried to model this through activities like launching a University blogging service based on WordPress, releasing our work on LTI and WordPress as open source, and hosting a Jupyter.org community hackathon to improve the nbgrader plugin for JupyterHub.