A quick thought about LXD, OER, ChatGPT and how nothing changes

I’m scribbling a quick blog post so as not to lose a thought that’s been percolating over the last 24 hours or so.

I’ve been thinking again about learning experience design (LXD), prompted by reflecting on the online learning space and the extent to which, in an increasingly crowded environment, it’s the quality and impact of the experience that will count more and more.

This is exactly what I was working on at Athabasca University with the Integrated Learning Environment project, and with nascent work on defining a signature pedagogy and driving out a new approach to curriculum design and development. I’m still convinced that this is the kind of work that any university in the online space needs to be investing in and working on.

If you’re not familiar with Learning Experience Design, you can read more here. Succinctly though, curriculum development is approached as a design problem, with a strong emphasis on the student’s learning experience; decisions about activities, technologies, content, etc. are all driven from that perspective.

It’s a fundamentally human-centred approach, one that often results in as much change work as technical development work (because it can upset the status quo), and that is predicated on increased up-front effort in designing and iterating on improvements paying off over the longer term, across the lifetime of a course or programme.

Yesterday I read a press release from Coursera that talked about the ability to use generative AI to speed up the development of course content, and it struck me that this kind of approach stands in stark opposition to the place that learning experience design is coming from.

“AI-ASSISTED COURSE BUILDING POWERED BY GENERATIVE AI – Based on a few simple inputs from a human author, a new set of AI-powered features can auto-generate course content — such as overall course structure, readings, assignments, and glossaries — to help educators dramatically reduce the time and cost of producing high-quality content.”

4 thoughts on “A quick thought about LXD, OER, ChatGPT and how nothing changes”

  1. We’ll muddle through the vanilla production of every kind of text for some time before we get bots that can do “wild” badged as “creative” dot joining. What puzzles me is the mindset that clings to existing ways of doing things, i.e. let’s automate curriculum design. It’s unsurprising to be stuck in a horseless carriage space. The weak spot in formal education is its vulnerability to snake oil vendors (“do we have a magic bullet for you!”).

    An LLM fine-tuned to detect and point to the con in BS ed tech (i.e. most of the stuff being sold to formal ed) would be a valuable contribution to shuffling us along the path to a better place.

    1. Thanks for your comments, and sorry for the delay in replying, Chris. I agree with you absolutely in terms of just automating what we already do, but that speaks to the bigger “efficiency” mindset that seems to reign supreme. It’s harder to quantify an ROI on changes to quality, and so harder to get buy-in for them, and thus we tend towards this boring reductionist view.

      I LOVE the idea of an LLM edtech bullshit detector. It seems exactly the kind of small, fine-tuned, useful, and specific use case that lots of open implementations are having good success with. Do we need a Kickstarter campaign?

  2. I wonder if there is a middle ground? As a software engineer, I make constant use of an AI co-pilot now. It’s not writing whole projects for me, but it does finish my lines. I think this human-supervised AI-as-co-pilot is the real sweet spot, and I’m excited to think about how it could apply to learning design. For example:

    “This is a section from my lecture notes. Suggest 3 formative assessment questions, each with correct answers and at least two plausible distractors.”
    “These are the most common incorrect answers to this test question. Summarize the most common misconceptions students have.”
    “Suggest three possible peer instruction questions to pose at the start of this topic.”
    “Reformat this explanation as a worked example in the context of sustainable energy.”

    None of these needs to replace thoughtful, human-centred learning design; each could be a tool that an experienced learning designer uses and directs, with care and oversight, to instantiate ideas, run experiments, and gather feedback more efficiently (a rough sketch of the first prompt in code follows below).
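    To make the first prompt above concrete, here is a minimal sketch of how a learning designer (or a tool built for one) might send it to a hosted model. It assumes the OpenAI Python SDK and an API key in the environment; the model name and the lecture_notes placeholder are illustrative assumptions rather than a prescribed workflow, and anything generated would still need subject matter expert review.

    ```python
    # Minimal sketch: draft formative assessment questions from lecture notes.
    # Assumes the OpenAI Python SDK (openai>=1.0) and OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()  # picks up the API key from the environment

    lecture_notes = """(paste a section of lecture notes here)"""  # placeholder input

    prompt = (
        "This is a section from my lecture notes. Suggest 3 formative assessment "
        "questions, each with correct answers and at least two plausible distractors.\n\n"
        + lecture_notes
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name; any capable chat model would do
        messages=[{"role": "user", "content": prompt}],
    )

    # Draft output only: a subject matter expert still needs to check these questions.
    print(response.choices[0].message.content)
    ```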

    1. Thanks, Edward. I like the co-pilot analogy here. I tend to think of the possibilities in this space in terms of cyborg metaphors: entanglements of humans and technologies.

      The examples you pose are interesting to me, as in the main they are about tapping into the aggregated knowledge of others, and thus saving time on researching and writing these things oneself. They’d need to be checked by a subject matter expert, I would assume? And I still want to try to get past the efficiency uses (though these aren’t to be sniffed at – certainly being able to get new courses out faster and at lower cost is highly beneficial for resource-strapped institutions). What are the other possibilities in this space beyond cost savings and scale?
