A satirical AI generated image of an advert for ChatGPT 26, proclaiming "now with more fonts and better adverts". The style is cartoonish and evokes childishness.

LLMs Have Plateaued. Now We Can Finally Figure Out What to Do!

The most intimidating thing about thinking about technology, whether as an academic, a policy maker, or simply as a human being trying to imagine the next five years without collapsing into a puddle of anxiety, is the pace at which it seems to move. Emergent technology makes us feel the future as something over which we have no control, something difficult to understand and catastrophically disruptive to our society, opening the door to the unfamiliar and the dangerous. This is not false, but on its own it can only lead to emotional paralysis. For philosophy in particular, the slowest mode of thought, whose owl only flies at dusk, it feels impossible to find secure footing for analysis and critique.1 We stand instead on intellectual quicksand: the more we twist and turn, the deeper we sink. Why even bother thinking about this new technology, when whatever we come up with will be obsolete by tomorrow? ...

A comic-style drawing of a man looking exasperated at a computer screen showing the ChatGPT logo surrounded by hearts

ChatGPT: An Uncritical Friend?

ChatGPT shows promise as a helpful interlocutor for research, but its overly complimentary nature threatens its usefulness.

AI Literacies launch event poster

AI Literacies launch

On Friday I attended the launch event of the AI Literacy initiative of the Digital Society Research Group, aka DISC, at Manchester Metropolitan. I recently joined this research group and spoke briefly at the event myself. It was held in hybrid format, with the recordings posted to the DISC YouTube channel. The “headline” event was a keynote talk by Mark Carrigan. I particularly enjoyed his analysis of the current concern in HE around the subversive use of chatbots as a ‘crisis of trust’. I’d not thought about it like that before, and I think he’s quite right. ...