ChatGPT is really bad at procedural tasks

Featured image: a close-up of an airplane safety instruction card, its panels illustrating emergency procedures such as using an oxygen mask, inflating a life vest, and assuming the brace position.

In this post, I look at ChatGPT’s ability to depict procedural tasks in images. Using paper airplanes, three-fingered dwarves, origami rabbits, and the sinking Titanic, I explore the limitations of this powerful tool for creating coherent instructional images.

How wrong can it be? Analysis of ChatGPT’s capabilities in producing a scholarly reading list

I’ve just returned from the remarkable Workshop for Instruction in Library Use 2023 (WILU). These types of workshops never fail to ignite my enthusiasm and inspire my work in libraries. This year’s WILU was no exception. One session, in particular, stood out as the crowd favourite: “Imagining Instructional Practices in …