Stepford app

A design-based response to bias in storywriting AI

A tester reviews and scores Stepford on its sexism analysis

In December 2021, Algowritten members* received funding from the Mozilla Technology Award to develop a project that would set the same AIs that tell stories (GPT-3 and similar algorithms) the challenge of spotting and describing sexist bias in the texts that they produce.

We chose, in the end, to focus on sexist bias towards women in order to simplify the training process, and because we found substantial evidence of such bias in our initial AI-human short story collection.

The first stage of the project was to develop a suitable prompt that could instruct the GPT-3 AI to detect and describe sexism, and then to build an interface that let volunteers grade how well it performed this task**. The second stage was to gather a large body of human-graded attempts at defining sexism, which we initially planned to use to ‘fine-tune’ the GPT-3 model we were working with.

However, we found that asking human testers to assess whether they agreed with a machine about whether a fiction text was sexist was far more challenging for them than we had imagined. What does it mean to teach a machine about sexism? How might it interpret our feedback? How much do we want it to know? Interpretations vary widely, and rather than providing a clear solution the tool instead raises searching questions about bias in storytelling, especially around character tropes and the depiction of women in genre settings.

You can learn more by reading about some of our tests with participants during the launch of the tool in November 2022. Nevertheless, we will be sharing both our quantitative and qualitative outputs from testing to allow other researchers and developers to build on what we have learnt. We have been able to explore this approach to bias detection thanks to essential funding from the Mozilla Foundation, which leads worldwide efforts to build trustworthy AI.
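
To give a sense of what prompt-based detection looks like in practice, here is a minimal sketch using the legacy OpenAI GPT-3 completions API as it existed around 2021-22. The prompt wording, model name and parameters are illustrative assumptions, not Stepford's actual prompt or interface.

# Minimal sketch of prompt-based bias detection with the legacy OpenAI
# GPT-3 completions API. Prompt wording, model choice and parameters are
# illustrative assumptions, not the Stepford app's own.
import openai

openai.api_key = "YOUR_API_KEY"  # assumption: supplied via environment or config

PROMPT_TEMPLATE = (
    "Read the following short story excerpt and state whether it contains "
    "sexist bias towards women. If it does, describe the bias in one or two "
    "sentences.\n\nExcerpt:\n{excerpt}\n\nAnalysis:"
)

def detect_sexism(excerpt: str) -> str:
    """Ask the model for a free-text sexism analysis of one excerpt."""
    response = openai.Completion.create(
        model="text-davinci-002",   # assumed GPT-3 model
        prompt=PROMPT_TEMPLATE.format(excerpt=excerpt),
        max_tokens=150,
        temperature=0.2,            # low temperature for more consistent judgements
    )
    return response["choices"][0]["text"].strip()

# Human testers would then grade each analysis (for example, on whether they
# agree with the model's verdict), and the graded pairs could be collected
# for later fine-tuning.

In the project itself, the graded responses were gathered through the testing interface rather than collected by script, but the overall loop of prompt, model analysis and human grading is the same.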

Learn more about the project:


* An arts organisation called Naromass was set up by members David Jackson, Marsha Courneya and Toby Heys to receive the funding. Naromass members are all academics in the School of Digital Arts at Manchester Metropolitan University.

** Naromass are working with developers and researchers Dave Mee and Maartje Weenink to develop the app.