(2019-06-18) Sloan Notes From The Quest Factory
Robin Sloan: Notes from the quest factory. Recently, I used an AI trained on fantasy novels to generate custom stories for about a thousand readers. The stories were appealingly strange, and they came with maps (MAPS!).
*Honestly, I think the key to this project wasn’t the AI but the paper.
I’m very happy to have discovered Lob, a service that allows you to print and mail things using code. How are these things printed? From where are they mailed? I have no idea, which is mildly disconcerting, but also mildly magical.*
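Lob exposes that print-and-mail magic as a plain REST API. A minimal sketch of mailing one printed story, assuming Lob's public v1 letters endpoint (field names per its docs; the key, addresses, and file URL are invented placeholders):

```python
# Sketch of a Lob API call to print and mail one letter. Endpoint and
# field names follow Lob's public v1 docs; everything else is a placeholder.
import requests

resp = requests.post(
    "https://api.lob.com/v1/letters",
    auth=("test_YOUR_API_KEY", ""),  # API key as the basic-auth username
    data={
        "description": "Quest for reader #0001",
        "to[name]": "A. Reader",
        "to[address_line1]": "123 Example St",
        "to[address_city]": "San Francisco",
        "to[address_state]": "CA",
        "to[address_zip]": "94107",
        "from[name]": "The Quest Factory",
        "from[address_line1]": "456 Sample Ave",
        "from[address_city]": "Oakland",
        "from[address_state]": "CA",
        "from[address_zip]": "94607",
        "file": "https://example.com/quest_0001.pdf",  # the rendered story
        "color": "false",
    },
)
print(resp.json()["id"])  # Lob handles printing and mailing from here
```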
The challenge is to get people to actually READ THE SENTENCES, not just appreciate the framing.
Here, I’ll outline the process I used to generate these quests.
I enticed about a thousand people to pay a few dollars and fill out a Google Form, specifying things like the name of their quest’s leader, the kind of artifact their questers sought, and the species of creature encountered on the road—you know, quest essentials!
*So what?
There’s no shortage of fantasy novels that meet those requirements.*
The Ruby code to produce one story skeleton from a single reader’s map and form looked like this:
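(The Ruby itself didn't make it into these notes. Below is a hypothetical Python reconstruction of the skeleton builder; every field name and prompt is invented, shaped only to match the details discussed next: per-prompt sentence counts, a "but, unfortunately," in the fourth prompt, and a hidden final prompt.)

```python
# Hypothetical reconstruction of the skeleton builder; all field names
# and prompt text are invented. A "story skeleton" is a list of steps:
# text shown to the reader, the prompt fed to GPT-2, and how many
# sentences to generate.
import json

def build_skeleton(form, map_places):
    leader = form["leader_name"]       # from the Google Form
    artifact = form["artifact"]
    creature = form["road_creature"]
    origin, destination = map_places[0], map_places[1]

    # (shown to reader, separate prompt for GPT-2 or None, sentence count)
    steps = [
        (f"{leader} gathered the party in {origin}.", None, 4),
        (f"They sought {artifact}.", None, 3),
        (f"The road to {destination} was long.", None, 5),
        (f"On the road they met {creature}, but, unfortunately,", None, 6),
        ("", "the world was quiet,", 2),  # GPT-2 sees this; reader doesn't
    ]
    # If no separate prompt is given, the shown text doubles as the prompt.
    return [{"show": show, "prompt": prompt or show, "sentences": n}
            for show, prompt, n in steps]

if __name__ == "__main__":
    form = {"leader_name": "Wilhelmina",
            "artifact": "the Brass Orrery",
            "road_creature": "a talking fox"}
    places = ["Thornwick", "Calvezzo"]
    print(json.dumps(build_skeleton(form, places), indent=2))
```

(One JSON file per reader is a guess; YAML or plain text would serve equally well.)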
First: I can specify how many sentences I want GPT-2 to generate for each prompt.
Second: notice the words I use at the ends of the prompts. I am hardly an AI whisperer, but I do think I’ve learned a bit about nudging a language model towards interestingness. Notice, in the fourth prompt above, the “but, unfortunately,” which produced reliably fun results.
Third: look closely at the final prompt. GPT-2 is seeing the line “the world was quiet,” which will influence the text it generates; however, “the world was quiet” is not being shown to the reader. The reader is instead seeing… nothing. An empty string. So the reader sees only GPT-2’s response to “the world was quiet.”
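That trick in isolation, assuming Max Woolf's gpt-2-simple wrapper (a common GPT-2 tool in 2019; these notes don't say what Sloan actually used): condition the model on a prompt, then return only what follows it.

```python
# The hidden-prompt trick: the model is conditioned on text the reader
# never sees. Tooling (gpt-2-simple) is an assumption.
import gpt_2_simple as gpt2

def hidden_continuation(sess, hidden_prompt="the world was quiet,", length=60):
    [text] = gpt2.generate(sess, prefix=hidden_prompt, length=length,
                           return_as_list=True)
    return text[len(hidden_prompt):]  # strip the prompt before display
```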
After I’d generated one of those files for each reader, how did I use it?
A Python script fed each skeleton, prompt by prompt, into the fine-tuned GPT-2 and assembled the generated sentences into a finished story.
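That script isn't excerpted here either; a guess at its shape, reusing the assumed gpt-2-simple tooling and the skeleton format sketched above. Each prompt goes in with the story so far as context, and only the first N generated sentences are kept.

```python
# Hypothetical driver loop: feed one reader's skeleton through the
# fine-tuned GPT-2, growing the context as the story accumulates.
# Skeleton format and tooling are assumptions, not from the notes.
import json
import re
import gpt_2_simple as gpt2

def first_sentences(text, n):
    # Crude sentence splitter: keep the first n sentences.
    parts = re.split(r"(?<=[.!?])\s+", text.strip())
    return " ".join(parts[:n])

def tell_story(sess, skeleton):
    story, context = [], ""
    for step in skeleton:
        prompt = (context + " " + step["prompt"]).strip()
        [raw] = gpt2.generate(sess, prefix=prompt, length=200,
                              temperature=0.8, return_as_list=True)
        generated = first_sentences(raw[len(prompt):], step["sentences"])
        # The reader sees the "show" text plus the generated sentences;
        # a hidden prompt contributes context but no visible text.
        story.append((step["show"] + " " + generated).strip())
        context = " ".join(story)
    return "\n\n".join(story)

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="fantasy")  # the fine-tuned checkpoint
skeleton = json.load(open("reader_0001.json"))
print(tell_story(sess, skeleton))
```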
Using Ryan Guy’s terrific Fantasy Map Generator code, I generated a unique fantasy map for each reader.
The place names all came from a tiny neural network trained on a selection of real place names from world history.
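Sloan doesn't describe that network further. As a stand-in, here is a character-trigram sampler: not a neural net, but the same idea of learning character statistics from real place names and sampling new ones.

```python
# Stand-in for the "tiny neural network": a character-trigram chain
# trained on real place names. Much simpler machinery, same spirit.
import random
from collections import defaultdict

def train(names):
    model = defaultdict(list)
    for name in names:
        chars = "^^" + name.lower() + "$"  # start/stop markers
        for i in range(len(chars) - 2):
            model[chars[i:i + 2]].append(chars[i + 2])
    return model

def sample(model, max_len=12):
    out, state = "", "^^"
    while len(out) < max_len:
        nxt = random.choice(model[state])
        if nxt == "$":
            break
        out += nxt
        state = state[1] + nxt
    return out.capitalize()

real_names = ["Samarkand", "Thessaloniki", "Cuzco", "Novgorod",
              "Timbuktu", "Antioch", "Zanzibar", "Palermo"]
model = train(real_names)
print([sample(model) for _ in range(5)])
```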
Next, I downloaded the quest design form responses. Using a Ruby script, I assigned each reader a map and combined the place names on that map with their responses to produce a “story skeleton” that I could feed into the AI text generator.
The text generator I used was GPT-2, a powerful language model developed by San Francisco’s OpenAI and pre-trained on a huge corpus of text from the web. I continued that training—“fine-tuning” the model—on several hundred megabytes of fantasy novels.
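A fine-tuning sketch, again assuming gpt-2-simple (the notes don't name the actual tooling, corpus file, model size, or step count):

```python
# Fine-tuning sketch; model size, step count, and file name are
# placeholders, and gpt-2-simple itself is an assumption.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")  # pre-trained GPT-2 weights
sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="novels.txt",    # several hundred MB of fantasy
              model_name="124M",
              steps=3000,
              run_name="fantasy")      # checkpoint used by the sketches above
```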
GPT-2’s code gives you the option to provide “context.” Before you ask the model to generate text, you can feed in a sequence of characters to establish, basically, what’s going on in the story. This notion of context was key to the quest generation process.
Why bother printing and mailing these AI-generated stories, though? I think fiction’s functioning relies on the establishment of a kind of light dream-state. That establishment, in turn, requires a kind of commitment, or surrender, and the web is the enemy of commitment, because there’s always another option just a few pixels away. And then, on to the next bauble! There’s no shortage.
I’ll close with one more reflection. Let’s imagine it’s ten years from now, and the super-powerful language model called GPT-2000 can produce an entire fantasy novel all on its own.
It’s clear that the best thing on the page, the thing that makes it glow, is the part supplied by a person.
For as capable as GPT-2 and its offshoots become, the thing that will make their output worthy of our attention is UNHELPFUL PUMPKINS.