Janelle Shane @JanelleCShane I blog at aiweirdness.com/. My book "You Look Like a Thing and I Love You" is out now! Research Scientist in optics. she/her. wandering.shop/@janellecshane Nov. 19, 2019 2 min read

Struggling with crafting the first sentence of your novel?

Be comforted by the fact that AI is struggling even more.


Last year I trained torch-rnn on 10,096 unique first lines of novels, all contributed by fans of AI Weirdness.

The neural net struggled to make sense for longer than a few words at a time, although it did produce some gems.


In just a year, neural nets have become powerful enough to generate consistently grammatical sentences. Once fine-tuned on the dataset of novel first lines, GPT-2 can generate readable sentences.
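(The exported filenames below mention "temperature0p8" — the sampling temperature. As a minimal sketch of how temperature sampling works in general, not Shane's actual code: the model's raw scores are divided by the temperature before being turned into probabilities, so low temperatures make the likeliest word dominate and high temperatures flatten the distribution toward weirdness.)

```python
import math
import random

def sample_with_temperature(logits, temperature=0.8, rng=None):
    """Sample a token index from raw model scores after temperature scaling.

    Lower temperature -> sharper distribution (more predictable text);
    higher temperature -> flatter distribution (weirder text).
    """
    rng = rng or random.Random()
    # Scale the logits by the temperature.
    scaled = [score / temperature for score in logits]
    # Softmax (subtracting the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1
```

At temperature 0.8, slightly below 1.0, the model stays mostly coherent while still taking occasional odd turns — which is roughly the register of the first lines in this thread.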

They don't necessarily make SENSE, but the rhythm is spot on.

Ah, you refer to my FIRST attempt last year, which is best forgotten.

Today's GPT-2 neural net gets the feel and rhythm of a story's first line. It also inserts surrealist narrative hooks, although this is most likely because it is trying to be mundane and making mistakes, since it doesn't understand what it's saying.

Sometimes the neural net's uncomprehending surrealism ventures out of "interesting narrative hook" territory and into "failed simile" territory.

Ah yes, I see you have encountered the bonus material. Fun fact: GPT-2 can't quite believe that the sentences are all supposed to be independent, so it can be induced to follow certain themes.

The disadvantage of a neural net that can string together a grammatical sentence is that its writing can now begin to be terrible in a more human sense, rather than merely incomprehensible.

The neural net is particularly prone to a brand of awfulness that seems to stem from wordy Victorian prose.

Grammatically impeccable, utterly unreadable.

The neural net was originally trained on internet writing, so it doesn't quite believe that the first lines are independent.

The happy result is that I can nudge it in certain directions with a Harry Potter prompt.
(3,901 more examples exported here: https://github.com/janelleshane/novel-first-lines-dataset/blob/master/iteration150_temperature0p8_potter.txt)

My Little Pony is one of the most persistent genres: prompt the neural net with "Twilight Sparkle was out of cupcakes." and it will respond with dozens more MLP story openings.

They do end up being a bit grimmer, though. (4,284 more examples here: https://github.com/janelleshane/novel-first-lines-dataset/blob/master/iteration150_temperature0p8_ponies.txt)

You would think that prompting the neural net with "It is a truth universally acknowledged" would send it to new heights of Victorian wordiness, but instead it mostly gets scarier.

(2,822 more examples here: https://github.com/janelleshane/novel-first-lines-dataset/blob/master/iteration150_temperature0p8_victorian.txt)

I tried leaning into the scariness by prompting with "It is a terrible, terrible idea, even if entirely accidental, to talk to one of the Ancient Ones." and oh god did it ever work.

(3,574 more examples here: https://github.com/janelleshane/novel-first-lines-dataset/blob/master/iteration150_temperature0p8_ancient.txt)

There are so many gems in the raw neural net output here: https://github.com/janelleshane/novel-first-lines-dataset

Looking forward to seeing what others discover in there.
