François Chollet @fchollet · May 16, 2019 · 1 min read
Deep learning @google. Creator of Keras, neural networks library. Author of 'Deep Learning with Python'. Opinions are my own.

Based on the observation that the GPT-2 medium-size model has memorized (and can spit back word-for-word) very long extracts from the web, such as the Gorilla Warfare meme, I had an idea for a very simple ML-less text generation algorithm. I spent the past 20 min implementing it.

My algo is to make search queries for the keywords in a prompt, plus the exact sequence of the last words in the prompt (trying different numbers of words to get at least one match), then stitch together result snippets, using the last words as a continuity pivot. It works decently!
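To make that loop concrete, here is a minimal sketch in Python. It is not the original script: the DuckDuckGo HTML endpoint and the result__snippet selector are assumptions about third-party page markup, and the helper names (search_snippets, continue_text) are illustrative.

# A minimal sketch of the stitching idea, not the original script.
# The search endpoint URL and the "result__snippet" CSS class are
# assumptions about DuckDuckGo's HTML markup and may need adjusting.
import re
import requests
from bs4 import BeautifulSoup

def search_snippets(query):
    """Return result snippets for an exact-phrase query (assumed markup)."""
    resp = requests.get(
        "https://html.duckduckgo.com/html/",
        params={"q": f'"{query}"'},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=10,
    )
    soup = BeautifulSoup(resp.text, "html.parser")
    return [el.get_text(" ", strip=True)
            for el in soup.select(".result__snippet")]

def continue_text(prompt, steps=5, max_pivot=8):
    """Repeatedly extend the prompt by stitching search snippets together."""
    text = prompt
    for _ in range(steps):
        words = text.split()
        # Try progressively shorter trailing word sequences until some
        # snippet contains the pivot verbatim.
        for n in range(min(max_pivot, len(words)), 2, -1):
            pivot = " ".join(words[-n:])
            for snippet in search_snippets(pivot):
                # Find the pivot in the snippet and append what follows it.
                m = re.search(re.escape(pivot) + r"\s+(\S.*)", snippet,
                              re.IGNORECASE)
                if m:
                    text += " " + m.group(1)
                    break
            else:
                continue  # no snippet matched this pivot; try a shorter one
            break
        else:
            break  # no pivot length matched; stop extending
    return text

print(continue_text("In a shocking finding, scientists discovered "
                    "a herd of unicorns living in"))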

For the unicorn prompt, it just spits back the GPT-2 result (stitched together from multiple news sites!). Same for the Gorilla Warfare meme. For less popular prompts, it gets more creative, combining sentences or sub-sentences from multiple related sources.

I will not be releasing the code, because you guys couldn't handle the power of a Python script cobbled together in 20 minutes with Requests, BeautifulSoup, and regular expressions. It would change algorithmic cyberwar forever.

