So, you wrote a novel.
Hey, so did I!
And you know what? So did at least fifty thousand other folks — just this year alone.
(That number might be as high as 100k, or as low as 30k, but 50k is a conservative figure to work with. And I’m going to oversimplify here, as I do throughout this blog, but the general premise is worth exploring.)
50,000 authors who are looking to get published.
50,000 authors who want YOU to read their story.
50,000 authors, 90% of whom will never see the inside of a bookstore.
Most authors will self-publish, because that’s the only way to get their creation onto those warm white printed pages. (www.blurb.com)
But *all* authors will probably try to get their work submitted to a publisher. Which means 50,000 queries and manuscripts that need to be analyzed — by humans. Let’s stand back and look at this as an information-processing problem. Literary agents and publishers are sifting through tens of thousands of novels, the haystack, to find the ten or fifty needles that thread readers onto a string of emotional attachment: stories that will win awards, climb the ratings, and hopefully pay for themselves and all the wannabe novels that flop.
That’s a shit ton of critical analysis reading to do — accurately and quickly.
Enter, stage left, DeepMind.
Google’s DeepMind neural-network platform is designed to take data, any data, and learn from it. Want DeepMind to find cats in a picture? Spot terrorist threats in email? Learn to mimic the human voice, or parse and replicate Shakespeare? First you need to train it. Engineers take a training set, say the 500 most popular novels of the 20th century and the 500 worst, and feed them to DeepMind. DM eats them like lollipops, licking and linking all the nuance of language, cadence, sentence structure, word selection, and scene usage from these novels. It doesn’t “understand” them, but it doesn’t need to. It just has to learn: that’s good, and that’s not.
Then you take the next 100 best sellers and the next worst 100, and test the dynamically constructed model built from the original training. Editors would stand by to give the neural network hints and advice on the edge cases DeepMind would miss. Eventually, the Automated Literary Analysis Neural Network (ALANN) could be opened to the public. Budding authors like you and me could submit our full manuscripts (no queries or synopsis nonsense), and ALANN would swallow up our words and spit back a thumbs up or down, and maybe a critique of what needs improvement.
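The train-then-classify loop described above can be sketched in miniature. This is a toy stand-in only: a naive Bayes bag-of-words classifier, nothing like DeepMind’s actual (far richer) models, and the snippets, labels, and function names are all invented placeholders, not real training data:

```python
import math
from collections import Counter

def tokenize(text):
    # Crude tokenizer: lowercase and split on whitespace.
    return text.lower().split()

def train(labeled_texts):
    """Count words per label for a naive Bayes 'good vs. bad' model."""
    word_counts = {}        # label -> Counter of word frequencies
    doc_counts = Counter()  # label -> number of training documents
    vocab = set()
    for text, label in labeled_texts:
        doc_counts[label] += 1
        counts = word_counts.setdefault(label, Counter())
        for w in tokenize(text):
            counts[w] += 1
            vocab.add(w)
    return word_counts, doc_counts, vocab

def classify(text, word_counts, doc_counts, vocab):
    """Return the label with the highest log-probability score."""
    total_docs = sum(doc_counts.values())
    best_label, best_score = None, float("-inf")
    for label, counts in word_counts.items():
        # log prior + log likelihood with add-one smoothing
        score = math.log(doc_counts[label] / total_docs)
        total_words = sum(counts.values())
        for w in tokenize(text):
            score += math.log((counts[w] + 1) / (total_words + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented placeholder "manuscripts" standing in for the 500 best/worst novels.
training_set = [
    ("the prose sings and the characters breathe", "good"),
    ("a taut plot with vivid sensory detail", "good"),
    ("the plot meanders and the dialogue is flat", "bad"),
    ("clumsy pacing and flat cardboard characters", "bad"),
]
model = train(training_set)
print(classify("vivid characters and a plot that sings", *model))  # → good
```

The held-out test pass in the paragraph above is the same idea: run `classify` over the next 100 best and worst sellers, and have editors correct the model where it misfires.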
ALANN wouldn’t be foolproof. But statistically, it would easily narrow the haystack to the needles worth closer inspection. ALANN would house manuscripts for years, waiting for trends to return. ALANN would be the one-stop shop for finding material for publication.
The idea is not new. Ten years ago the concept was put to work for screenplays.
Imagine how well a DeepMind ALANN would perform today. If this doesn’t exist yet, it will, soon. Billions of dollars ride on discovering the next Twilight, Hunger Games, or Harry Potter, the next literary phenomenon.