An article at Harvard Business Review takes an interesting look at “seven things human editors do that algorithms don’t (yet).” They boil down to: anticipation, risk-taking, the whole picture, pairing, social importance, mind-blowingness, and trust.
Clearly there’s still room for humans on the web. In search, that’s good news for Blekko, which brings the old human-edited approach back into the mix of an industry the algorithm has largely dominated for the last decade. The jury’s still out, though, on whether it will ever be as effective as Google.
In terms of content creation, we’ve already seen the beginnings of what the algorithm can do. Look at Demand Media’s business model (at least for the content portion) – it’s largely algorithm-based, though it still uses humans to write and edit the content.
The future content farm may be a different story though. We’ve also seen content created with no human intervention at all. Look at what Narrative Science is doing. The company, run by a former DoubleClick executive, describes itself in the following manner:
“We tell the story behind the data. Our technology identifies trends and angles within large data sources and automatically creates compelling copy. We can build upon stories, providing deeper context around particular subjects over time. Every story is generated entirely from scratch and is always unique. Our technology can be applied to a broad range of content categories and we’re branching into new areas every day.”
Look at what IBM has been able to accomplish through machine learning with its robot Watson. How long until a bunch of Watsons are creating content for the web (and creating other Watsons, for that matter)?
The good news is it might still be a while before robots replace us all. Returning to the points made in the Harvard Business Review piece, its author, Eli Pariser, notes that algorithms aren’t yet as good as humans at anticipating future news.
As far as risk-taking, Pariser writes: “Chris Dixon, the co-founder of personalization site Hunch, calls this ‘the Chipotle problem.’ As it turns out, if you are designing a where-to-eat recommendation algorithm, it’s hard to avoid sending most people to Chipotle most of the time. People like Chipotle, there are lots of them around, and while it never blows anyone’s mind, it’s a consistent three-to-four-star experience. Because of the way many personalization and recommendation algorithms are designed, they’ll tend to be conservative in this way — those five-star experiences are harder to predict, and they sometimes end up being one-star ones. Yet, of course, they’re the experiences we remember.”
Would you trust content created by algorithms, or do you put your trust in humans?