Real-time discovery of quality content is becoming more important because pre-scheduling sharing of older content doesn’t have the same impact it once did. The field of real-time content analysis is evolving quickly. The chart above illustrates the sharing timeline and identifies the discovery hotspot that these 5 tools help identify. Quality content discovered and shared in the hotspot is more likely to be viewed by your followers and network as a news breaker as opposed to a news follower - which can increase your online authority, establish you as a trend spotter and help give your business an information edge.

The Innovative Marketer’s Blog by Kevin Jorgensen on Tue, Sep 11, 2012

Now that content curation has become mainstream and so many businesses are focused on contextualizing your actions, it's their turn to flourish. Once the basis of human intent has been established, you can pretty well guess what an individual's next action will be. This conjures dreams of better advertising, properly targeted marketing, improved product recommendations and personalized data streams, all given back to the consumer. It's an entire service built around a data model that understands you better than you understand yourself.
“If we gaze a bit further down the road, it’s reasonable to expect that content curators will employ the use of personalized and artificial intelligence technologies. Whether it’s a site with 20 million uniques, or one with 10 thousand, these tools will have the ability to learn and make assumptions about readers across publishers’ properties, and across the internet landscape,” says Jarret Myer, CEO and Founder of UPROXX.
“These tools will have the ability to learn and make assumptions about groups of users, with the intent to serve the most desired content based on crowd sentiment. It’s also possible to expect there won’t be a need for editors to curate and upload content to initiate conversation; personalization technologies will automate the process of curation by identifying topics of interest and uploading them in real-time.”
The companies that are truly winning over audiences and driving consumers are the ones that are experimenting with a balance of automated aggregation and human-directed curation. It’s a process of out-sourcing and in-sourcing.

Fast Company, The Content Conundrum: To Create Or Automate, 07-05-2012

(Source: contentcurationmarketing.com)

Prismatic is built around a complex system that provides large-scale, real-time, dynamic personalized re-ranking of information, as well as classifying and grouping topics into an ontology. Prismatic has recently announced new improvements, too, so expect the team to continue to push the limits of news consumption. This discussion should interest any online or social news junkies (like me) and/or developers who are toying with ideas for building on top of Twitter.
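To make the idea of personalized re-ranking concrete, here is a minimal sketch; it is not Prismatic's actual system (whose internals aren't public), and the topic labels, weights, and sample stories are invented for illustration. It scores each story by how closely its topic mix matches a reader's interest profile and re-orders the feed accordingly.

```python
# Minimal sketch of personalized feed re-ranking (illustrative only,
# not Prismatic's pipeline). Stories carry topic weights from some
# hypothetical classifier; the reader's profile uses the same topic
# vocabulary, and stories are re-ordered by similarity to it.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two sparse topic-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = sqrt(sum(w * w for w in a.values()))
    nb = sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rerank(stories, profile):
    """Return stories sorted by fit with the reader's interest profile."""
    return sorted(stories, key=lambda s: cosine(s["topics"], profile), reverse=True)

# Hypothetical data: topic labels and weights are made up.
reader_profile = {"startups": 0.8, "nlp": 0.6, "politics": 0.1}
feed = [
    {"title": "New NLP toolkit released", "topics": {"nlp": 0.9, "startups": 0.3}},
    {"title": "Election season heats up", "topics": {"politics": 0.9}},
    {"title": "Seed funding roundup",     "topics": {"startups": 0.8}},
]

for story in rerank(feed, reader_profile):
    print(story["title"])
```

A real system would learn the profile from reading behavior and update it continuously; the sketch only shows the re-ranking step itself.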
Although Thirst is starting out on the Twitter platform, the company is really more about natural language processing technology. The Twitter iPad app is more of a proof of concept around whether its NLP processor works well. Verma says that it’s really difficult to keep up with information shared through Twitter and there has to be a better way of surfacing the most important news. Thirst uses a custom natural language processor to pick out the most important stories around different keywords or subjects like ‘gay marriage’ (because of this past week’s big announcement from President Barack Obama in support of it).
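As a rough illustration of the kind of keyword-driven surfacing Thirst describes: its NLP stack isn't public, so the sketch below is just a simple frequency-and-engagement heuristic over hypothetical tweet data, with made-up fields and scoring weights.

```python
# Rough sketch of surfacing "important" stories around a keyword from a
# tweet stream. This is a crude heuristic, not Thirst's actual NLP; the
# tweet fields and scoring weights are illustrative assumptions.

def surface_stories(tweets, keyword, top_n=3):
    """Rank tweets mentioning the keyword by a simple importance score."""
    matches = [t for t in tweets if keyword.lower() in t["text"].lower()]
    # "Importance" here is just retweet count plus a bonus for linked articles.
    def score(t):
        return t["retweets"] + (10 if t["has_link"] else 0)
    return sorted(matches, key=score, reverse=True)[:top_n]

# Hypothetical tweet data for illustration.
stream = [
    {"text": "Obama announces support for gay marriage", "retweets": 420, "has_link": True},
    {"text": "gay marriage trending on every network tonight", "retweets": 87, "has_link": False},
    {"text": "unrelated lunch tweet", "retweets": 2, "has_link": False},
]

for tweet in surface_stories(stream, "gay marriage"):
    print(tweet["retweets"], tweet["text"])
```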
I certainly send the Twitter engine enough signals — tweets, retweets, follows, followers, and location data — to help determine what is important to me. So far, though, I have to believe that Twitter’s Discovery engine has mistaken me for another user.

Ongo to close doors

We believe in Ongo's unique approach, blending aggregated news sources with the curation only a professional editorial staff can provide. Unfortunately, we may be too far ahead of the general market's readiness to adopt such a solution, and business realities leave us no option but to close our doors. (Ongo announcement to subscribers)

But it’s important to note that Ongo drew skepticism from media watchers from the start. Its business model rested on the belief that people would value an ad-free, curated experience enough to pay for it, despite both the availability of the (mostly) free web and other free apps like Flipboard.

"We think we have a role in bringing the external perspective into the employee base" says Edwards. That makes IBM a curator as much as a communicator.
Machines can learn what memes will go viral, but only humans can aggregate the content effectively to create a truly social Web experience, according to Jonah Peretti, the co-founder of BuzzFeed, a meme-focused aggregator site.
That is largely where we find ourselves in the journalism conversation of 2012, with a dreary roll call of depressive statistics invariably from the behemoth’s point of view: newspaper job losses, ad-spending cutbacks, shuttered bureaus, plummeting stock prices, major-media bankruptcies. Never has there been more journalism produced or consumed, never has it been easier to find or create or curate news items, and yet this moment is being portrayed by self-interested insiders as a tale of decline and despair.
Matt Welch, When Losers Write History, April 8, 2012