Virginia Backaitis of CMSWire recently discussed the rise of Big Content and the role semantic technologies are playing in mining value from that content. She writes, “The first time I heard the term Big Content, I thought ‘Oh brother.’ At the time, the people who used the term ‘Big Content’ either talked about it in terms of content that went beyond their control (i.e. viral) and said that this was a highly desirable thing OR they argued that ‘content’ was growing at a rapid clip, and that Big Data technologies, such as Hadoop, were incapable of dealing with voluminous unstructured data — that you needed something called ‘Big Content’ to do that.”

She continues, “The latter might have made for an interesting argument, save one problem; Big Data technologies, and Hadoop in particular, can handle unstructured data just fine… Last October Gartner started talking about ‘Big Content.’ Here’s part of what Craig Roth, vice president and service director for Gartner Research, in Burton Group’s Collaboration and Content Strategies service, had to say in a blog post: ‘Big Data has much to offer to folks who are turned off by the word ‘data’ and may pay more attention to its potential value if a subset of its techniques are thought of as Big Content. Just as Big Data uses Apache Hadoop (with MapReduce) to go beyond traditional BI, Big Content combines technologies to go beyond traditional search. These technologies are applied to text analytics, sentiment analysis, video analysis, semantic web technologies, and attention management. For industries that care more about what people are saying rather than what meters are measuring, Big Content will become a big deal’.”
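For readers curious what Roth's comparison looks like in practice, here is a minimal, purely illustrative sketch of a Hadoop Streaming-style mapper and reducer in Python that tallies crude sentiment cues in raw, unstructured text. The word lists, function names, and overall approach are assumptions made for illustration; they are not drawn from the article, from Gartner, or from any particular product.

```python
#!/usr/bin/env python3
"""Illustrative sketch only: a MapReduce-style pass over unstructured text
that counts simple positive and negative terms, in the spirit of the
sentiment analysis Roth mentions. Word lists here are hypothetical."""
import sys

POSITIVE = {"great", "love", "excellent", "good"}   # toy lexicon, not a real one
NEGATIVE = {"bad", "terrible", "hate", "poor"}

def mapper(lines):
    """Emit (sentiment, 1) pairs for each matching token in raw text lines."""
    for line in lines:
        for token in line.lower().split():
            word = token.strip(".,!?\"'")
            if word in POSITIVE:
                yield "positive", 1
            elif word in NEGATIVE:
                yield "negative", 1

def reducer(pairs):
    """Sum counts per sentiment key, as a reduce step would."""
    totals = {}
    for key, count in pairs:
        totals[key] = totals.get(key, 0) + count
    return totals

if __name__ == "__main__":
    # Reads free-form text from stdin, e.g.:  cat reviews.txt | python3 sentiment_count.py
    print(reducer(mapper(sys.stdin)))
```

In a real Hadoop deployment the mapper and reducer would run as separate streaming jobs across many nodes; the point of the sketch is simply that unstructured text poses no special obstacle to this style of processing.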

Read more here.

Image: Courtesy Flickr/fsse8info