Ron Miller of TechCrunch reports, “IBM today announced a new product called Watson Analytics, one they claim will bring sophisticated big data analysis to the average business user. Watson Analytics is a cloud application that does all of the heavy lifting related to big data processing by retrieving the data, analyzing it, cleaning it, building sophisticated visualizations and offering an environment for communicating and collaborating around the data. And lest you think that IBM is just slapping on the Watson label because it’s a well-known brand (as I did), Eric Sall, VP of worldwide marketing for business analytics at IBM, says that’s not the case. The technology underlying the product, including the ability to process natural language queries, is built on Watson technology.” Read more
Big Data has been getting its fair share of commentary over the last couple of months. Surveys from multiple sources have charted trends and expectations. The Semantic Web Blog provides some highlights here:
- From Accenture Analytics’s new Big Success With Big Data report: There remain some gaps in what constitutes Big Data for respondents to its survey: Just 43 percent, for instance, classify unstructured data as part of the package. That option included open text, video and voice. Those are gaps that could be filled leveraging technologies such as machine learning, speech recognition and natural language understanding, but they won’t be unless executives make these sources a focus of Big Data initiatives to start with.
- From Teradata’s new survey on Big Data Analytics in the UK, France and Germany: Close to 50 percent of respondents in the latter two countries are using three or more data types (from sources ranging from social media, to video, to web blogs, to call center notes, to audio files and the Internet of Things) in their efforts, compared to just 20 percent in the UK. A much higher percentage of UK businesses (51 percent) are currently using just a single type of new data, such as video data, compared with France and Germany, where only 21 percent are limiting themselves to one type of new data, it notes. Forty-four percent of execs in Germany and 35 percent in France point to social media as the source of the new data. About one-third of respondents in each of those countries are investigating video, as well.
Apigee wants the development community to be able to seamlessly take advantage of predictive analytics in their applications.
“One of the biggest things we want to ensure is that the development community gets comfortable with powering their apps with data and insights,” says Anant Jhingran, Apigee’s VP of products and formerly CTO for IBM’s information management and data division. “That is the next wave that we see.”
Apigee wants to help them ride that wave, enabling their business to better deal with customer issues, from engagement to churn, in more personal and contextual ways. “We are in the business of helping customers take their digital assets and expose them through clean APIs so that these APIs can power the next generation of applications,” says Jhingran. But in thinking through that core business, “we realized the interactions happening through the APIs represent very powerful signals. Those signals, when combined with other contextual data that may be in the enterprise, enable some very deep insights into what is really happening in these channels.”
With today’s announcement of a new version of its Apigee Insights big data platform, all those signals generated – through API and web channels, call centers, and more – can come together in the service of predictive analytics for developers to leverage.
What best practices should inform your company’s text analytics initiatives? Executive Lessons on Modern Text Analytics, a new white paper prepared by Geoff Whiting, principal at GWhiting.com, and Alesia Siuchykava, project director at Data Driven Business, provides some insight. Contributors to the lessons shared in the report include Ramkumar Ravichandran, Director, Analytics, at Visa, and Matthew P.T. Ruttley, Manager of Data Science at Mozilla Corp.
One of the interesting points made in the paper is that text analytics can be applied to many use cases: customer satisfaction and management effectiveness, product design insights, and enhancing predictive data modeling as well as other data processes. But at the same time, a takeaway is that it is better for text analytics teams to follow a narrow path than to try to accommodate a wide-ranging deployment. “All big data initiatives, and especially initial text analytics, need a specific strategy,” the writers note, preferably focusing on “low-hanging fruit through simple business problems and use cases where text analytics can provide a small but fast ROI.”
Customer experience management vendor Clarabridge wants to bring the first-person narrative from call center interactions to life for marketing analysts, customer care managers, call center leaders and other customer-focused enterprise execs. With its just-released Clarabridge Speech, it now delivers via the cloud a solution that integrates Voci Technologies’ speech recognition smarts with its own capabilities for using NLP to analyze and categorize text, sentiment and emotion in surveys, social media, chat sessions, emails and call center agents’ own notes.
Agent notes certainly are helpful when it comes to assessing whether customers are having negative experiences and whether their loyalty is at stake, among other concerns. But, points out Clarabridge CEO Sid Banerjee, “an agent almost never types word for word what the customer says,” nor will they necessarily characterize callers’ tones as angry, confused, and so on. With the ability now to take the recorded conversation and turn it into a transcript, the specific emotion and sentiment words are there along with the entire content of the call to be run through Clarabridge’s text and sentiment algorithms.
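The workflow Banerjee describes — turn a recorded call into a transcript, then run it through text and sentiment algorithms — can be pictured with a toy, lexicon-based sketch. To be clear, the lexicons, scoring rule and sample transcript below are invented for illustration; they are not Clarabridge’s actual NLP pipeline:

```python
# Toy lexicon-based sentiment tagging of a call transcript.
# The word lists and the simple pos-minus-neg score are illustrative
# assumptions, not any vendor's real algorithm.

POSITIVE = {"great", "helpful", "resolved", "thanks", "happy"}
NEGATIVE = {"angry", "frustrated", "cancel", "confused", "unacceptable"}

def score_utterance(text):
    """Return (sentiment_score, emotion_words) for one utterance."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    emotion_words = [w for w in words if w in POSITIVE | NEGATIVE]
    return pos - neg, emotion_words

# A made-up transcript: note the customer's emotion words appear verbatim,
# which agent notes would typically paraphrase away.
transcript = [
    ("customer", "I am frustrated, I want to cancel my account."),
    ("agent", "I understand, let me help."),
    ("customer", "Thanks, that resolved it, I'm happy now."),
]

for speaker, utterance in transcript:
    score, emotion = score_utterance(utterance)
    print(speaker, score, emotion)
```

Even this crude version shows why a verbatim transcript beats agent notes for sentiment: the customer’s own emotion words survive into the analysis instead of being filtered through the agent’s summary.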
“You get a better sense of the true voice of the customer and the experience of that interaction – not just the agent perspective but the customer perspective,” Banerjee says.
IBM Taps Global Network of Innovation Centers to Fuel Linux on Power Systems for Big Data and Cloud Computing
CHICAGO, Aug. 22, 2014 /PRNewswire/ — At the LinuxCon North America conference last week, IBM (NYSE: IBM) announced it is tapping into its global network of over 50 IBM Innovation Centers and IBM Client Centers to help IBM Business Partners, IT professionals, academics, and entrepreneurs develop and deliver new Big Data and cloud computing software applications for clients using Linux on IBM Power Systems servers. Read more
Gil Press of Forbes reports, “Gartner released last week its latest Hype Cycle for Emerging Technologies. Last year, big data reigned supreme, at what Gartner calls the ‘peak of inflated expectations.’ But now big data has moved down the ‘trough of disillusionment’ replaced by the Internet of Things at the top of the hype cycle. In 2012 and in 2013 Gartner’s analysts thought that the Internet of Things had more than 10 years to reach the ‘plateau of productivity’ but this year they give it five to ten years to reach this final stage of maturity. The Internet of Things, says Gartner, ‘is becoming a vibrant part of our, our customers’ and our partners’ business and IT landscape’.” Read more
Daniel Newman of Forbes recently wrote, “Over the last month there has been an unfathomable amount of content published about the massive privacy intrusion that is Facebook Messenger. With the ability to intrude into the lives of its users in ways that the NSA would never think to, it isn’t a surprise that the new download brought such strong opinions; many of which served as recommendations to not download the application. The good news about the widespread dialogue on messenger is that it brought to light the issues that surround privacy of data. Further implicating what some of us have always known. “When the service is free, the user is the product.” Make sense? In other words, when companies like Facebook create applications that we use in our everyday lives, for free, the real price is in what we sacrifice for the right to use the application for free, our data.” Read more
Lars Hard of Beta News recently wrote, “Artificial intelligence (AI) has become a bit of a buzzword among technology professionals (and even within the mainstream public) but truthfully, most people do not know how it works or how it is already being integrated within leading enterprise businesses. AI for businesses is today mostly made up of machine learning, wherein algorithms are applied in order to teach systems to learn from data to automate and optimize processes and predict outcomes and gain insights. This simplifies, scales and even introduces new important processes and solutions for complex business problems as machine learning applications learn and improve over time. From medical diagnostics systems, search and recommendation engines, robotics, risk management systems, to security systems, in the future nearly everything connected to the internet will use a form of a machine learning algorithm in order to bring value.” Read more
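The machine-learning loop Hard describes — a system that learns from data and then predicts outcomes for new cases — can be sketched in a few lines. The churn scenario, the two features and the nearest-neighbour rule below are invented for illustration; production systems use far richer data and models:

```python
# Toy illustration of "learning from data to predict outcomes":
# classify whether a new customer resembles past churners.
# Features (monthly spend, support calls) and labels are made up.

import math

# Historical customers: (monthly_spend, support_calls) -> churned? (1 = yes)
training = [
    ((20.0, 5.0), 1),
    ((25.0, 4.0), 1),
    ((80.0, 1.0), 0),
    ((90.0, 0.0), 0),
]

def predict_churn(features):
    """1-nearest-neighbour: return the label of the closest known customer."""
    _, label = min(training, key=lambda item: math.dist(item[0], features))
    return label

print(predict_churn((22.0, 6.0)))  # resembles the churners
print(predict_churn((85.0, 0.0)))  # resembles the loyal customers
```

The point of the sketch is the shape of the loop, not the algorithm: as more labeled interactions accumulate in `training`, the same predict step keeps improving, which is the “learn and improve over time” property the article highlights.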
That’s how Blab characterizes the work it’s doing to add structure to the chaotic world of online conversation, normalizing and patterning the world’s discussions across 50,000 social network, news outlet, blog, video and other channels, regardless of language – to the tune of some hundred million posts per day and 1 million predictions per minute. These are near-real-time predictions, says CEO Randy Browning, of what a target audience will be interested in over a 72-hour forward-looking window, based on what they’re talking about now, so that customers can tailor their buying strategies for AdWords or search terms as well as create or deploy content that’s relevant to those interests.
“We predict what will be important to people so they can buy search terms or AdWords at a great price before the market or Google sees it,” he says. That’s the main reason customers turn to Blab today, with optimizing their own content taking second place. Crisis management is the third deployment rationale. “If a brand has multiple issues, we can tell them which will be significant or which will be a blip and then fade away, so they can get a predictive understanding of where to focus their resources to mitigate issues coming down the pike.”