Posts Tagged ‘NLP’

What Will The Internet of Things Look Like in 2020?

Dr. Mahesh Saptharishi of Forbes recently wrote, “Communication, in its many forms, has tremendous power… When you combine language with the Internet, you see how our lives have changed; we are freed from past limitations of distance, time and memory. Sensors give us an even greater opportunity to experience our world. The Internet-of-Things (IoT), as these connected sensors are collectively called, has enabled the digitization of language communicated by the physical world. Sensors allow the Internet to instantly extend the reach of our sight and sound. The data from sensors allow us to not only interactively, but also observationally communicate language.” Read more

Wit.ai Offers Siri-Like Natural Language Processing in an API

Kia Kokalitcheva of VentureBeat reports, “A few years ago, when Apple added Siri to the iPhone, talking to inanimate objects with batteries to make them do stuff was pretty novel. Today, thanks to companies like Wit.ai, even kids at hackathons are showing off weekend projects that are voice-controlled. Wit.ai, a Y Combinator-backed startup that provides natural language processing (NLP) in the form of an API, is helping developers and startups integrate voice commands into their products. The company is announcing today that it has raised $3 million in seed funding just over a year since its founders posted the first API version on Hacker News.” Read more
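As a rough sketch of how an intent-extraction API of this kind is typically consumed, the snippet below sends an utterance and picks the top intent from the structured response. The endpoint URL, parameters, and response shape here are illustrative assumptions for the sketch, not Wit.ai's documented contract.

```python
# Sketch of consuming a natural-language API: send an utterance,
# receive structured intents with confidence scores, pick the best one.
import json
from urllib.parse import urlencode

API_BASE = "https://api.wit.ai/message"  # illustrative endpoint


def build_request(utterance, token):
    """Build the query URL and auth header for a message request."""
    url = f"{API_BASE}?{urlencode({'q': utterance})}"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers


def top_intent(response_body):
    """Return the highest-confidence intent name from a JSON response."""
    data = json.loads(response_body)
    intents = data.get("intents", [])
    if not intents:
        return None
    return max(intents, key=lambda i: i["confidence"])["name"]


# Canned response in the assumed shape (no network call is made here)
sample = json.dumps({"intents": [{"name": "set_alarm", "confidence": 0.93},
                                 {"name": "get_time", "confidence": 0.12}]})
url, headers = build_request("wake me at 7", "MY_TOKEN")
print(top_intent(sample))  # set_alarm
```

The separation between request construction and response parsing keeps the NLP call swappable: the same `top_intent` logic works whether the JSON comes from a live service or a test fixture.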

IBM Watson Accelerates Global Expansion

NEW YORK – 07 Oct 2014: IBM is announcing significant milestones fueling adoption of Watson and cognitive computing cloud capabilities on a global scale. Watson is a groundbreaking platform that represents a new era of computing based on its ability to interact in natural language, process vast amounts of Big Data to uncover patterns and insights, and learn from each interaction.

On Tuesday, October 7, IBM Watson Group Senior Vice President Mike Rhodin demonstrates Watson at work in its Client Experience Center at its new global headquarters at 51 Astor Place in New York City’s Silicon Alley. Read more

It’s Time for Developers to Take Linked Data Seriously

Candice McMillan of Programmable Web reports, “If you aren’t familiar with linked data or the Semantic Web, it’s probably a good time to get better acquainted with the concepts, as it seems to be a popular theme that is rippling through the Web development world, with promises of exciting possibilities. At this year’s APIcon in London, Markus Lanthaler, inventor of Hydra and one of the core designers of JSON-LD, talked about the importance of optimizing data architecture for an integrated future, focusing on linked data as a promising solution.” Read more
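To make the linked-data idea concrete, here is a minimal JSON-LD document: the `@context` maps plain JSON keys to globally unambiguous vocabulary URLs (schema.org here), and `@id` gives the entity itself a dereferenceable identifier, which is what lets independent datasets link to one another. The specific identifiers and values are illustrative.

```json
{
  "@context": {
    "name": "http://schema.org/name",
    "homepage": {"@id": "http://schema.org/url", "@type": "@id"}
  },
  "@id": "http://example.com/people/markus",
  "name": "Markus Lanthaler",
  "homepage": "http://example.com/"
}
```

Because the context is just another JSON key, an ordinary API client can ignore it and treat the payload as plain JSON, while a linked-data-aware client can expand it into RDF triples.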

IBM Watson Chief Mike Rhodin on What Watson Has Taught IBM

Barb Darrow of GigaOM recently wrote, “IBM’s Watson natural language query/cognitive computing prodigy was a huge PR coup for Big Blue. Three years ago, Watson defeated Jeopardy champ Ken Jennings on national TV and beat other challengers like a drum on a subsequent victory tour. (Ask Gigaom’s own Stacey Higginbotham about that sometime.) IBM rode that wave for years to show that despite its woes, it can still do really hard stuff. IBM wants Watson to be a $10 billion business by 2023. But, unfortunately for IBM, there is ‘not a lot of commercial application to playing Jeopardy,’ Mike Rhodin, IBM SVP for Watson, acknowledged at Emtech 2014 at MIT on Tuesday. IBM invested untold millions in Watson, so it’s now time for Watson to, in the tortured words of another Emtech presenter, become ‘a market-based solution’.” Read more

Healthline Launches New HealthData Engine

Jasmine Pennic of HIT Consultant reports, “Healthline, provider of intelligent health information and technology solutions, today launched its HealthData Engine to harness the power of structured and unstructured data to improve outcomes and reduce costs. The new big data analytics platform leverages the company’s market-leading HealthTaxonomy, advanced clinical natural language processing (NLP) technologies and semantic analysis to turn patient data into actionable insights.” Read more

Paxata Wins Ventana Research 2014 Technology Innovation Award for Information Optimization

REDWOOD CITY, Calif.–(BUSINESS WIRE)–Paxata, the first unified Adaptive Data Preparation platform built from the ground up to address the next generation of data integration, quality, enrichment, collaboration and governance needs for business analytics, was recognized by Ventana Research as the winner of the 2014 Technology Innovation Award for Information Optimization. Read more

Unearthing Data on Non-Public Companies with Artificial Intelligence

Greg MacSweeney of Wall Street and Tech recently wrote, “It’s relatively easy to find information on public companies. Bloomberg, Thomson Reuters, and Dun & Bradstreet, for example, all have in-depth information that is accessible to anyone with a subscription. But where do investment bankers, venture capitalists, and other investors find reliable information about private companies? If you talk to investment bankers, or other investors who are looking for information on non-public companies, it quickly becomes apparent there is no easy answer. Investment bankers rely mostly on Google searches and a combination of information gathered from Hoovers, S&P Capital IQ, Dun & Bradstreet, and others. But it is a laborious manual process to do due diligence on private companies.” Read more

Semantic Technology Job: R&D Engineer

IPsoft needs an R&D Engineer. The job description states, “Amelia is the next generation human-computer dialog system that acts as your personal student, instructor, assistant, or friend. Amelia is based on the latest state-of-the-art technologies in natural language processing, information retrieval, machine learning, and more. What distinguishes Amelia from previous generation human-computer dialog systems is its learning ability. Amelia is capable of understanding the syntax and semantics of natural language and automatically builds its own neural ontology from them. If you want to teach Amelia about a certain object, you simply describe the object in natural language, and Amelia builds a neural ontology for the object automatically. Once the neural ontology is built, Amelia can explain or answer questions about the object by traversing the ontology. Objects do not have to be specified upfront; you can talk about random topics and expect Amelia to build neural ontologies for objects that are newly introduced during your conversation with Amelia. When you ask questions about things that Amelia does not have neural ontologies for, Amelia tries to find the most appropriate answer from the World Wide Web. These include questions about the weather, current events, historical/geopolitical facts, etc.”

Read more

Clarabridge Goes Straight To The Customers’ Mouth To Analyze Call Center Interactions

Customer experience management vendor Clarabridge wants to bring the first-person narrative from call center interactions to life for marketing analysts, customer care managers, call center leaders and other customer-focused enterprise execs. With its just-released Clarabridge Speech, it now offers a cloud solution that integrates Voci Technologies’ speech recognition smarts with its own capabilities for using NLP to analyze and categorize text, sentiment and emotion in surveys, social media, chat sessions, emails and call center agents’ own notes.

Agent notes certainly are helpful when it comes to assessing whether customers are having negative experiences and whether their loyalty is at stake, among other concerns. But, points out Clarabridge CEO Sid Banerjee, “an agent almost never types word for word what the customer says,” nor will they necessarily characterize callers’ tones as angry, confused, and so on. With the ability now to take the recorded conversation and turn it into a transcript, the specific emotion and sentiment words are there along with the entire content of the call to be run through Clarabridge’s text and sentiment algorithms.

“You get a better sense of the true voice of the customer and the experience of that interaction – not just the agent perspective but the customer perspective,” Banerjee says.

Read more
