Alex Armstrong of I Programmer reports, “Google has awarded over $1.2 million to support research in several areas of natural language understanding that relate to Google’s concept of the Knowledge Graph. Google has been investing heavily in machine learning and deep neural networks to improve web search. Supporting natural language understanding is also motivated by the need to further search technology. In the announcement of the awards, the Google Research Blog explains how natural language processing is integral to its Knowledge Graph technology, which represents a shift ‘from strings to things’.”

The blog post states, “Understanding natural language is at the core of Google’s work to help people get the information they need as quickly and easily as possible. At Google we work hard to advance the state of the art in natural language processing, to improve the understanding of fundamental principles, and to solve the algorithmic and engineering challenges to make these technologies part of everyday life. Language is inherently productive; an infinite number of meaningful new expressions can be formed by combining the meaning of their components systematically. The logical next step is the semantic modeling of structured meaningful expressions — in other words, ‘what is said’ about entities. We envision that knowledge graphs will support the next leap forward in language understanding towards scalable compositional analyses, by providing a universe of entities, facts and relations upon which semantic composition operations can be designed and implemented.”

Read more here.

Image: Courtesy Google