A recent announcement on EurekAlert! states: “Researchers from North Carolina State University have developed artificial intelligence (AI) software that is significantly better than any previous technology at predicting what goal a player is trying to achieve in a video game. The advance holds promise for helping game developers design new ways of improving the gameplay experience for players. ‘We developed this software for use in educational gaming, but it has applications for all video game developers,’ says Dr. James Lester, a professor of computer science at NC State and senior author of a paper on the work. ‘This is a key step in developing player-adaptive games that can respond to player actions to improve the gaming experience, either for entertainment or – in our case – for education.’ The researchers used ‘deep learning’ to develop the AI software. Deep learning describes a family of machine learning techniques that can extrapolate patterns from large collections of data and make predictions. Deep learning has been actively investigated in various research domains such as computer vision and natural language processing in both academia and industry.”
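To make the goal-recognition idea concrete, here is a minimal sketch, not the NC State system, of predicting which goal a player is pursuing from a partial trace of actions. It uses a naive-Bayes model over action frequencies rather than deep learning, and all goal and action names are invented for illustration.

```python
from collections import Counter, defaultdict
import math

def train(traces):
    """traces: list of (goal, [action, ...]) pairs from completed play sessions."""
    goal_counts = Counter()
    action_counts = defaultdict(Counter)
    vocab = set()
    for goal, actions in traces:
        goal_counts[goal] += 1
        action_counts[goal].update(actions)
        vocab.update(actions)
    return goal_counts, action_counts, vocab

def predict(model, observed):
    """Return the most likely goal for a partial action sequence."""
    goal_counts, action_counts, vocab = model
    total = sum(goal_counts.values())
    best, best_score = None, float("-inf")
    for goal, count in goal_counts.items():
        # log prior plus add-one-smoothed log likelihood of each observed action
        score = math.log(count / total)
        denom = sum(action_counts[goal].values()) + len(vocab)
        for action in observed:
            score += math.log((action_counts[goal][action] + 1) / denom)
        if score > best_score:
            best, best_score = goal, score
    return best

traces = [
    ("find_treasure", ["open_map", "dig", "dig", "open_chest"]),
    ("find_treasure", ["open_map", "walk", "dig", "open_chest"]),
    ("defeat_boss",   ["buy_sword", "walk", "attack", "attack"]),
    ("defeat_boss",   ["buy_sword", "attack", "heal", "attack"]),
]
model = train(traces)
print(predict(model, ["open_map", "dig"]))      # → find_treasure
print(predict(model, ["buy_sword", "attack"]))  # → defeat_boss
```

A player-adaptive game could call `predict` after every few actions and adjust hints or difficulty once one goal becomes clearly more probable than the rest.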
Apigee wants the development community to be able to seamlessly take advantage of predictive analytics in their applications.
“One of the biggest things we want to ensure is that the development community gets comfortable with powering their apps with data and insights,” says Anant Jhingran, Apigee’s VP of products and formerly CTO for IBM’s information management and data division. “That is the next wave that we see.”
Apigee wants to help them ride that wave, enabling their business to better deal with customer issues, from engagement to churn, in more personal and contextual ways. “We are in the business of helping customers take their digital assets and expose them through clean APIs so that these APIs can power the next generation of applications,” says Jhingran. But in thinking through that core business, “we realized the interactions happening through the APIs represent very powerful signals. Those signals, when combined with other contextual data that may be in the enterprise, enable some very deep insights into what is really happening in these channels.”
With today’s announcement of a new version of its Apigee Insights big data platform, all those signals generated – through API and web channels, call centers, and more – can come together in the service of predictive analytics for developers to leverage.
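As a hypothetical illustration of using API signals for predictive analytics (not Apigee's actual model), the sketch below combines a few invented per-customer signals, such as time since last API call, error rate, and support-ticket volume, into a churn-risk score with a hand-set logistic model. The signal names and weights are assumptions for the example.

```python
import math

# Hand-chosen weights for illustrative API-channel signals (all hypothetical).
WEIGHTS = {"days_since_last_call": 0.15, "error_rate": 4.0, "support_tickets": 0.8}
BIAS = -3.0

def churn_risk(signals):
    """Map raw signals to a score in (0, 1) via a logistic function."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in signals.items())
    return 1.0 / (1.0 + math.exp(-z))

engaged = {"days_since_last_call": 1, "error_rate": 0.02, "support_tickets": 0}
lapsing = {"days_since_last_call": 30, "error_rate": 0.4, "support_tickets": 3}
print(round(churn_risk(engaged), 2))  # low score: healthy engagement
print(round(churn_risk(lapsing), 2))  # high score: churn candidate
```

In practice the weights would be learned from historical churn outcomes rather than set by hand; the point is that the raw API interaction logs already carry the signals such a model needs.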
In a recent article, Mike Kavis explained, “In a previous post I discussed how the Internet of Things (IoT) will radically change your big data strategy. Massive amounts of data from sensors, wearable devices, and other technologies are creating new and exciting opportunities to make better business decisions in real time. However, harvesting all of this data is only half of the equation. Making the data actionable is where the real value lies. Traditionally, companies have mined data to look for trends and opportunities. In the world of IoT, searching for nuggets of information in petabyte-sized databases is the equivalent of trying to find a needle in a haystack. To help extract value quickly and effectively, companies are turning to machine learning technologies, much as they did with big data technologies. However, implementing machine learning successfully can be extremely time consuming and complex. This has given birth to a new breed of vendors who deliver machine learning as a service, allowing customers to quickly implement technologies to turn massive IoT databases into actionable, revenue-generating gold mines.”
Jason Mick recently blogged, “While iOS 8 should make Apple, Inc.’s (AAPL) Siri substantially smarter, Microsoft Corp.’s (MSFT) Windows Phone voice-controlled assistant Cortana currently enjoys a nice lead in natural language processing and the ability to interface with multiple apps to perform useful functions.
Cortana is a commercial product, but it’s also a bit of a lab experiment for the folks at Microsoft. During the 2014 FIFA World Cup, Microsoft showed off its increasingly sophisticated prediction algorithms, which correctly guessed 15 out of 16 winners in the knockout stage. Its sole mistake was picking Brazil to beat the Netherlands (whoops) in the third-place consolation match.”
Serial entrepreneur and thought leader Nova Spivack recently wrote for Gigaom, “When we talk about the future of artificial intelligence (AI), the discussion often focuses on the advancements and capabilities of the technology, or even the risks and opportunities inherent in the potential cultural implications. What we frequently overlook, however, is the future of AI as a business. IBM Watson’s recent acquisition and deployment of Cognea signals an important shift in the AI and intelligent virtual assistant (IVA) market, and offers an indication both of the potential of AI as a business and of the areas where the market still needs development. The AI business is about to be transformed by consolidation. Consolidation carries real risks, but it is generally a sign of technological maturation. And it’s about time, as AI is no longer simply a side project, or an R&D euphemism. AI is finally center stage.”
Daniel Gutierrez reported, “Prelert, the anomaly detection company, today announced the release of an Elasticsearch Connector to help developers quickly and easily deploy its machine learning-based Anomaly Detective® engine on their Elasticsearch ELK (Elasticsearch, Logstash, Kibana) stack. Earlier this year, Prelert released its Engine API enabling developers and power users to leverage its advanced analytics algorithms in their operations monitoring and security architectures. By offering an Elasticsearch Connector, the company further strengthens its commitment to democratizing the use of machine learning technology, providing tools that make it even easier to identify threats and opportunities hidden within massive data sets. Written in Python, the Prelert Elasticsearch Connector source is available on GitHub. This enables developers to apply Prelert’s advanced, machine learning-based analytics to fit the big data needs within their unique environment.”
The article continues with, “Prelert’s Anomaly Detective processes huge volumes of streaming data, automatically learns normal behavior patterns represented by the data and identifies and cross-correlates any anomalies. It routinely processes millions of data points in real-time and identifies performance, security and operational anomalies so they can be acted on before they impact business. The Elasticsearch Connector is the first connector to be officially released by Prelert. Additional connectors to several of the most popular technologies used with big data will be released throughout the coming months.”
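The behavior Prelert describes, learning normal patterns, flagging deviations, and cross-correlating anomalies, can be sketched in miniature (this is a toy illustration, not Prelert's engine; the metric names and data are invented): learn a per-metric baseline from a training window, then report timestamps where two or more metrics go anomalous together.

```python
import statistics

def baselines(training):
    """Learn (mean, stdev) per metric from a training window of normal data."""
    return {metric: (statistics.mean(values), statistics.pstdev(values) or 1e-9)
            for metric, values in training.items()}

def correlated_anomalies(base, live, threshold=3.0):
    """Return timestamps where two or more metrics deviate past the threshold."""
    flagged = {}
    for metric, series in live.items():
        mean, stdev = base[metric]
        flagged[metric] = {t for t, x in enumerate(series)
                           if abs(x - mean) / stdev > threshold}
    all_ts = set.union(*flagged.values()) if flagged else set()
    return sorted(t for t in all_ts
                  if sum(t in anomalous for anomalous in flagged.values()) >= 2)

training = {"latency_ms": [50, 52, 48, 51, 49],
            "error_rate": [0.01, 0.02, 0.01, 0.02, 0.01]}
live = {"latency_ms": [50, 400, 51, 390],
        "error_rate": [0.01, 0.30, 0.01, 0.02]}
base = baselines(training)
print(correlated_anomalies(base, live))  # latency and errors spike together at t=1
```

Cross-correlating in this way is what separates a single noisy metric from an actionable incident: a latency spike alone may be benign, but latency and errors spiking at the same moment usually is not.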