Posts Tagged ‘Machine Learning’
Derrick Harris of GigaOM recently wrote, “Jeff Hawkins is best known for bringing us the Palm Pilot, but he’s working on something that could be much, much bigger. For the past several years, Hawkins has been studying how the human brain functions with the hope of replicating it in software. In 2004, he published a book about his findings. In 2012, Numenta, the company he founded to commercialize his work, finally showed itself to the world after roughly seven years operating in stealth mode. I recently spoke with Hawkins to get his take on why his approach to artificial intelligence will ultimately overtake other approaches, including the white-hot field of deep learning. We also discussed how Numenta has survived some early business hiccups and how he plans to keep the lights on and the money flowing in.” Read more
Bernard Marr recently wrote, “It’s been estimated that by 2015, almost two million people will be employed in big data jobs in the US. Hal Varian, Google’s chief economist, is quoted as saying “…the sexy job in the next 10 years will be statisticians” and Tom Davenport, Distinguished Professor at Babson College, believes that a data scientist has the sexiest job of the 21st century. So what are these sexy jobs? Here’s a quick look at some of the positions available today that might allow you to break into the glamorous and exciting world of the big data professionals.” Read more
REDWOOD CITY, Calif.–(BUSINESS WIRE)–Paxata, the first unified Adaptive Data Preparation platform built from the ground-up to address the next generation of data integration, quality, enrichment, collaboration and governance needs for business analytics, was recognized by Ventana Research as the winner of the 2014 Technology Innovation Award for Information Optimization. Read more
Greg MacSweeney of Wall Street and Tech recently wrote, “It’s relatively easy to find information on public companies. Bloomberg, Thomson Reuters, and Dun & Bradstreet, for example, all have in-depth information that is accessible to anyone with a subscription. But where do investment bankers, venture capitalists, and other investors find reliable information about private companies? If you talk to investment bankers, or other investors who are looking for information on non-public companies, it quickly becomes apparent there is no easy answer. Investment bankers rely mostly on Google searches and a combination of information gathered from Hoovers, S&P Capital IQ, Dun & Bradstreet, and others. But it is a laborious manual process to do due diligence on private companies.” Read more
Big Data has been getting its fair share of commentary over the last couple of months. Surveys from multiple sources have commented on trends and expectations. The Semantic Web Blog provides some highlights here:
- From Accenture Analytics’s new Big Success With Big Data report: There remain some gaps in what constitutes Big Data for respondents to its survey: Just 43 percent, for instance, classify unstructured data as part of the package. That option included open text, video and voice. Those are gaps that could be filled leveraging technologies such as machine learning, speech recognition and natural language understanding, but they won’t be unless executives make these sources a focus of Big Data initiatives to start with.
- From Teradata’s new survey on Big Data Analytics in the UK, France and Germany: Close to 50 percent of respondents in the latter two countries are using three or more data types (from sources ranging from social media, to video, to web blogs, to call center notes, to audio files and the Internet of Things) in their efforts, compared to just 20 percent in the UK. A much higher percentage of UK businesses (51 percent) are currently using just a single type of new data, such as video data, compared with France and Germany, where only 21 percent are limiting themselves to one type of new data, it notes. Forty-four percent of execs in Germany and 35 percent in France point to social media as the source of the new data. About one-third of respondents in each of those countries are investigating video, as well.
A recent announcement on EurekAlert! states: “Researchers from North Carolina State University have developed artificial intelligence (AI) software that is significantly better than any previous technology at predicting what goal a player is trying to achieve in a video game. The advance holds promise for helping game developers design new ways of improving the gameplay experience for players. ‘We developed this software for use in educational gaming, but it has applications for all video game developers,’ says Dr. James Lester, a professor of computer science at NC State and senior author of a paper on the work. ‘This is a key step in developing player-adaptive games that can respond to player actions to improve the gaming experience, either for entertainment or – in our case – for education.’ The researchers used ‘deep learning’ to develop the AI software. Deep learning describes a family of machine learning techniques that can extrapolate patterns from large collections of data and make predictions. Deep learning has been actively investigated in various research domains such as computer vision and natural language processing in both academia and industry.”
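The goal-recognition task described above (predicting a player's goal from an observed sequence of in-game actions) can be illustrated with a toy frequency-based classifier. The actions, goal labels, and counting scheme below are all invented for illustration; this is not the deep learning system the NC State researchers built.

```python
from collections import Counter

# Toy training data: action sequences labeled with the goal the player
# was pursuing. All actions and goal labels are hypothetical examples.
TRAINING = [
    (["open_door", "pick_key", "unlock_chest"], "find_treasure"),
    (["pick_key", "open_door", "unlock_chest"], "find_treasure"),
    (["draw_sword", "approach_enemy", "attack"], "defeat_boss"),
    (["approach_enemy", "draw_sword", "attack"], "defeat_boss"),
]

def train(examples):
    """Count how often each action co-occurs with each goal."""
    counts = {}
    for actions, goal in examples:
        counts.setdefault(goal, Counter()).update(actions)
    return counts

def predict(model, observed_actions):
    """Score each goal by how many observed actions it explains."""
    def score(goal):
        return sum(model[goal][a] for a in observed_actions)
    return max(model, key=score)

model = train(TRAINING)
print(predict(model, ["pick_key", "open_door"]))  # find_treasure
```

A deep learning approach replaces these hand-counted co-occurrence statistics with patterns learned automatically from far larger logs of player behavior, but the input-output shape of the problem is the same: partial action history in, predicted goal out.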
Apigee wants the development community to be able to seamlessly take advantage of predictive analytics in their applications.
“One of the biggest things we want to ensure is that the development community gets comfortable with powering their apps with data and insights,” says Anant Jhingran, Apigee’s VP of products and formerly CTO for IBM’s information management and data division. “That is the next wave that we see.”
Apigee wants to help them ride that wave, enabling their business to better deal with customer issues, from engagement to churn, in more personal and contextual ways. “We are in the business of helping customers take their digital assets and expose them through clean APIs so that these APIs can power the next generation of applications,” says Jhingran. But in thinking through that core business, “we realized the interactions happening through the APIs represent very powerful signals. Those signals, when combined with other contextual data that may be in the enterprise, enable some very deep insights into what is really happening in these channels.”
With today’s announcement of a new version of its Apigee Insights big data platform, all those signals generated – through API and web channels, call centers, and more – can come together in the service of predictive analytics for developers to leverage.
IPsoft needs an R&D Engineer. The job description states, “Amelia is the next generation human-computer dialog system that acts as your personal student, instructor, assistant, or friend. Amelia is based on the latest state-of-the-art technologies in natural language processing, information retrieval, machine learning, and more. What distinguishes Amelia from previous generation human-computer dialog systems is its learning ability. Amelia is capable of understanding the syntax and semantics of natural language and automatically builds its own neural ontology from them. If you want to teach Amelia about a certain object, you simply describe the object in natural language, and Amelia builds a neural ontology for the object automatically. Once the neural ontology is built, Amelia can explain or answer questions about the object by traversing the ontology. Objects do not have to be specified upfront; you can talk about random things and expect Amelia to build neural ontologies for objects that are newly introduced during your conversation with Amelia. When you ask questions about things that Amelia does not have neural ontologies for, Amelia tries to find the most appropriate answer from the World Wide Web. These include questions about weather, current events, historical/geopolitical facts, etc.”
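The describe-then-answer loop in the posting (teach an object in natural language, build an ontology from the description, answer questions by traversing it) can be sketched in miniature. The two-pattern parser below is a deliberately naive stand-in: Amelia's actual NLP and "neural ontology" are not public, so every class name and sentence pattern here is an assumption made purely to illustrate the idea.

```python
# Toy sketch: parse simple "X is a Y" / "X has a Z" statements into a
# small fact graph, then answer questions by looking up its edges.
class Ontology:
    def __init__(self):
        self.facts = {}  # object -> {relation: set of values}

    def learn(self, sentence):
        """Recognize two toy patterns: '<obj> is a <val>', '<obj> has a <val>'."""
        words = sentence.rstrip(".").lower().split()
        if len(words) == 4 and words[1] in ("is", "has") and words[2] == "a":
            obj, rel, val = words[0], words[1], words[3]
            self.facts.setdefault(obj, {}).setdefault(rel, set()).add(val)

    def answer(self, obj, rel):
        """Traverse the graph: return what we know about obj via relation rel."""
        return sorted(self.facts.get(obj, {}).get(rel, set()))

onto = Ontology()
onto.learn("penguin is a bird.")
onto.learn("penguin has a beak.")
print(onto.answer("penguin", "is"))   # ['bird']
print(onto.answer("penguin", "has"))  # ['beak']
```

The real system would need full syntactic and semantic parsing rather than fixed templates, plus the web-search fallback the posting describes for objects with no ontology yet; this sketch only shows why "describe, build, traverse" is a natural decomposition.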
In a recent article, Mike Kavis explained, “In a previous post I discussed how the Internet of Things (IoT) will radically change your big data strategy. Massive amounts of data from sensors, wearable devices, and other technologies are creating new and exciting opportunities to make better business decisions in real time. However, harvesting all of this data is only half of the equation. Making the data actionable is where the real value lies. Traditionally, companies have mined data to look for trends and opportunities. In the world of IoT, searching for nuggets of information in petabyte-sized databases is the equivalent of trying to find a needle in a haystack. To help extract value quickly and effectively, companies are turning to machine learning technologies. However, like big data technologies, implementing machine learning successfully can be extremely time-consuming and complex. This has given birth to a new breed of vendors who deliver machine learning as a service, allowing customers to quickly implement technologies to turn massive IoT databases into actionable, revenue-generating gold mines.”
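The "needle in a haystack" problem, surfacing the few readings that matter in a flood of sensor data, can be sketched with a basic statistical outlier detector. The sensor values and the z-score threshold below are illustrative only; real machine-learning-as-a-service offerings apply far richer models at far larger scale.

```python
import statistics

def find_anomalies(readings, z_threshold=2.0):
    """Flag readings more than z_threshold standard deviations from
    the mean -- a minimal stand-in for the anomaly detection that
    IoT platforms run over much larger sensor streams."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []
    return [x for x in readings if abs(x - mean) / stdev > z_threshold]

# Mostly normal temperature readings, with one faulty-sensor spike.
sensor_data = [21.0, 21.3, 20.9, 21.1, 95.0, 21.2, 20.8, 21.0]
print(find_anomalies(sensor_data))  # [95.0]
```

Note that on a small sample a large outlier inflates the standard deviation and compresses its own z-score, which is why the threshold here is 2.0 rather than the textbook 3.0; production systems sideste this by fitting the baseline on historical, known-good data instead.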
Caleb Garling of the MIT Technology Review reports, “Machines are doing more and more of the work typically completed by humans, and detecting diseases may be next: a new company called Enlitic takes aim at the examination room by employing computers to make diagnoses based on images. Enlitic cofounder and CEO Jeremy Howard—formerly the president and lead scientist at data-crunching startup Kaggle—says the idea is to teach computers how to recognize various injuries, diseases, and disorders by showing them hundreds of x-rays, MRIs, CT scans, and other films. Howard believes that with enough experience, a computer can start to spot trouble and flag the images immediately for a physician to investigate. That could save physicians from having to comb through stacks of films.” Read more