Northwestern University reports, “Someday we might be able to build software for our computers simply by talking to them. Ken Forbus, Walter P. Murphy Professor of Electrical Engineering and Computer Science, and his team have developed a program that allows them to teach computers as you would teach a small child—through natural language and sketching. Called Companion Cognitive Systems, the architecture is capable of human-like learning. ‘We think software needs to evolve to be more like people,’ Forbus says. ‘We’re trying to understand human minds by trying to build them.’ Forbus has spent his career working in the area of artificial intelligence, creating computer programs and simulations in an attempt to understand how the human mind works. At the heart of the Companions project is the claim that much of human learning is based on analogy. When presented with a new problem or situation, the mind identifies similar past experiences to inform an action. This allows us to build upon our own knowledge with the ability to continually adapt and learn.” Read more
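The core idea above—handling a new situation by retrieving the most similar past experience—can be illustrated with a toy sketch. This is not the Companions architecture or Forbus's structure-mapping engine; the case names, features, and similarity measure (Jaccard overlap) are invented purely for illustration.

```python
# Toy analogical retrieval: given a new situation described as a set of
# features, find the stored case whose feature set overlaps it most.
cases = {
    "boil water": {"heat", "liquid", "container"},
    "melt butter": {"heat", "solid", "pan"},
    "freeze juice": {"cold", "liquid", "container"},
}

def most_similar(new_features, case_library):
    """Return the name of the stored case with the highest Jaccard overlap."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(case_library, key=lambda name: jaccard(new_features, case_library[name]))

# A new situation shares "heat" and "liquid" with the boiling case,
# so that past experience is retrieved as the closest analogue.
retrieved = most_similar({"heat", "liquid", "pot"}, cases)
print(retrieved)  # → boil water
```

Real analogical matchers compare relational structure, not just flat feature sets, but the retrieval step has the same shape: score every stored case against the new one and reuse the best match.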
Marketing and Communications, April 10, 2014 — The Texas A&M University Libraries is preparing to launch VIVO, a web-based community of research profiles designed to enhance faculty collaboration. By providing standardized research profiles for all university faculty and graduate students, VIVO will let researchers discover and contact individuals with similar interests, whether they are across campus or at another VIVO institution. Data entry and standardization will continue through the summer, with the VIVO debut planned for Open Access Week in October 2014. Read more
NEW YORK, NEW YORK — (Marketwired) — 04/10/14 — ADmantX, the next-generation contextual analysis and semantic data provider, today announced that the U.S. Patent and Trademark Office has awarded patent US 8543578 B2 for its flagship semantic technology. The ADmantX technology automatically comprehends both advertising messages and page content to ensure brand-safe, effective ad delivery.
Alex Philp recently wrote for IBM Data Magazine, “The Watson cognitive computing engine is rapidly evolving to recognize geometry and geography, enabling it to understand complex spatial relationships. As Watson combines spatial with temporal analytics and natural language processing, it is expected to derive associations and correlations among streaming data sets evolving from the Internet of Things. Once these associations and correlations occur, Watson can combine the where, what, and when cognitive dimensions into causal explanations about why. Putting the where into Watson represents a key evolutionary milestone into understanding cause-effect relationships and extracting meaningful insights about natural and synthetic global phenomena.” Read more
Gadjo Cardenas Sevilla of the Calgary Herald recently wrote, “Apple’s Siri intelligent personal assistant has been around for nearly four years and standard on iOS devices for three years. The peppy and often humorous artificial intelligence has evolved in terms of features and the number of services it can access. Siri is also getting some stiff competition from Google Now, which, in addition to answering user-initiated queries as Siri does, passively delivers information to the user by way of visual flash cards… Named after a character in Microsoft’s popular Halo video game franchise, the Cortana personal assistant is expected to come to Windows Phone devices, Xbox and possibly tablets in April… A recent leak with details and screenshots of BlackBerry’s upcoming BB 10.3 operating system reveals that the company formerly known as RIM has been working on an Intelligent Assistant feature to rival Siri and Google Now.” Read more
Rancho Cordova, CA (PRWEB) April 01, 2014 — The pharmaceutical community, health care organizations, and software providers are coming together at the OASIS open standards consortium to define a machine-readable content classification standard for the interoperable exchange of clinical trial data via content management systems. The work of the new OASIS Electronic Trial Master File (eTMF) Standard Technical Committee will promote interoperability across diverse computing platforms and cloud networks within the clinical trials community. Read more
A new article out of Information Daily reports, “Milton Keynes may see driverless cars on its roads in 12-18 months, says Geoff Snelson, Strategy Director of MK Smart, the innovation programme being run in the city. The driverless two-person pods are one of the outputs of the MK Smart programme, which is a collaboration between a number of organisations including the Open University (which is located in Milton Keynes) and BT. Central to the project is the creation of the ‘MK Data Hub’, which will support the acquisition and management of vast amounts of data relevant to city systems from a variety of data sources. As well as transport data, these will include data about energy and water consumption, data acquired through satellite technology, social and economic datasets, and crowd-sourced data from social media or specialised apps. Building on the capability provided by the MK Data Hub, the project will innovate in the areas of transport, energy and water management, tackling key demand issues.” Read more
Nick Stockton of Quartz reports, “Computers stole your job; now they know your pain. Using a combination of facial recognition software and machine learning algorithms, researchers have trained computers to be dramatically better than humans at reading pained facial expressions. And they’re working on new programs to help clue you in to what your friend, coworker, or client is feeling. In a study released Friday (paywall) in the journal Current Biology, researchers asked 170 subjects whether the expressions of pain shown on faces in a series of videos were real or faked. They found that the humans’ collective empathetic ability was about the same as a coin flip—they read the expressions correctly only 50% of the time. Even after researchers trained the subjects to read the subtle, involuntary muscle triggers that experts use to tell when an emotion is being faked, they were only right 55% of the time.” Read more
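The approach described in the study—learning to separate genuine from posed expressions based on involuntary muscle movements—can be sketched as a binary classification problem. The sketch below is not the researchers' system: it uses synthetic numbers standing in for facial-movement measurements and a simple nearest-centroid classifier, purely to show the shape of the task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for facial-movement features (e.g., intensity of
# specific involuntary muscle actions over time); 400 examples per class.
n = 400
genuine = rng.normal(1.0, 0.5, size=(n, 4))  # label 1: real pain
posed = rng.normal(0.4, 0.5, size=(n, 4))    # label 0: faked pain
X = np.vstack([genuine, posed])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Shuffle and split into training and held-out test sets.
idx = rng.permutation(2 * n)
train, test = idx[:600], idx[600:]

# Nearest-centroid classifier: predict the class whose mean feature
# vector (learned from training data) is closer to the example.
c1 = X[train][y[train] == 1].mean(axis=0)
c0 = X[train][y[train] == 0].mean(axis=0)

def predict(examples):
    d1 = np.linalg.norm(examples - c1, axis=1)
    d0 = np.linalg.norm(examples - c0, axis=1)
    return (d1 < d0).astype(float)

accuracy = (predict(X[test]) == y[test]).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

The point of the study is that a machine fitting decision boundaries over such measurements can far exceed the 50–55% accuracy humans achieve by eye, because the tell-tale signals are subtle and involuntary.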
PALO ALTO, Calif., March 25, 2014 /PRNewswire-iReach/ — EngageClick (http://www.engageclick.com), the predictive and personalized multi-screen advertising platform that delivers superior ad engagement and performance, emerged out of stealth mode today. The EngageClick ad platform differentiates itself by applying data-driven technology that uses cognitive science, machine-learning technology and big data analytics to perform predictive segmentation, and subsequently delivers dynamic smart ads with automatic and incremental optimization across multiple screens. EngageClick maximizes advertiser ROI and increases media yield, helping ad agencies and brands increase consumer engagement and ad performance, at scale. Read more
BLOOMINGTON, Ind. — By understanding, managing and inferring patterns from data, machine learning has brought us self-driving vehicles, spam filters and smartphone personal assistants. Now an Indiana University Bloomington computer scientist has received $1.4 million to give machine learning more muscle by making it applicable to greater amounts of more diverse data.
Chung-chieh “Ken” Shan, an assistant professor in the School of Informatics and Computing, will receive the funding from the U.S. Defense Department’s Defense Advanced Research Projects Agency over 46 months. Read more