The Wharton School of Business recently wrote, “Knowledge@Wharton spoke with Brad Becker, chief design officer for IBM Watson, about current and future applications of cognitive computing and how he hopes to make computers ‘more humane.’ An edited version of the conversation follows.” Asked how his background in user experience design affects his role in the Watson project, Becker commented, “[It’s based on] the idea that technology should work for people, not the other way around. For a long time, people have worked to better understand technology. Watson is technology that works to understand us. It’s more humane, it’s helpful to humans, it speaks our language, it can deal with ambiguity, it can create hypotheses, it can learn from us. And, of course, since it’s a computer, it can scale as much as needed and has recall far beyond what humans have.”
NEW YORK & SAN DIEGO, CA – 12 Nov 2014: Cognitive apps are in market today and continue to change the way professionals and consumers make decisions. To help accelerate this transformation, the IBM Watson Group announced an investment in Pathway Genomics Corporation, a clinical laboratory that offers genetic testing services globally, to help deliver the first consumer-facing cognitive app based on a user’s personal genetic makeup.
Late last month, IBM expanded its existing engagement with the Cleveland Clinic, extending its deployment of IBM Watson technology to cover new domains. The vendor already has worked with faculty, physicians and students at the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University on a project to develop Watson-related cognitive technologies that help physicians make more informed and accurate decisions faster and cull new insights from electronic medical records. Now, the Lerner Research Institute’s Genomic Medicine Institute at Cleveland Clinic will evaluate Watson’s ability to help oncologists deliver more personalized care to patients with a variety of cancers.
Watson is being used by other institutions in the field of cancer care as well, including Memorial Sloan Kettering and MD Anderson. The new venture with Cleveland Clinic focuses on identifying patterns in genome sequencing and medical data to unlock insights that will help clinicians bring the promise of genomic medicine to their patients, using Watson’s cognitive system, deep computational biology models, and SoftLayer, IBM’s public cloud infrastructure, the company says.
“There is a lot of work going on in the cancer area,” says Steve Harvey, IBM VP of Watson Cancer Genomics. The latest partnership aims to identify drugs that might be relevant to treating a particular patient’s condition. It starts from the understanding that cancer is a disease of DNA, and it leverages the fact that the cost of reading DNA has dropped drastically. Today, it is possible to sequence the DNA in a patient’s normal cells and compare it to the DNA in cancer cells; the differences – the mutations – can point medical professionals toward what actually is causing the tumor.
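The comparison Harvey describes can be illustrated with a toy sketch. This is not IBM’s actual genomics pipeline – real variant calling works on aligned whole genomes with far more sophistication – and the sequences and positions below are invented for demonstration:

```python
# Toy illustration (not IBM's actual pipeline): compare a "normal" DNA
# sequence against an aligned tumor sequence and report point mutations.
# The sample sequences below are made up for demonstration.

def find_point_mutations(normal: str, tumor: str):
    """Return (position, normal_base, tumor_base) for each mismatch."""
    if len(normal) != len(tumor):
        raise ValueError("sequences must be aligned to the same length")
    return [
        (i, n, t)
        for i, (n, t) in enumerate(zip(normal, tumor))
        if n != t
    ]

normal_seq = "ATGGCCTAACGT"
tumor_seq  = "ATGGACTAACTT"
print(find_point_mutations(normal_seq, tumor_seq))
# → [(4, 'C', 'A'), (10, 'G', 'T')]
```

Each reported mismatch is a candidate mutation a clinician or downstream system might investigate further.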
October saw the debut of Cognitive Scale’s cognitive cloud platform, which supports sourcing, analyzing and interpreting data of all sorts, along with context signals, on any public cloud infrastructure. The details of the platform for pulling insights out of massive amounts of multi-structured data are covered in a story you can read at our sister site Dataversity.net. Here, The Semantic Web Blog relays some more information about usage scenarios around its services, according to Matt Sanchez, the company’s founder, chief technology officer, and vice president of products.
At its top layer the platform includes vertical applications, with healthcare a main focus; guided care applications have a spot here. The role of care managers becomes more important in the changing world of healthcare costs and reimbursements, where patient engagement – especially of the chronically ill – can keep a pediatric asthma patient, for example, from showing up in the ER, which translates to a high-cost visit. Today, “provider organizations are more incented to be proactive in care,” says Sanchez, which means asking and analyzing who is at risk right now and what can be done to prevent a negative outcome like that.
Saffron 10, a major upgrade to Saffron Technology’s cognitive computing platform, debuts today, with new machine learning algorithms that support anticipating outcomes based on patterns found in large amounts of data. (You can read more about the platform and Saffron’s Associative MemoryBase here as well as in this article at our Dataversity sister site.)
One of the new machine learning algorithms is Saffron Universal Cognitive Distance, which the company says is the first non-linear, non-parametric similarity computation for making sense of massive amounts of data without requiring businesses to pre-model the data. The other is Saffron Mutual Information, which the vendor says addresses “sparse data” challenges by making it possible to perform classification with high accuracy on high-dimensional data (with tens of thousands of features).
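Saffron has not published the internals of these algorithms, but the general idea of mutual information as a feature-relevance measure – how many bits knowing one variable tells you about another – can be sketched with the standard textbook computation. The data below is invented, and this is a generic illustration, not the vendor’s “Saffron Mutual Information”:

```python
# Minimal sketch of mutual information as a feature-relevance score.
# This is the generic textbook computation, not Saffron's proprietary
# algorithm; the feature/label data below are made up.
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete sequences of equal length."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

labels    = [0, 0, 1, 1, 0, 1, 0, 1]
feature_a = [0, 0, 1, 1, 0, 1, 0, 1]   # tracks the label perfectly
feature_b = [0, 0, 0, 0, 1, 1, 1, 1]   # independent of the label here
print(mutual_information(feature_a, labels))  # → 1.0 bit
print(mutual_information(feature_b, labels))  # → 0.0 bits
```

Scoring each of tens of thousands of features this way against a class label is one simple route to classification on high-dimensional data, though handling genuinely sparse data well is where the vendor claims its differentiation lies.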
CEO Gayle Sheppard explains that the focus is on finding the signals that really matter in the noise of Big Data, as companies “merge outside-in intelligence with inside-out intelligence, increasing the amount of data now available for decision making.” With the new algorithms, the associations the Saffron platform makes among people, places and things data – counting an entity’s frequency and the contexts in which one thing is associated with something else – extend to discovering patterns to anticipate what happens next.
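The counting Sheppard describes – an entity’s frequency, and how often entities share a context – can be sketched in a few lines. This is a generic illustration of associative counting, not Saffron’s Associative MemoryBase, and the sample “contexts” below are invented:

```python
# Hedged sketch of associative counting: tally how often each entity
# appears, and how often pairs of entities co-occur in the same context.
# Generic illustration only; the sample contexts are invented.
from collections import Counter
from itertools import combinations

contexts = [
    {"Acme Corp", "London", "acquisition"},
    {"Acme Corp", "London", "earnings"},
    {"Acme Corp", "Berlin", "acquisition"},
]

entity_freq = Counter()
pair_freq = Counter()
for context in contexts:
    entity_freq.update(context)
    # Count each unordered entity pair once per shared context.
    pair_freq.update(frozenset(p) for p in combinations(sorted(context), 2))

print(entity_freq["Acme Corp"])                       # → 3
print(pair_freq[frozenset({"Acme Corp", "London"})])  # → 2
```

Once such counts exist, the strength of an association (here, “Acme Corp” with “London”) can feed pattern discovery and, as the platform promises, anticipation of what happens next.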
Jamie Bisker reported, “Most consumers and professionals of all types have a basic feeling about technological innovation as something positive. It is true that we may bemoan the loss of a favorite aspect of the past, and tend to recall for the most part only favorable situations that strengthen such memories. But, in general, people feel upbeat about the convenience and capabilities that technology can provide. We evoke pleasant feelings from the past and that is our nature. It is also our nature, at a very deep biological level, to anticipate the future.
Jeff Hawkins, in his book On Intelligence, highlights research he both collected and directs on the physiological aspect of how neurons in the brain are connected. He has shown that prediction is basically wired in to a large portion of neural circuitry. Hawkins named this approach the “memory-prediction framework.” He also makes the case for prediction as being one of the foundations of human intelligence.”
A recent report from David Loshin states, “As our world becomes more attuned to the generation, and more importantly, the use of massive amounts of data, information technology (IT) professionals are increasingly looking to new technologies to help focus on deriving value from the velocity of data streaming from a wide variety of data sources. The breadth of the internet and its connective capabilities has enabled the evolution of the internet of things (IoT), a dynamic ecosystem that facilitates the exchange of information among a cohort of devices organized to meet specific business needs. It does this through a growing, yet intricate interconnection of uniquely identifiable computing resources, using the internet’s infrastructure and employing internet protocols. Extending beyond the traditional system-to-system networks, these connected devices span the architectural palette, from traditional computing systems, to specialty embedded computer modules, down to tiny micro-sensors with mobile-networking capabilities.”
Loshin added, “In this paper, geared to the needs of the C-suite, we’ll explore the future of predictive analytics by looking at some potential use cases in which multiple data sets from different types of devices contribute to evolving models that provide value and benefits to hierarchies of vested stakeholders. We’ll also introduce the concept of the ‘insightful fog,’ in which storage models and computing demands are distributed among interconnected devices, facilitating business discoveries that influence improved operations and decisions. We’ll then summarize the key aspects of the intelligent systems that would be able to deliver on the promise of this vision.”
The full report, “How IT can blend massive connectivity with cognitive computing to enable insights,” is available for download for a fee.
Tim Beyers of The Motley Fool recently wrote, “For years, International Business Machines has been dabbling with what it calls ‘cognitive computing.’ Now the company that brought you the Watson supercomputer believes it has a chip that can think like the human brain. Called TrueNorth, the chip draws on some 5.4 billion interconnected transistors to form a vast network not unlike the neural networks found in the human brain. That’s a potentially massive breakthrough, especially for Internet-connected mobile devices that encounter new data every second. We’re likely to be years away from mass production of the TrueNorth chip. And even then, experts quoted in this article in The New York Times seem to be split on its potential impact.”
Catherine Havasi, CEO of Luminoso, recently wrote for TechCrunch, “Everyone knows that ‘water is wet,’ and ‘people want to be happy,’ and we assume everyone we meet shares this knowledge. It forms the basis of how we interact and allows us to communicate quickly, efficiently, and with deep meaning. As advanced as technology is today, its main shortcoming as it becomes a large part of daily life in society is that it does not share these assumptions. We find ourselves talking more and more to our devices — to our mobile phones and even our televisions. But when we talk to Siri, we often find that the rules that underlie her can’t comprehend exactly what we want if we stray far from simple commands. For this vision to be fulfilled, we’ll need computers to understand us as we talk to each other in a natural environment. For that, we’ll need to continue to develop the field of common-sense reasoning — without it, we’re never going to be able to have an intelligent conversation with Siri, Google Glass or our Xbox.”
In mid-July Dataversity.net, the sister site of The Semantic Web Blog, hosted a webinar on Understanding The World of Cognitive Computing. Semantic technology naturally came up during the session, which was moderated by Steve Ardire, an advisor to cognitive computing, artificial intelligence, and machine learning startups. You can find a recording of the event here.
You can find a more detailed discussion of the session at large here; below are some excerpts related to how the worlds of cognitive computing and semantic technology interact.
One of the panelists, IBM Big Data Evangelist James Kobielus, discussed his thinking around what’s missing from general discussions of cognitive computing to make it a reality. “How do we normally perceive branches of AI, and clearly the semantic web and semantic analysis related to natural language processing and so much more has been part of the discussion for a long time,” he said. When it comes to finding the sense in multi-structured – including unstructured – content that might be text, audio, images or video, “what’s absolutely essential is that as you extract the patterns you are able to tag the patterns, the data, the streams, really deepen the metadata that gets associated with that content and share that metadata downstream to all consuming applications so that they can fully interpret all that content, those objects…[in] whatever the relevant context is.”
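The pattern Kobielus describes – extract patterns from content, attach them as metadata, and let that metadata travel with the content to downstream consumers – can be sketched in miniature. The class and tags below are hypothetical, purely to illustrate the shape of the idea:

```python
# Illustrative sketch of the pattern Kobielus describes: as patterns are
# extracted from content, attach them as metadata that travels with the
# content to downstream consuming applications. Names and tags here are
# hypothetical.
from dataclasses import dataclass, field

@dataclass
class ContentObject:
    body: str
    metadata: dict = field(default_factory=dict)

def extract_and_tag(obj: ContentObject) -> ContentObject:
    """Toy 'extraction': tag the content, then pass it downstream intact."""
    words = obj.body.lower().split()
    obj.metadata["token_count"] = len(words)
    obj.metadata["mentions_genomics"] = "genomics" in words
    return obj

doc = extract_and_tag(ContentObject("Watson applies genomics to cancer care"))
print(doc.metadata)  # → {'token_count': 6, 'mentions_genomics': True}
```

A real cognitive pipeline would extract far richer patterns from text, audio, images or video, but the principle is the same: the metadata rides along with the object so every downstream consumer can interpret it in context.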