Posts Tagged ‘data mining’
Dian Schaffhauser of Campus Technology recently wrote, “A startup that offers a cloud-based biomedical search and profiling platform is teaming up with a major publisher of scientific and technical texts to create customized research portals for academic organizations and scientific societies. Knode will be working with John Wiley & Sons to set up searchable research networks to help users locate research expertise. Currently, Knode’s focus is on life sciences, but the company expects to expand to other fields ‘soon,’ according to an interview with CEO David Steinberg posted on the Wiley Web site.” Read more
This week saw Frost & Sullivan name Definiens, a provider of image analysis and data mining solutions for life sciences, tissue diagnostics, and clinical digital pathology, its 2013 Company of the Year. Definiens earned the title largely through its work on tissue datafication, which leverages the company’s Definiens Cognition Network Technology; according to the company, the technology mimics the human mind’s cognitive powers to reposition knowledge within a semantic network.
“What we do essentially is look at ways to be able to better diagnose cancer and develop therapies,” says Merrilyn Datta, CMO at Definiens. The company extracts data from tumor images, historically available only as slides from biopsies: it datafies the tissue to create digital images, then uses its Cognition Network Technology to extract all the relevant objects in each image and correlate them with patient outcomes. “That can be extremely, extremely powerful,” says Datta.
The image analysis technology was developed by Physics Nobel Laureate Gerd Binnig, and embodies a set of principles aimed at emulating the human mind’s cognitive powers: the ability to intuitively aggregate pixels into ‘objects’ and to understand the context and relationships between those objects, rather than examining an image pixel by pixel as computers normally do. The first principle is context, which the technology establishes by building a hierarchical network of pixel clusters representing nested structures within the image. The second is navigation, which supports efficient movement within that network so that processing can be localized and specific contexts addressed. The third is evolution of the network: the individual stages of segmentation and classification alternate, and the structures represented within the network are created and continuously improved in a series of loops, with each classification enhanced by local context and specific expert knowledge.
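To make that alternating segment-and-classify loop concrete, here is a minimal, hypothetical Python sketch of object-based image analysis in the same spirit; the thresholds, class names, and use of SciPy’s connected-component labeling are assumptions for illustration, not Definiens’ actual implementation:

```python
# A minimal, hypothetical sketch of object-based image analysis in the
# spirit described above (not Definiens' actual implementation): pixels
# are aggregated into objects, objects are classified, and the two steps
# alternate in a loop. Thresholds and class names are illustrative.
import numpy as np
from scipy import ndimage

def segment(image, threshold):
    """Aggregate pixels into connected 'objects' above an intensity threshold."""
    mask = image > threshold
    labels, n = ndimage.label(mask)          # each object gets an integer id
    return labels, n

def classify(image, labels, n):
    """Assign a crude class to each object from its mean intensity."""
    means = ndimage.mean(image, labels, index=range(1, n + 1))
    return {obj: ("nucleus" if m > 0.7 else "stroma")
            for obj, m in zip(range(1, n + 1), means)}

image = np.random.rand(64, 64)               # stand-in for a tissue image
for threshold in (0.5, 0.6, 0.7):            # crude stand-in for the
    labels, n = segment(image, threshold)    # segment/classify refinement
    classes = classify(image, labels, n)     # loops described above
```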
When a Medicare administrative contractor found itself under the gun to improve its performance in order to keep its business with the Centers for Medicare & Medicaid Services, it turned to semantic technology to meet its goals.
The audience at last week’s Semantic Technology & Business Conference heard the case study from management consultancy Blue Slate Solutions, which spearheaded the semantic project for the contractor and specializes in transforming operations through business process and analytics improvements. “This client was struggling in the medical review process, to decide if claims should be paid,” recounted David S. Read, Blue Slate CTO.
Jennifer Bresnick of EHR Intelligence recently wrote, “No one can reasonably deny that EHR adoption and meaningful use participation have been a struggle. From the patchwork of health IT systems that need to be wrangled and replaced to the payment reforms piggybacking on a revolution in how providers and lawmakers view quality care, the healthcare landscape is changing at a lightning pace, and it’s not always easy to keep up. Dan Riskin, MD, CEO and Co-Founder of Health Fidelity, believes there’s a lot of work to be done in order to improve the provider experience and the patient’s outcomes, both of which rely on making better use of technology and the data it produces.” Read more
Anna Papachristos of 1to1media recently wrote, “If consumers aren’t talking, they’re typing or texting. But just as quickly as they speak or text their words, companies capture that data to gather insight from every channel at any time. But if organizations truly want to make the most of today’s socially connected landscape, they must hone their analytical capabilities, thereby enabling themselves to assess and respond accordingly to the mounting sentiment. Speech analytics offers many unique insights into real-time brand interactions, but organizations must integrate this voice and text data to bring actionable results to the enterprise.” Read more
A new article on EDN Network by Debbie Meduna, Dev Rajnarayan, and Jim Steele of Sensor Platforms, Inc. discusses how to make mobile platforms context-aware. The team writes, “Using the sensors available on these platforms, one can infer the context of the device, its user, and its environment. This can enable new useful applications that make smart devices smarter. It is common to utilize machine learning to determine the underlying meaning in the large amount of sensor data being generated all around us. However, traditional machine learning methods (such as neural networks, polynomial regression, and support vector machines) do not directly lend themselves to application in a power-conscious mobile environment. The necessary techniques for ensuring effective implementation of sensor context are discussed.” Read more
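As a rough illustration of the power-conscious approach the authors describe, the sketch below infers a device’s context from cheap accelerometer statistics rather than a heavyweight model; the thresholds and class labels are invented for this example and are not Sensor Platforms’ algorithm:

```python
# A hypothetical, minimal illustration of low-power context inference from
# accelerometer data: simple threshold features stand in for heavier models
# such as SVMs or neural networks. All thresholds are invented.
import math

def context_from_accel(samples):
    """Classify device context from a window of accelerometer magnitudes."""
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    if var < 0.01:
        return "stationary"       # almost no motion energy
    elif var < 0.5:
        return "walking"          # moderate, periodic motion
    return "running"              # high motion energy

window = [9.8 + 0.3 * math.sin(i / 3) for i in range(50)]  # synthetic data
print(context_from_accel(window))                          # -> "walking"
```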
Doug Henschen of Information Week recently shared his thoughts on the less-than-nefarious intent of the NSA’s PRISM Big Data tools. He writes, “It’s understandable that democracy-loving citizens everywhere are outraged by the idea that the U.S. Government has back-door access to digital details surrounding email messages, phone conversations, video chats, social networks and more on the servers of mainstream service providers including Microsoft, Google, Yahoo, Facebook, YouTube, Skype and Apple. But the more you know about the technologies being used by the National Security Agency (NSA), the agency behind the controversial Prism program revealed last week by whistleblower Edward Snowden, the less likely you are to view the project as a ham-fisted effort that’s ‘trading a cherished American value for an unproven theory,’ as one opinion piece contrasted personal privacy with big data analysis.” Read more
Sean Gallagher of Ars Technica writes, “Most of us are okay with what Google does with its vast supply of ‘big data,’ because we largely benefit from it—though Google does manage to make a good deal of money off of us in the process. But if I were to backspace over Google’s name and replace it with ‘National Security Agency,’ that would leave a bit of a different taste in many people’s mouths. Yet the NSA’s PRISM program and the capture of phone carriers’ call metadata are essentially about the same sort of business: taking massive volumes of data and finding relationships within it without having to manually sort through it, and surfacing ‘exceptions’ that analysts are specifically looking for. The main difference is that with the NSA, finding these exceptions can result in Foreign Intelligence Surveillance Act (FISA) warrants to dig deeper—and FBI agents knocking at your door. So what is it, exactly, that the NSA has in its pile of ‘big data,’ and what can they do with it?” Read more
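To make the idea concrete, here is a toy Python sketch of that kind of relationship-finding: given synthetic call metadata, it surfaces every number within two hops of a watchlisted number as an ‘exception.’ The records, watchlist, and two-hop rule are all invented for illustration and say nothing about the NSA’s actual systems:

```python
# A toy sketch of relationship-finding over call metadata: build a call
# graph and surface numbers within two hops of a watchlisted number.
from collections import defaultdict

calls = [("555-0101", "555-0202"), ("555-0202", "555-0303"),
         ("555-0404", "555-0505")]            # synthetic call records
watchlist = {"555-0101"}

graph = defaultdict(set)
for a, b in calls:                            # undirected call graph
    graph[a].add(b)
    graph[b].add(a)

exceptions = set()
for seed in watchlist:
    for hop1 in graph[seed]:                  # direct contacts
        exceptions.add(hop1)
        exceptions.update(graph[hop1])        # contacts of contacts
exceptions -= watchlist
print(sorted(exceptions))                     # ['555-0202', '555-0303']
```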
Virginia Backaitis of CMS Wire recently discussed the rise of Big Content and how semantic technologies are taking a role in mining value from said content. She writes, “The first time I heard the term Big Content, I thought ‘Oh brother.’ At the time, the people who used the term ‘Big Content’ either talked about it in terms of content that went beyond their control (i.e. viral) and said that this was a highly desirable thing OR they argued that ‘content’ was growing at a rapid clip, and that Big Data technologies, such as Hadoop, were incapable of dealing with voluminous unstructured data — that you needed something called ‘Big Content’ to do that.” Read more
Bob Zurek of Smart Data Collective recently shared seven trends that are impacting business. One such trend is the addition of data mining and analytic functions: “Industry leaders in the big data space understand the requirements to expand the underlying analytics and statistical capabilities in their platform. This goes beyond typical analytic functions into the world of very sophisticated data mining functionality. Teradata Aster Data includes a wide variety of analytic capabilities including support for statistical, text analytics, graph, sentiment analysis and in-database PMML execution through the support of Zementis. Other companies including IBM Netezza have embedded support for the popular R statistical language as well as Matrix engine, a parallelized linear algebra package. Over time, we will see a significant expansion of these capabilities across a broad range of big data solutions.” Read more
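As a flavor of one item on that list, the toy Python function below scores text sentiment by counting positive and negative words; production platforms embed far more sophisticated versions of this in-database, and nothing here reflects any vendor’s implementation:

```python
# A toy illustration of the kind of sentiment-analysis function big data
# platforms now embed alongside SQL. Word lists are invented for the example.
POSITIVE = {"great", "good", "love", "excellent"}
NEGATIVE = {"bad", "poor", "hate", "terrible"}

def sentiment(text):
    """Score text by counting positive vs. negative words."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

rows = ["Great product, love it", "Terrible support, bad experience"]
print([sentiment(r) for r in rows])   # ['positive', 'negative']
```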