Posts Tagged ‘data mining’

Extracting Value from Big Data Requires Machine Learning

Involuntary Commitment

James Kobielus of InfoWorld recently wrote, “Machine-generated log data is the dark matter of the big data cosmos. It is generated at every layer, node, and component within distributed information technology ecosystems, including smartphones and Internet-of-things endpoints… Clearly, automation is key to finding insights within log data, especially as it all scales into big data territory. Automation can ensure that data collection, analytical processing, and rule- and event-driven responses to what the data reveals are executed as rapidly as the data flows. Key enablers for scalable log-analysis automation include machine-data integration middleware, business rules management systems, semantic analysis, stream computing platforms, and machine-learning algorithms.” Read more

The Data Behind the Internet of Things


Nancy Gohring of Computerworld recently wrote, “The market for connected devices like fitness wearables, smart watches and smart glasses, not to mention remote sensing devices that track the health of equipment, is expected to soar in the coming years. By 2020, Gartner expects, 26 billion units will make up the Internet of Things, and that excludes PCs, tablets and smartphones. With so many sensors collecting data about equipment status, environmental conditions and human activities, companies are growing rich with information. The question becomes: What to do with it all? How to process it most effectively and use it in the smartest way possible?” Read more

A Better Definition for Machine Learning


Bill Franks of the International Institute for Analytics recently opined, “In recent years, the use of the term Machine Learning has surged. What I struggle with is that many traditional data mining and statistical functions are being folded underneath the machine learning umbrella. There is no harm in this except that I don’t think that the general community understands that, in many cases, traditional algorithms are just getting a new label with a lot of hype and buzz appeal. Simply classifying algorithms in the machine learning category doesn’t mean that the algorithms have fundamentally changed in any way.” Read more

Knode Teams with Wiley to Create Researcher Portals with Semantic Mining


Dian Schaffhauser of Campus Technology recently wrote, “A startup that offers a cloud-based biomedical search and profiling platform is teaming up with a major publisher of scientific and technical texts to create customized research portals for academic organizations and scientific societies. Knode will be working with John Wiley & Sons to set up searchable research networks to help users locate research expertise. Currently, Knode’s focus is on life sciences, but the company expects to expand to other fields ‘soon,’ according to an interview with CEO David Steinberg posted on the Wiley Web site.” Read more

Cognition Network Technology: Taking On Cancer By Repositioning Knowledge In A Semantic Network

This week saw Frost & Sullivan award its 2013 Company of the Year to Definiens, a provider of image analysis and data mining solutions for life science, tissue diagnostics, and clinical digital pathology. Definiens earned the title largely through its work on tissue datafication, which leverages its Definiens Cognition Network Technology — an approach the company says mimics the human mind’s cognitive powers to reposition knowledge within a semantic network.

“What we do essentially is look at ways to be able to better diagnose cancer and develop therapies,” says Merrilyn Datta, CMO at Definiens. The company looks to extract data from tumor images, historically available as slides from biopsies, datafying the tissues involved to create digital images and then using its Cognition Network Technology to extract all the different relevant objects in that image and correlate them to patient outcomes. “That can be extremely, extremely powerful,” says Datta.

The image analysis technology was developed by Physics Nobel Laureate Gerd Binnig and is built on a set of principles that emulate the human mind’s cognitive powers: the ability to intuitively aggregate pixels into ‘objects’ and to understand the context of and relationships between those objects, rather than examining images pixel by pixel as computers normally do. These principles include context, which is established and utilized through the technology’s creation of a hierarchical network of pixel clusters representing nested structures within the image; navigation, which supports efficient movement inside the network to enable specific local processing and the addressing of specific contexts; and evolution of the network, in which the stages of segmentation and classification alternate, so that the structures represented within the network are created and continuously refined in a series of loops, with each classification pass enhanced by local context and specific expert knowledge.
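The alternation of segmentation and classification described above can be illustrated with a toy example. The sketch below is purely hypothetical — the function names, the size-based classification rule, and the tiny 2-D “image” are illustrative assumptions, not Definiens’ actual API or algorithm. It shows one pass of the loop: group above-threshold pixels into connected objects, then label each object using a simple context stand-in (its size).

```python
def segment(image, threshold):
    """Group adjacent above-threshold pixels into objects (4-connectivity)."""
    rows, cols = len(image), len(image[0])
    labels = [[None] * cols for _ in range(rows)]
    objects = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and labels[r][c] is None:
                # Flood-fill one connected object starting from (r, c).
                stack, pixels = [(r, c)], []
                labels[r][c] = len(objects)
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and labels[ny][nx] is None):
                            labels[ny][nx] = len(objects)
                            stack.append((ny, nx))
                objects.append(pixels)
    return objects

def classify(objects):
    """Label each object with a crude size heuristic (a stand-in for
    the local-context and expert-knowledge rules described in the text)."""
    return ["nucleus" if len(px) >= 4 else "debris" for px in objects]

# A 3x5 toy intensity grid: one 4-pixel blob and one isolated bright pixel.
image = [
    [0, 9, 9, 0, 0],
    [0, 9, 9, 0, 8],
    [0, 0, 0, 0, 0],
]
objects = segment(image, threshold=5)
print(classify(objects))  # prints ['nucleus', 'debris']
```

In the full approach as described, the classification results would feed back into further segmentation passes, refining the object hierarchy over a series of loops; this sketch shows only a single segmentation-then-classification step.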

Read more

Semantic Tech To The Rescue For Medicare Administrative Contractor

When a Medicare administrative contractor found itself under the gun to improve its performance in order to retain its business with the Centers for Medicare & Medicaid Services, it was able to leverage semantic technology to realize its goals.

The audience at last week’s Semantic Technology & Business Conference heard about this case study from management consultancy Blue Slate Solutions, which specializes in improving and transforming operations through business process and analytic transformation and which spearheaded the semantic project for the contractor. “This client was struggling in the medical review process, to decide if claims should be paid,” recounted David S. Read, Blue Slate CTO.

Read more

Meaningful Use and the Semantic Web


Jennifer Bresnick of EHR Intelligence recently wrote, “No one can reasonably deny that EHR adoption and meaningful use participation have been a struggle.  From the patchwork of health IT systems that need to be wrangled and replaced to the payment reforms piggybacking on a revolution in how providers and lawmakers view quality care, the healthcare landscape is changing at a lightning pace, and it’s not always easy to keep up.  Dan Riskin, MD, CEO and Co-Founder of Health Fidelity, believes there’s a lot of work to be done in order to improve the provider experience and the patient’s outcomes, both of which rely on making better use of technology and the data it produces.” Read more

The Evolution of Voice and Text Analytics

Anna Papachristos of 1to1media recently wrote, “If consumers aren’t talking, they’re typing or texting. But just as quickly as they speak or text their words, companies capture that data to gather insight from every channel at any time. But if organizations truly want to make the most of today’s socially connected landscape, they must hone their analytical capabilities, thereby enabling themselves to assess and respond accordingly to the mounting sentiment. Speech analytics offers many unique insights into real-time brand interactions, but organizations must integrate this voice and text data to bring actionable results to the enterprise.” Read more

Determining Context on Mobile Devices

A new article on EDN Network by Debbie Meduna, Dev Rajnarayan, and Jim Steele of Sensor Platforms, Inc. discusses how to make mobile platforms context-aware. The team writes, “Using the sensors available on these platforms, one can infer the context of the device, its user, and its environment.  This can enable new useful applications that make smart devices smarter.  It is common to utilize machine learning to determine the underlying meaning in the large amount of sensor data being generated all around us. However, traditional machine learning methods (such as neural networks, polynomial regression, and support vector machines) do not directly lend themselves to application in a power-conscious mobile environment.  The necessary techniques for ensuring effective implementation of sensor context are discussed.” Read more

In Defense of PRISM’s Big Data Strategy

Doug Henschen of Information Week recently shared his thoughts on the less-than-nefarious intent of the NSA’s PRISM Big Data tools. He writes, “It’s understandable that democracy-loving citizens everywhere are outraged by the idea that the U.S. Government has back-door access to digital details surrounding email messages, phone conversations, video chats, social networks and more on the servers of mainstream service providers including Microsoft, Google, Yahoo, Facebook, YouTube, Skype and Apple. But the more you know about the technologies being used by the National Security Agency (NSA), the agency behind the controversial Prism program revealed last week by whistleblower Edward Snowden, the less likely you are to view the project as a ham-fisted effort that’s ‘trading a cherished American value for an unproven theory,’ as one opinion piece contrasted personal privacy with big data analysis.” Read more
