IANS Live recently wrote, “[Twitter] has finally given access to its vast database to a selected pool of researchers to study tweets and find answers to a variety of issues. As part of its ambitious data grant programme, Twitter is allowing academic researchers across various fields to ‘go back and study things’ over almost a decade of historical data, the Washington Post reported. While Harvard Medical School and Boston Children’s Hospital are looking at tweets about food-poisoning cases to find answers to the spread of food-borne illnesses, researchers from the University of California at San Diego are studying whether happy people are likely to post happy images on Twitter.” Read more
In a recent article, Mike Kavis explained, “In a previous post I discussed how the Internet of Things (IoT) will radically change your big data strategy. Massive amounts of data from sensors, wearable devices, and other technologies are creating new and exciting opportunities to make better business decisions in real time. However, harvesting all of this data is only half of the equation. Making the data actionable is where the real value lies. Traditionally, companies have mined data to look for trends and opportunities. In the world of IoT, searching for nuggets of information in petabyte-sized databases is the equivalent of trying to find a needle in a haystack. To help extract value quickly and effectively, companies are turning to machine learning technologies alongside big data technologies. However, implementing machine learning successfully can be extremely time-consuming and complex. This has given birth to a new breed of vendors who deliver machine learning as a service, allowing customers to quickly implement technologies to turn massive IoT databases into actionable, revenue-generating gold mines.”
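The needle-in-a-haystack idea above can be sketched with a toy anomaly detector over sensor readings. Everything here is illustrative (the z-score rule, the threshold, and the sample temperatures are made up, not any vendor's actual service):

```python
# Toy illustration: flag anomalous IoT sensor readings with a z-score rule.
# All names, thresholds, and data are hypothetical.
from statistics import mean, stdev

def find_anomalies(readings, threshold=2.0):
    """Return (index, value) pairs deviating more than `threshold`
    standard deviations from the mean of the readings."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [(i, x) for i, x in enumerate(readings)
            if abs(x - mu) / sigma > threshold]

# A run of normal temperature readings with one faulty spike:
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 95.0, 21.3, 21.1]
print(find_anomalies(temps))  # [(5, 95.0)]
```

A real deployment would use streaming statistics and learned models rather than a global z-score, but the shape of the problem is the same: most points are noise, and the value is in the few that stand out.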
Chloe Green of Information Age recently wrote, “Handling immense data sets requires a combination of scientific and technological skills to determine how data is stored, searched and accessed. In science, the importance of data scientists in ensuring that data is handled correctly from the outset is not underestimated; other industries can learn from the scientific approach. Text-mining tools and the use of relevant taxonomies are essential. If we think about big data as a huge number of data points in some multi-dimensional space, the problem is one of analysis, i.e. efficiently finding very similar or very dissimilar points which cannot be compared. In life sciences, taxonomies assign data points a class, thus comparison of two points is as easy as looking up other data points in the same class.” Read more
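The taxonomy point in the quote can be illustrated with a toy lookup: once each data point carries a class label, finding similar points reduces to an index lookup rather than a pairwise comparison. The gene names and class labels below are illustrative examples only:

```python
# Toy sketch of taxonomy-based similarity: points sharing a class are
# "similar", so similarity search becomes a dictionary lookup.
from collections import defaultdict

def build_index(points):
    """Map taxonomy class -> list of point ids."""
    index = defaultdict(list)
    for point_id, cls in points:
        index[cls].append(point_id)
    return index

def similar_to(point_id, cls, index):
    """Other points assigned the same class as `point_id`."""
    return [p for p in index[cls] if p != point_id]

# Illustrative life-science data: genes tagged with a functional class.
genes = [("BRCA1", "dna_repair"), ("TP53", "tumor_suppressor"),
         ("RAD51", "dna_repair"), ("PTEN", "tumor_suppressor")]
idx = build_index(genes)
print(similar_to("BRCA1", "dna_repair", idx))  # ['RAD51']
```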
James Kobielus of InfoWorld recently wrote, “Machine-generated log data is the dark matter of the big data cosmos. It is generated at every layer, node, and component within distributed information technology ecosystems, including smartphones and Internet-of-things endpoints… Clearly, automation is key to finding insights within log data, especially as it all scales into big data territory. Automation can ensure that data collection, analytical processing, and rule- and event-driven responses to what the data reveals are executed as rapidly as the data flows. Key enablers for scalable log-analysis automation include machine-data integration middleware, business rules management systems, semantic analysis, stream computing platforms, and machine-learning algorithms.” Read more
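A minimal sketch of the rule- and event-driven log analysis Kobielus describes, assuming a hypothetical log format and a made-up alert rule:

```python
# Toy rule-driven log analysis: parse machine-generated lines, then
# fire a rule when errors exceed a threshold. The log format, rule,
# and threshold are all hypothetical.
import re

LINE_RE = re.compile(r"^(?P<ts>\S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def error_rate_rule(lines, max_errors=2):
    """Return an alert string if more than `max_errors` ERROR lines
    appear in `lines`, else None."""
    errors = [m.group("msg") for line in lines
              if (m := LINE_RE.match(line)) and m.group("level") == "ERROR"]
    if len(errors) > max_errors:
        return f"ALERT: {len(errors)} errors, e.g. {errors[0]!r}"
    return None

log = [
    "10:00:01 INFO service started",
    "10:00:02 ERROR disk full",
    "10:00:03 ERROR disk full",
    "10:00:04 ERROR disk full",
]
print(error_rate_rule(log))
```

At production scale the same pattern runs continuously over a stream rather than a list, which is exactly where the stream computing platforms mentioned in the quote come in.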
Nancy Gohring of Computerworld recently wrote, “The market for connected devices like fitness wearables, smart watches and smart glasses, not to mention remote sensing devices that track the health of equipment, is expected to soar in the coming years. By 2020, Gartner expects, 26 billion units will make up the Internet of Things, and that excludes PCs, tablets and smartphones. With so many sensors collecting data about equipment status, environmental conditions and human activities, companies are growing rich with information. The question becomes: What to do with it all? How to process it most effectively and use it in the smartest way possible?” Read more
Bill Franks of the International Institute for Analytics recently opined, “In recent years, the use of the term Machine Learning has surged. What I struggle with is that many traditional data mining and statistical functions are being folded underneath the machine learning umbrella. There is no harm in this except that I don’t think that the general community understands that, in many cases, traditional algorithms are just getting a new label with a lot of hype and buzz appeal. Simply classifying algorithms in the machine learning category doesn’t mean that the algorithms have fundamentally changed in any way.” Read more
Dian Schaffhauser of Campus Technology recently wrote, “A startup that offers a cloud-based biomedical search and profiling platform is teaming up with a major publisher of scientific and technical texts to create customized research portals for academic organizations and scientific societies. Knode will be working with John Wiley & Sons to set up searchable research networks to help users locate research expertise. Currently, Knode’s focus is on life sciences, but the company expects to expand to other fields ‘soon,’ according to an interview with CEO David Steinberg posted on the Wiley Web site.” Read more
This week saw Frost & Sullivan award its 2013 Company of the Year to Definiens, a provider of image analysis and data mining solutions for life science, tissue diagnostics, and clinical digital pathology. Definiens earned the title largely for its work on tissue datafication, built on its Definiens Cognition Network Technology, which the company says mimics the human mind’s cognitive powers by representing knowledge within a semantic network.
“What we do essentially is look at ways to be able to better diagnose cancer and develop therapies,” says Merrilyn Datta, CMO at Definiens. The company looks to extract data from tumor images, historically available as slides from biopsies, datafying the tissues involved to create digital images and then using its Cognition Network Technology to extract all the different relevant objects in that image and correlate them to patient outcomes. “That can be extremely, extremely powerful,” says Datta.
The image analysis technology was developed by Physics Nobel Laureate Gerd Binnig and is built on a set of principles aimed at emulating the human mind’s cognitive powers: the ability to intuitively aggregate pixels into ‘objects’ and to understand the context and relationships between those objects, rather than the computer’s usual approach of examining images pixel by pixel. The first principle is context, which is established and utilized by building a hierarchical network of pixel clusters representing nested structures within the image. The second is navigation, which supports efficient movement inside the network so that specific local processing can be applied in specific contexts. The third is evolution of the network: the individual stages of segmentation and classification are alternated, and the structures represented within the network are created and continually improved in a series of loops, with each classification enhanced by local context and specific expert knowledge.
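As a rough illustration of the “aggregate pixels into objects” idea (a toy exercise, nothing like Definiens’ actual technology), a simple connected-component pass groups foreground pixels of a binary image into objects:

```python
# Illustrative only: group foreground (1) pixels of a binary image
# into 4-connected "objects" via an iterative flood fill.
def label_objects(image):
    """Return a list of objects, each a set of (row, col) coordinates."""
    rows, cols = len(image), len(image[0])
    seen, objects = set(), []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] == 1 and (r, c) not in seen:
                stack, obj = [(r, c)], set()
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen:
                        continue
                    seen.add((y, x))
                    obj.add((y, x))
                    # Visit the four orthogonal neighbours.
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols and image[ny][nx] == 1:
                            stack.append((ny, nx))
                objects.append(obj)
    return objects

img = [[1, 1, 0, 0],
       [1, 0, 0, 1],
       [0, 0, 1, 1]]
print(len(label_objects(img)))  # 2 separate objects
```

Real object-based image analysis then builds a hierarchy over such objects and reasons about their context, which is the step the flat labeling above deliberately leaves out.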
When a Medicare administrative contractor found itself under the gun to improve its performance in order to retain its business with the Centers for Medicare & Medicaid Services, it was able to leverage semantic technology to realize its goals.
The audience at last week’s Semantic Technology & Business Conference heard about this case study from management consultancy Blue Slate Solutions, which specializes in improving and transforming operations through business process and analytic transformation and which spearheaded the semantic project for the contractor. “This client was struggling in the medical review process, to decide if claims should be paid,” recounted David S. Read, Blue Slate CTO.
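The semantic approach can be sketched in miniature: facts about a claim stored as subject-predicate-object triples, queried by a simple rule. The predicates, codes, and the rule itself below are hypothetical, not the contractor’s actual system:

```python
# Toy semantic-technology sketch: claim facts as (subject, predicate,
# object) triples plus one review rule. All names are hypothetical.
triples = {
    ("claim:42", "hasProcedure", "proc:MRI"),
    ("claim:42", "billedAmount", "1200"),
    ("proc:MRI", "requiresPriorAuth", "true"),
    ("claim:42", "hasPriorAuth", "false"),
}

def query(s=None, p=None, o=None):
    """Match triples against an (s, p, o) pattern; None is a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

def needs_review(claim):
    """Rule: flag a claim whose procedure requires prior authorization
    that the claim does not have."""
    for _, _, proc in query(claim, "hasProcedure"):
        if query(proc, "requiresPriorAuth", "true") and \
           query(claim, "hasPriorAuth", "false"):
            return True
    return False

print(needs_review("claim:42"))  # True
```

A production system would use an RDF store and a standard query language such as SPARQL, but the payoff is the same: review logic becomes declarative rules over shared vocabulary rather than code buried in a claims application.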
Jennifer Bresnick of EHR Intelligence recently wrote, “No one can reasonably deny that EHR adoption and meaningful use participation have been a struggle. From the patchwork of health IT systems that need to be wrangled and replaced to the payment reforms piggybacking on a revolution in how providers and lawmakers view quality care, the healthcare landscape is changing at a lightning pace, and it’s not always easy to keep up. Dan Riskin, MD, CEO and Co-Founder of Health Fidelity, believes there’s a lot of work to be done in order to improve the provider experience and the patient’s outcomes, both of which rely on making better use of technology and the data it produces.” Read more