Jasmine Pennic of HIT Consultant reports, “Healthline, provider of intelligent health information and technology solutions, today launched its HealthData Engine to harness the power of structured and unstructured data to improve outcomes and reduce costs. The new big data analytics platform leverages the company’s market-leading HealthTaxonomy, advanced clinical natural language processing (NLP) technologies and semantic analysis to turn patient data into actionable insights.” Read more
Health Care / Life Sciences
Semantic Interoperability of Electronic Healthcare Info On The Agenda At U.S. Veterans Health Administration
The Yosemite Project, unveiled at this August’s Semantic Technology & Business Conference during the second annual RDF as a Universal Healthcare Exchange Language panel, lays out a roadmap for leveraging RDF in support of making all structured healthcare information semantically interoperable. (The Semantic Web Blog’s sister publication, Dataversity.net, has an article on its site explaining the details of that roadmap.)
The Yosemite Project grew out of the Yosemite Manifesto that was announced at the 2013 SemTechBiz conference (see our story here). The goals of the Manifesto have now been mapped out into the Project’s guidelines to follow on the journey to semantic interoperability by David Booth, senior software architect at Hawaii Resource Group (who led the RDF Healthcare panels at both the 2013 and 2014 conferences). The approach taken by the Yosemite Project matches that of others in the healthcare sector who want to see semantic interoperability of electronic healthcare information.
Among them are Booth’s fellow panelists at this year’s event, including Rafael Richards, a physician informaticist at the U.S. Veterans Health Administration, which counts 1,200 care sites in its portfolio. Richards comments on that alignment as it relates to the Linked Vitals project he is leading, which integrates the VA’s VistA electronic health records system with data types conforming to the Fast Healthcare Interoperability Resources (FHIR) standard for data exchange, and with information types supporting the Logical Observation Identifiers Names and Codes (LOINC) database, which facilitates the exchange and pooling of results for clinical care, outcomes management, and research.
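To make the FHIR-plus-LOINC combination concrete, here is a minimal sketch of what a FHIR Observation resource carrying a LOINC-coded vital sign looks like. The helper function name and the specific values are illustrative, not taken from the Linked Vitals project; LOINC 8867-4 is the standard code for heart rate.

```python
import json

def make_vitals_observation(loinc_code, display, value, unit):
    """Build a minimal FHIR Observation resource carrying a LOINC-coded vital sign."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",  # the LOINC code system URI
                "code": loinc_code,
                "display": display,
            }]
        },
        "valueQuantity": {"value": value, "unit": unit},
    }

# LOINC 8867-4 identifies heart rate; the reading itself is made up.
obs = make_vitals_observation("8867-4", "Heart rate", 72, "beats/minute")
print(json.dumps(obs, indent=2))
```

Because both the resource shape (FHIR) and the code (LOINC) are standardized, any conforming system can interpret the record without bilateral agreements, which is the interoperability point the panelists are making.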
A recent press release states, “Transforming our cities into the Smart Cities of the future will encompass incorporating technologies and key digital developments all linked by machine-to-machine (M2M) solutions and real-time data analytics which sit under the umbrella term of the Internet of Things. Smart cities however must be underpinned by the appropriate ICT infrastructure based on fibre optic and high-speed wireless technologies, which is well underway in many developed cities around the world. This infrastructure allows for the development of smart communities; supporting connected homes; intelligent transport systems; e-health; e-government and e-education; smart grids and smart energy solutions – just to name a few of the exciting solutions smart cities will incorporate. Many of the technological advancements emerging around the world today can, and will be, applied to smart cities. Artificial Intelligence; Electric Vehicles; Autonomous Vehicles; Mobile applications; Drones; Wearable and Smart devices and so on are just some of the key developments to watch.” Read more
Caleb Garling of the MIT Technology Review reports, “Machines are doing more and more of the work typically completed by humans, and detecting diseases may be next: a new company called Enlitic takes aim at the examination room by employing computers to make diagnoses based on images. Enlitic cofounder and CEO Jeremy Howard—formerly the president and lead scientist at data-crunching startup Kaggle—says the idea is to teach computers how to recognize various injuries, diseases, and disorders by showing them hundreds of x-rays, MRIs, CT scans, and other films. Howard believes that with enough experience, a computer can start to spot trouble and flag the images immediately for a physician to investigate. That could save physicians from having to comb through stacks of films.” Read more
Dominik Schweiger, Zlatko Trajanoski and Stephan Pabinger recently wrote, “The Semantic Web has established itself as a framework for using and sharing data across applications and database boundaries. Here, we present a web-based platform for querying biological Semantic Web databases in a graphical way. Results: SPARQLGraph offers an intuitive drag & drop query builder, which converts the visual graph into a query and executes it on a public endpoint. The tool integrates several publicly available Semantic Web databases, including the databases of the just recently released EBI RDF platform. Furthermore, it provides several predefined template queries for answering biological questions. Users can easily create and save new query graphs, which can also be shared with other researchers.” Read more
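The core idea behind a builder like SPARQLGraph is that a visual query graph is just a set of triple patterns that can be serialized into SPARQL text. The sketch below is not SPARQLGraph’s actual implementation; the function name, variable names, and the UniProt-style prefixes (`up:`, `taxon:`) are illustrative.

```python
def graph_to_sparql(edges, select_vars):
    """Serialize a visual query graph (a list of triple patterns)
    into the text of a SPARQL SELECT query."""
    patterns = " .\n  ".join(f"{s} {p} {o}" for s, p, o in edges)
    return f"SELECT {' '.join(select_vars)} WHERE {{\n  {patterns} .\n}}"

# Two nodes joined by dragged edges: human proteins, UniProt-style.
edges = [
    ("?protein", "rdf:type", "up:Protein"),
    ("?protein", "up:organism", "taxon:9606"),
]
query = graph_to_sparql(edges, ["?protein"])
print(query)
```

The generated string could then be posted to a public SPARQL endpoint such as those of the EBI RDF platform, which is what the tool does after the visual-to-query conversion step.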
WASHINGTON, D.C. – SYSTAP, LLC today announced that Syapse, the leading provider of software for enabling precision medicine, has selected Bigdata® as its backend semantic database. Syapse, which launched the Precision Medicine Data Platform in 2011, will use the Bigdata® database as a key element of its semantic platform. The Syapse Precision Medicine Data Platform integrates medical data, omics data, and biomedical knowledge for use in the clinic. Syapse software is delivered as a cloud-based SaaS, enabling access from anywhere with an internet connection, regular software updates and new features, and online collaboration and delivery of results, with minimal IT resources required. Syapse applications comply with HIPAA/HITECH, and data in the Syapse platform are protected according to industry standards.
Syapse’s Precision Medicine Data Platform features a semantic layer that provides powerful data modeling, query, and integration functionality. According to Syapse CTO and Co-Founder, Tony Loeser, Ph.D., “We have adopted SYSTAP’s graph database, Bigdata®, as our RDF store. Bigdata’s exceptional scalability, query performance, and high-availability architecture make it an enterprise-class foundation for our semantic technology stack.”
Peter Murray-Rust of OpenSource.com recently wrote, “Open is about sharing and collaboration. It’s the idea that ‘we’ is more powerful, more rewarding and fulfilling than ‘I’. I can’t promise jobs, but I do know that open is becoming very big. Governments and funders are pushing the open agenda, even though academics are generally uninterested or seriously self-interested. Some governments and some companies recognize the value of teams; academia and academics generally don’t. The false values of impact factor and the false values of academic publishing mean that open access is a poor reflection of open, or what you may recognize as the open source way.” Read more
A recent article in Medical Xpress reports, “Machine learning has been improved by Dr Thomas Wilhelm of the Institute of Food Research, which is strategically funded by the Biotechnology and Biological Sciences Research Council. Instead of developing one model from the training data, his technique involves developing hundreds of diverse models, and applying these to independent, unseen data, and seeing which models work best in their ability to predict outcomes. This avoids ‘overfitting’ of a model to a specific training data set. The new technique can be applied to many different situations, but Dr Wilhelm applied it to epigenetic data on cervical cancer.” Read more
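The technique described, generating many diverse candidate models and then ranking them on independent data rather than on the data they were trained on, can be sketched in a few lines. Everything below (the toy threshold models, the data, the numbers) is an illustrative assumption, not Dr Wilhelm’s actual method or data.

```python
import random

def train_candidates(n_models=200):
    """Generate many diverse candidate models: here, random linear
    thresholds (a, b) that predict True when a*x + b > 0."""
    random.seed(0)
    return [(random.uniform(-2, 2), random.uniform(-1, 1)) for _ in range(n_models)]

def accuracy(model, data):
    a, b = model
    return sum((a * x + b > 0) == y for x, y in data) / len(data)

# Toy problem: the true label is 1 exactly when the feature is positive.
train = [(x / 10, x > 0) for x in range(-10, 11)]
held_out = [(x / 10 + 0.05, x >= 0) for x in range(-10, 11)]

candidates = train_candidates()
# The key step: select the winner on independent, unseen data,
# so a model that merely memorized the training set cannot win.
best = max(candidates, key=lambda m: accuracy(m, held_out))
print(round(accuracy(best, held_out), 2))
```

Selecting on held-out data is what protects against the ‘overfitting’ the article mentions: a model tuned too closely to one training set will score poorly on data it has never seen.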
At the upcoming Semantic Technology & Business Conference in San Jose, Dr. Terry Roach, principal of CAPSICUM Business Architects, and Dr. Dean Allemang, principal consultant at Working Ontologist, will host a session on A Semantic Model for an Electronic Health Record (EHR). It will focus on Australia’s electronic Health as a Service (eHaaS) national platform for personal electronic health records, provided by the CAPSICUM semantic framework for strategically aligned business architectures.
Roach and Allemang participated in an email interview with The Semantic Web Blog to preview the topic:
The Semantic Web Blog: Can you put the work you are doing on the semantic EHR model in context: How does what Australia is doing with its semantic framework compare with how other countries are approaching EHRs and healthcare information exchange?
Roach and Allemang: The eHaaS project we have been working on is an initiative of Telstra, a large, traditional telecommunications provider in Australia. For the past two years its Telstra Health division, which is focused on health-related software investments, has embarked on a set of strategic investments in the electronic health space. Since early 2013 it has acquired and/or established strategic partnerships with a number of local and international healthcare software providers, ranging from hospital information systems to mobile health applications, remote patient monitoring systems, personal health records, integration platforms, and health analytics suites.
At the core of these investments is a strategy to develop a platform that captures and maintains diverse health-related interactions in a consolidated lifetime health record for individuals. The eHaaS platform facilitates interoperability and integration of several health service components over a common secure authentication service, data model, infrastructure, and platform. Starting from a base of stand-alone, vertical applications that manage fragmented information across the health spectrum, the eHaaS platform will establish an integrated, continuously improving, shared healthcare data platform that will aggregate information from a number of vertical applications, as well as an external gateway for standards-based eHealth messages, to present a unified picture of an individual’s health care profile and history.
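The aggregation step Roach and Allemang describe, folding event streams from separate vertical applications into one lifetime record per individual, can be sketched as below. The feed names, field names, and events are hypothetical; real eHaaS integration would run over standards-based eHealth messages, not Python dicts.

```python
from collections import defaultdict

def aggregate(feeds):
    """Merge event streams from separate vertical applications into one
    chronologically ordered record per patient."""
    records = defaultdict(list)
    for feed in feeds:
        for event in feed:
            records[event["patient_id"]].append(event)
    for events in records.values():
        events.sort(key=lambda e: e["date"])  # ISO dates sort lexicographically
    return dict(records)

# Two stand-alone vertical applications, each holding a fragment of the story.
hospital = [{"patient_id": "p1", "date": "2014-03-02", "event": "admission"}]
pharmacy = [{"patient_id": "p1", "date": "2014-03-05", "event": "prescription filled"}]

unified = aggregate([hospital, pharmacy])
print([e["event"] for e in unified["p1"]])  # → ['admission', 'prescription filled']
```

The payoff is the “unified picture” the interview describes: each source system keeps only its fragment, while the shared platform presents the consolidated history.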
Megan Williams of Business Solutions recently wrote, “It’s a simple fact of the current state of healthcare that most providers are not using the data they gather via EHRs as best they could. The case of missed technological opportunities is nothing new to healthcare, but in the case of search engines, not employing them to mine existing information could be costing your clients, and their patients, unnecessary tests, wasted time, and missed information that would have been helpful in achieving desired patient outcomes.”
Williams goes on, “A study published by the Journal of the American College of Radiology titled ‘Optimizing Emergency Department Imaging Utilization Through Advanced Health Record Technology’ addresses the application of search functions to clinical environments, specifically the emergency department (ED). This department presents special challenges because of the nature of the work involving evaluating complex patients while under time pressure. These challenges are made worse by the incomplete medical history that typically comes with patients who enter the department.”
She adds, “The study covers QPID, which is a programmable health record intelligence system that adds semantic search and knowledge management layers to an EHR system. QPID works as an extension of its data repository, facilitating extraction from it. While most EHR systems do include databases that handle rudimentary data retrieval, QPID allows users to pull data by topic-related ‘packages’ of data and concepts in saved, operable queries that can be used on both structured and unstructured data sources.”
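The notion of a saved, reusable topic ‘package’ that runs against both coded fields and free-text notes can be illustrated with a toy sketch. This is not QPID’s actual API; the package name, the ICD-10 codes (I20.9 for angina, R07.9 for unspecified chest pain, E11.9 for type 2 diabetes), and the matching logic are all illustrative assumptions.

```python
# A toy "query package": a named, saved bundle of codes and search terms
# that can be reused against structured fields and unstructured notes alike.
CHEST_PAIN_PACKAGE = {
    "name": "chest-pain-workup",
    "structured_codes": {"I20.9", "R07.9"},   # illustrative ICD-10 codes
    "text_terms": ["chest pain", "angina"],
}

def run_package(package, record):
    """Return True if the record matches the package on either
    coded (structured) or free-text (unstructured) data."""
    if package["structured_codes"] & set(record.get("codes", [])):
        return True
    note = record.get("note", "").lower()
    return any(term in note for term in package["text_terms"])

record = {"codes": ["E11.9"], "note": "Patient reports intermittent chest pain."}
print(run_package(CHEST_PAIN_PACKAGE, record))  # matches on the free-text note
```

The point of the package abstraction is reuse: an ED clinician runs one saved, clinically meaningful query instead of hand-writing separate lookups against the coded record and the narrative notes.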
Image: Courtesy Flickr/ Yann Ropars