Jennifer Zaino

Semantically Aligned Design Principles At Core of Australian Electronic Health Records Platform

At the upcoming Semantic Technology & Business Conference in San Jose, Dr. Terry Roach, principal of CAPSICUM Business Architects, and Dr. Dean Allemang, principal consultant at Working Ontologist, will host a session on A Semantic Model for an Electronic Health Record (EHR). It will focus on Australia’s electronic-Health-as-a-Service (eHaaS) national platform for personal electronic health records, provided by the CAPSICUM semantic framework for strategically aligned business architectures.

Roach and Allemang participated in an email interview with The Semantic Web Blog to preview the topic:

The Semantic Web Blog: Can you put the work you are doing on the semantic EHR model in context: How does what Australia is doing with its semantic framework compare with how other countries are approaching EHRs and healthcare information exchange?

Roach and Allemang: The eHaaS project we have been working on is an initiative of Telstra, a large, traditional telecommunications provider in Australia. Its Telstra Health division, which is focused on health-related software investments, has over the past two years embarked on a set of strategic investments in the electronic health space. Since early 2013 it has acquired and/or established strategic partnerships with a number of local and international healthcare software providers, ranging from hospital information systems and mobile health applications to remote patient monitoring systems, personal health records, integration platforms, and health analytics suites.

At the core of these investments is a strategy to develop a platform that captures and maintains diverse health-related interactions in a consolidated lifetime health record for individuals. The eHaaS platform facilitates interoperability and integration of several health service components over a common secure authentication service, data model, infrastructure, and platform. Starting from a base of stand-alone, vertical applications that manage fragmented information across the health spectrum, the eHaaS platform will establish an integrated, continuously improving, shared healthcare data platform that will aggregate information from a number of vertical applications, as well as an external gateway for standards-based eHealth messages, to present a unified picture of an individual’s health care profile and history.
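
A toy illustration can make the idea of a consolidated lifetime record concrete: below, two hypothetical record fragments, one from a hospital information system and one from a remote-monitoring feed, are merged into a single RDF graph keyed on the same patient identifier. The namespace, properties, and values are invented for this sketch and do not come from the eHaaS data model.

```python
# Hypothetical illustration only: the namespace, properties, and values are
# invented and do not reflect the actual eHaaS data model.
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/ehealth/")
patient = URIRef("http://example.org/patient/12345")

hospital = Graph()  # record fragment from a hospital information system
hospital.add((patient, EX.diagnosis, Literal("Type 2 diabetes")))
hospital.add((patient, EX.admittedOn, Literal("2014-03-02")))

monitoring = Graph()  # record fragment from a remote patient-monitoring feed
monitoring.add((patient, EX.bloodGlucose, Literal(7.8)))
monitoring.add((patient, EX.measuredOn, Literal("2014-06-01")))

# Aggregating the fragments yields one unified view of the patient's history.
lifetime_record = hospital + monitoring
for subject, predicate, obj in lifetime_record:
    print(subject, predicate, obj)
```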

Read more

Digital Reasoning Takes On Early Risk Detection For Compliance- And Security-Sensitive Sectors

Digital Reasoning’s Synthesys machine learning platform (which The Semantic Web Blog initially covered here) should see its Version 3.9 release this summer. The update builds on the 3.8 release, whose Glance user interface delivered the discovery and investigative capabilities that help information analysts in finance, intelligence, and other compliance- and security-sensitive sectors react to findings about user profiles of interest and their associated relationships, activities, and risks. Version 3.9 takes on the proactive part of the equation, early risk detection, via its Scout user interface.

Last year, the company homed in on compliance use cases ranging from insider trading to money laundering with Version 3.7 of Synthesys (covered here). There, the technology for discovering the meaning in unstructured data at scale, highlighting important entities in context, was applied to email communications for organizations, such as financial institutions, that have to be on the lookout for conversations that cross compliance boundaries.
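
Synthesys itself is proprietary, but the kind of processing described, pulling entities out of unstructured email text and flagging phrases that may signal compliance risk, can be sketched with an open-source NLP library. The spaCy code below is a generic stand-in rather than Digital Reasoning’s technology, and the email text and watch-list phrases are invented.

```python
# Generic stand-in using spaCy, not Digital Reasoning's Synthesys API.
# The email text and watch-list phrases below are invented for illustration.
import spacy

nlp = spacy.load("en_core_web_sm")  # small English pipeline with named-entity recognition

email = ("Let's move the position before the earnings call. "
         "Keep this between us and route it through the Zurich account.")

WATCH_PHRASES = {"keep this between us", "before the earnings call"}

doc = nlp(email)
entities = [(ent.text, ent.label_) for ent in doc.ents]   # people, orgs, places, dates
flags = [p for p in WATCH_PHRASES if p in email.lower()]  # naive phrase matching

print("Entities:", entities)
print("Potential compliance flags:", flags)
```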

Read more

Get The Scoop On The Critical ABCs of RDF

There’s a chance to learn everything you need to know about RDF to get the most value from the W3C standard model for data interchange at the 10th annual Semantic Technology & Business Conference in San Jose next month. David Booth, senior software architect at Hawaii Resource Group, will host a session explaining how the standard’s unique capabilities can have a profound effect on projects that seek to connect data coming in from multiple sources.

“One of the assumptions people make when looking at RDF is that it is analogous to any other data format, like JSON or XML,” says Booth, who is working on a contract Hawaii Resource Group has with the U.S. Department of Defense to use semantic web technologies to achieve healthcare data interoperability. “It isn’t.” RDF, he explains, isn’t just another data format; rather, it’s about the information content that is encoded in the format.

“The focus is different. It is on the meaning of data vs. the details of syntax,” he says.
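
Booth’s distinction between information content and syntax can be seen in a small sketch (mine, not from the session): the same statement, serialized as Turtle and as N-Triples, parses into graphs that rdflib reports as isomorphic.

```python
# A minimal illustration of "meaning vs. syntax": the same statement expressed
# in two different RDF serializations parses to equivalent (isomorphic) graphs.
from rdflib import Graph
from rdflib.compare import isomorphic

turtle_doc = """
@prefix ex: <http://example.org/> .
ex:patient123 ex:hasDiagnosis ex:Hypertension .
"""

ntriples_doc = """
<http://example.org/patient123> <http://example.org/hasDiagnosis> <http://example.org/Hypertension> .
"""

g1 = Graph().parse(data=turtle_doc, format="turtle")
g2 = Graph().parse(data=ntriples_doc, format="nt")

print(isomorphic(g1, g2))  # True: different syntax, same information content
```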

Read more

Alta Plana Takes The Pulse Of Text Analytics

Seth Grimes, president and principal consultant of Alta Plana Corp. and founding chair of the Sentiment Analysis Symposium, has put together a thorough new report, Text Analytics 2014: User Perspectives on Solutions and Providers. Among the interesting findings of the report is that “growth in text analytics, as a vendor market category, has slackened, even while adoption of text analytics, as a technique, has continued to expand rapidly.”

Grimes explains that in a fragmented market, consisting of everything from text analytics services to solution-embedded technologies, the opportunity for users to practice text analytics is strong, but that increasingly text analytics is not the main focal point of the solutions being leveraged.

Reflecting the diversity of options, respondents listed among their providers a number of open-source offerings such as Apache OpenNLP and GATE, API services such as AlchemyAPI and Semantria, and enterprise software solution and business suite providers like SAP. The word cloud above was generated by Alta Plana at Wordle.net to show how users responded when asked which companies they know to provide text/content analytics functionality. Nearly 50 percent of users are likely to recommend their most important provider.

Read more

AlphaSense’s Advanced Linguistics Search Engine Could Buy Back Time For Financial Analysts To Do More In-Depth Research

When Raj Neervannan, CTO and co-founder of financial search engine company AlphaSense, thinks about search, he thinks about it “as a killer app that is only growing. … People want answers, not noise. They want to ask more intelligent questions and get to the next level of computer-aided intelligence.”

For AlphaSense’s customers – analysts at large investment firms, banks, and other industries, as well as one-person shops – that means search needs to get them out of ferreting through piles of research docs for the nuggets of information they really need. Neervannan knows the pain of trying to interpret a CEO’s commentary to understand what he or she was really saying when, for example, making the point that numbers were going down in reference to inventory turns. (Jack Kokko, a former analyst at Morgan Stanley, is AlphaSense’s other co-founder.)

“You are essentially digging through sets of documents [using keyword search], finding locations of terms, pulling them in piece by piece and constructing a case as to what the company’s inventory turn was really like – what other companies’ similar information was, how that matches up. You have to do quantitative analysis and benchmarks, and it can take weeks,” he says.

Read more

OpenText Takes Next Steps In Automatic Content Classification

OpenText yesterday made its secure file sharing and synchronization product, Tempo Box, available for free to customers using its OpenText Content Suite enterprise information management tool.

“A lot of our customers have major concerns about employees sharing documents with cloud tools like Dropbox,” says Lubor Ptacek, VP of strategic marketing. Employees want documents to be available, synched, and sharable across all their devices, but using such services can create security and compliance problems. By deploying Tempo Box on top of their existing infrastructure, at no charge to all internal employees and any external parties they may need to share content with, companies get a seamless and cost-effective way to share files in the cloud without compromising security, records management requirements, and storage optimization, he says – “the things that enterprise customers care about, especially those operating in regulated environments.”

Among those capabilities is automatic content classification, which is usually required for records management reasons – for example, helping companies determine whether a document is an employee record they must keep for five years or a tax record they have to hold for seven years. That under-the-hood classification engine is an outgrowth of OpenText’s acquisition a few years back of the text mining, analytics, and search company Nstein. Since the acquisition, says Ptacek, the company has been looking at ways to apply the technology to specific business problems and make it part of its applications.
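
OpenText has not published the internals of that engine, so the sketch below only illustrates the records-management pattern Ptacek describes: classify a document, then look up the retention rule for that class. The keyword rules and retention periods are hypothetical placeholders, not OpenText’s or Nstein’s technology.

```python
# Hypothetical sketch of classification-driven retention, not OpenText's engine.
# Keyword rules and retention periods are placeholders for illustration.
RETENTION_YEARS = {"employee_record": 5, "tax_record": 7, "other": 1}

KEYWORD_RULES = {
    "employee_record": ("performance review", "employment agreement", "hr file"),
    "tax_record": ("tax return", "withholding", "vat"),
}

def classify(text: str) -> str:
    """Assign a document class from simple keyword rules."""
    lowered = text.lower()
    for doc_class, keywords in KEYWORD_RULES.items():
        if any(keyword in lowered for keyword in keywords):
            return doc_class
    return "other"

doc = "Annual performance review for employee #4411, signed 12 May 2014."
doc_class = classify(doc)
print(doc_class, "-> retain for", RETENTION_YEARS[doc_class], "years")
```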

Read more

GraphLab Create Aims To Be The Complete Package For Data Scientists

Data scientists can add another tool to their toolset today: GraphLab has launched GraphLab Create 1.0, which bundles everything from tools for data cleaning and engineering through to state-of-the-art machine learning and predictive analytics capabilities.

Think of it, company execs say, as the single platform that data scientists or engineers can leverage to unleash their creativity in building new data products, enabling them to write code at scale on their own laptops. The driving concept behind the solution, they say, is to make large-scale machine learning and predictive analytics easy enough that companies won’t have to hire huge teams of data scientists and engineers or build the big hardware infrastructures that lie behind many of today’s Big Data-intensive products. And the data scientists and engineers who do use it won’t need to be experts in machine-learning algorithms – just experienced enough to write Python code.
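
As a rough outline of that workflow, the snippet below follows the shape of the GraphLab Create 1.0 Python API, from loading tabular data into an SFrame through training and querying a model. The call names are based on the public documentation of the time and the file and column names are invented, so treat it as an assumption-laden sketch rather than a verified example.

```python
# Sketch of a GraphLab Create-style workflow; file and column names are
# hypothetical and the API calls reflect the 1.0-era documentation as remembered.
import graphlab as gl

# Load tabular data into an SFrame (GraphLab's scalable, out-of-core data frame).
interactions = gl.SFrame.read_csv("interactions.csv")

# Basic cleaning: drop rows with missing user or item identifiers.
interactions = interactions.dropna(columns=["user_id", "item_id"])

# Train a recommender without hand-picking an algorithm; create() selects one.
model = gl.recommender.create(interactions, user_id="user_id", item_id="item_id")

# Produce top-5 recommendations for a single user.
print(model.recommend(users=["user_42"], k=5))
```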

Read more

Corporate Social Media: Sentiment Tracking Is Up, But Other Metrics Are Out of the Mix

The State of Corporate Social Media is … well, strong might be too strong a word for it. The recently released State of Corporate Social Media Briefing 2014, from USM (Useful Social Media), finds – among other things – that social media team sizes are being reduced, fewer budgets are increasing, and fewer key performance indicators are being measured.

It’s not all negative. In fact, report author and USM founder Nick Johnson concludes that, taken altogether, the results could be interpreted as indicating that “social media within companies is beginning to mature, and the drive to leverage social to its full extent is undiminished.”

That said, the data could equally be interpreted to mean that “social media within companies is stagnating, and there’s an increasing lack of resources available to those within business to move forward and fully leverage social’s potential.”

Read more

Versium Leverages Microsoft Azure Machine Learning For New Predictive GivingScore Solution To Improve Fundraising

Versium, which earlier this year launched its Predictive FraudScore solution (covered here), today releases its Predictive GivingScore solution, designed to help charitable institutions and political organizations better predict who is likely to donate, be a repeat donor, or make a more significant contribution. Predictive GivingScore is the latest of the company’s predictive Score products, which also include churn, social influencer, and shopper scoring – and it’s by no means the last.

It was built with Microsoft Azure Machine Learning, a managed cloud service for building predictive analytics solutions that was publicly unveiled just a short time ago. CEO Chris Matty says the platform helps Versium rapidly build its new score solutions. (Just shy of ten Versium scoring products are currently in use or in development.) Azure ML, Matty notes, contains dozens of machine learning algorithms and mathematical computation models that Versium leverages to easily and effectively experiment with, create, and tune models to get the highest accuracy in its predictive scoring solutions.

“Once we have a score built it just takes little tuning. But when we are building a new score we need to look at some different models and see what works better,” he says. “We want to move quickly by evaluating the different models, and we can visualize very easily the process of building the predictive model.”
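
Azure ML itself is a hosted, visual service, so the snippet below uses scikit-learn purely as a local stand-in for the step Matty describes: fitting several candidate models on the same data and keeping the one that scores best. The synthetic dataset and the particular models are invented for illustration and are not Versium’s.

```python
# Local stand-in (scikit-learn) for the "evaluate several models, keep the best"
# step described above; this is not the Azure ML service or Versium's models.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for donor data: 2,000 rows, 20 features, binary target.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Score each candidate with 5-fold cross-validation and keep the best AUC.
scores = {name: cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print("Best candidate:", best)
```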

Read more

Daedalus Takes Meaning-As-A-Service To Excel, GATE And CMS Systems

Daedalus (which The Semantic Web Blog originally covered here) has just made its Textalytics meaning-as-a-service APIs available for Excel and GATE (General Architecture for Text Engineering), a Java suite of tools used for natural language processing tasks, including information extraction in many languages. Connecting its semantic analysis tools with these systems is one step in a larger plan to extend its integration capabilities with more API plug-ins.

“For us, integration options are a way to lower barriers to adoption and to foster the development of an ecosystem around Textalytics,” says Antonio Matarranz, who leads marketing and sales for Daedalus. The three main ecosystem scenarios, he says, include personal productivity tools, of which the Excel add-in is an example, and NLP environments, of which GATE is an example. “But UIMA (Unstructured Information Management Applications) is also a target,” he says. The list is also slated to include content management systems and search engines, among them open source systems like WordPress, Drupal, and Elasticsearch.
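
Plug-ins like these typically wrap a REST call to the hosted analysis service. The Python snippet below sketches that general pattern with the requests library; the endpoint URL, parameters, and response fields are placeholders rather than the documented Textalytics API, and a real account key would be needed.

```python
# Generic sketch of calling a hosted text-analysis REST API from Python.
# The URL, parameters, and response fields are placeholders, not the actual
# Textalytics endpoints; substitute the provider's documented values.
import requests

API_URL = "https://api.example.com/text-analysis"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                           # placeholder credential

payload = {
    "key": API_KEY,
    "lang": "en",
    "txt": "Telstra Health acquired several healthcare software providers in 2013.",
}

response = requests.post(API_URL, data=payload, timeout=10)
response.raise_for_status()
result = response.json()

# A service of this kind typically returns entities, topics, and sentiment.
print(result.get("entities"), result.get("sentiment"))
```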

Read more
