Posts Tagged ‘Standards’

Neil Versel of Forbes recently wrote, “Electronic health records’ usability, or lack thereof, is the big talk this year in health IT, as federal officials get ready to contemplate the rules for Stage 3 of the Meaningful Use EHR incentive program. While Stage 3 will not begin before 2017, the U.S. Department of Health and Human Services has indicated that it will publish draft regulations in the spring. The hope is to finalize the rules before the end of 2015 so vendors can update their technology and give healthcare providers plenty of time to install and adjust to the updates before they move to Stage 3.”
We reported yesterday on the news that JSON-LD has reached Recommendation status at W3C. Three formal vocabularies also reached that important milestone yesterday:
The W3C documentation for the Data Catalog Vocabulary (DCAT) says that DCAT “is an RDF vocabulary designed to facilitate interoperability between data catalogs published on the Web….By using DCAT to describe datasets in data catalogs, publishers increase discoverability and enable applications easily to consume metadata from multiple catalogs. It further enables decentralized publishing of catalogs and facilitates federated dataset search across sites. Aggregated DCAT metadata can serve as a manifest file to facilitate digital preservation.”
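To make that concrete, here is a minimal sketch of a DCAT dataset description serialized as JSON-LD. The dataset IRI, title, keywords, and download URL are hypothetical examples, not taken from any real catalog:

```python
import json

# A minimal DCAT catalog entry expressed as JSON-LD.
# All identifiers and values below are illustrative placeholders.
dataset = {
    "@context": {
        "dcat": "http://www.w3.org/ns/dcat#",
        "dct": "http://purl.org/dc/terms/",
    },
    "@id": "http://example.org/datasets/city-budgets",
    "@type": "dcat:Dataset",
    "dct:title": "City Budgets 2013",
    "dcat:keyword": ["budget", "finance"],
    "dcat:distribution": {
        "@type": "dcat:Distribution",
        "dcat:downloadURL": "http://example.org/files/budgets.csv",
        "dct:format": "text/csv",
    },
}

# Serialize for publication in a catalog manifest.
print(json.dumps(dataset, indent=2))
```

Because the description is plain JSON-LD, an aggregator can harvest entries like this from many catalogs and merge them as RDF, which is what enables the federated search the specification describes.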
Meanwhile, The RDF Data Cube Vocabulary addresses the following issue: “There are many situations where it would be useful to be able to publish multi-dimensional data, such as statistics, on the web in such a way that it can be linked to related data sets and concepts. The Data Cube vocabulary provides a means to do this using the W3C RDF (Resource Description Framework) standard. The model underpinning the Data Cube vocabulary is compatible with the cube model that underlies SDMX (Statistical Data and Metadata eXchange), an ISO standard for exchanging and sharing statistical data and metadata among organizations. The Data Cube vocabulary is a core foundation which supports extension vocabularies to enable publication of other aspects of statistical data flows or other multidimensional data sets.”
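As a hedged sketch of the Data Cube model, a single statistical observation ties dimension values (such as area and period) and a measure to a dataset. The property names and figures below are illustrative, not drawn from any published cube:

```python
import json

# One qb:Observation from a hypothetical unemployment statistics cube,
# expressed as JSON-LD. The ex: dimension and measure properties are
# invented for illustration; only the qb: terms come from the vocabulary.
observation = {
    "@context": {
        "qb": "http://purl.org/linked-data/cube#",
        "ex": "http://example.org/stats/",
    },
    "@id": "ex:obs-2012-uk",
    "@type": "qb:Observation",
    "qb:dataSet": {"@id": "ex:unemployment"},
    "ex:refArea": {"@id": "ex:UK"},   # dimension: geographic area
    "ex:refPeriod": "2012",           # dimension: time period
    "ex:unemploymentRate": 7.9,       # measure: the observed value
}

print(json.dumps(observation, indent=2))
```

Each observation carries its own dimension coordinates, so observations from different publishers can be linked and compared along shared dimensions such as time or geography.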
Lastly, W3C now recommends use of the Organization Ontology, “a core ontology for organizational structures, aimed at supporting linked data publishing of organizational information across a number of domains. It is designed to allow domain-specific extensions to add classification of organizations and roles, as well as extensions to support neighbouring information such as organizational activities.”
JSON-LD has reached the status of an official “Recommendation” of the W3C. JSON-LD gives web developers another way to add structured data to web pages, joining RDFa. The W3C documentation says, “JSON is a useful data serialization and messaging format. This specification defines JSON-LD, a JSON-based format to serialize Linked Data. The syntax is designed to easily integrate into deployed systems that already use JSON, and provides a smooth upgrade path from JSON to JSON-LD. It is primarily intended to be a way to use Linked Data in Web-based programming environments, to build interoperable Web services, and to store Linked Data in JSON-based storage engines.” This addition should be welcome news for Linked Data developers familiar with JSON and/or faced with systems based on JSON.
SemanticWeb.com caught up with the JSON-LD specification editors to get their comments…
Manu Sporny (Digital Bazaar), told us, “When we created JSON-LD, we wanted to make Linked Data accessible to Web developers that had not traditionally been able to keep up with the steep learning curve associated with the Semantic Web technology stack. Instead, we wanted people that were comfortable working with great solutions like JSON, MongoDB, and REST to be able to easily integrate Linked Data technologies into their day-to-day work. The adoption of JSON-LD by Google and schema.org demonstrates that we’re well on our way to achieving this goal.”
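The “smooth upgrade path” the specification mentions can be sketched in a few lines: an existing JSON document becomes JSON-LD simply by adding an @context that maps its keys to IRIs. The names and URLs below are illustrative:

```python
import json

# An ordinary JSON object, as an existing system might already produce it.
person = {
    "name": "Manu Sporny",
    "homepage": "http://manu.sporny.org/",
}

# Adding an @context upgrades the same document to JSON-LD: each plain
# key is mapped to a Linked Data IRI (here, FOAF terms), while existing
# JSON consumers can keep reading the document unchanged.
person_ld = {
    "@context": {
        "name": "http://xmlns.com/foaf/0.1/name",
        "homepage": {
            "@id": "http://xmlns.com/foaf/0.1/homepage",
            "@type": "@id",
        },
    },
    **person,
}

print(json.dumps(person_ld, indent=2))
```

This is the design choice Sporny describes: developers comfortable with JSON tooling get Linked Data semantics without leaving their existing workflow.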
Silver Spring, MD (PRWEB) October 30, 2013 — PhUSE and CDISC are happy to announce the completion of Phase I of the FDA/PhUSE Semantic Technology Working Group Project. The PhUSE Semantic Technology Working Group aims to investigate how formal semantic standards can support the clinical and non-clinical trial data life cycle from protocol to submission. This deliverable includes a draft set of existing CDISC standards represented in RDF.
I spent the majority of my professional life as the scribe, analyst, advocate, facilitator and therapist for the information industry. I started with the traditional publishers and then moved on to my engagement in the financial information industry. I watched the business of information evolve through lots of IT revolutions … from microfiche to Boolean search to CD-ROM to videotext to client server architecture to the Internet and beyond.
At the baseline of everything was the concept of data tagging – as the key to search, retrieval and data value. I saw the evolution from SGML (which gave rise to the database industry). I witnessed the separation of content from form with the development of HTML. And now we are standing at the forefront of capturing meaning with formal ontologies and using inference-based processing to perform complex analysis.
I have been both a witness to (and an organizer of) the information industry for the better part of 30 years. It is my clear opinion that this development – and by that I mean the tagging of meaning and semantic processing – is the most important development I have witnessed. It is about the representation of knowledge. It is about complex analytical processing. It is about the science of meaning. It is about the next phase of innovation for the information industry.
Let me see if I can put all of this into perspective for you, because my goal is to enlist you in our journey.
Today sees the launch of Meritora, the first commercial implementation of the universal payment standard PaySwarm (initially discussed in this blog here and here). Created by Digital Bazaar, the company founded and led by Manu Sporny – whose W3C credentials include founding both the Web Payments Community Group and the JSON-LD Community Group, as well as chairing the RDF Web Applications Working Group – Meritora is designed to ease what is still a surprisingly arduous task: buying and selling on the web. The service is starting with a simple asset-hosting feature to help vendors sell digital content on WordPress-powered sites, along with support for decentralized web app stores, so that app creators can put their work on their own web sites, set a price, and let it be bought there, at a web app store, or anywhere on the web.
The name Meritora points to the service’s underlying purpose of rewarding greatness, coming from the bases ‘merit’ and ‘ora,’ the latter of which has been used across a number of cultures to express a unit of value, Sporny says (noting that it means ‘golden’ in Esperanto, and was also used as a unit of currency among Anglo-Saxons). That’s a big name to live up to, but the service hopes to do so by making Web payments work simply, securely, quickly, with low fees and no vendor lock-in for buyers and sellers on the digital content scene.
There’s Linked Data to thank for what Meritora, and PaySwarm, can do, with Sporny describing the system as “the world’s first payment solution where the core of the technology is powered by Linked Data.”
The W3C has announced that the eleven specifications that make up SPARQL 1.1 have been published as Recommendations. SPARQL is the Semantic Web query language. We caught up with Lee Feigenbaum, VP of Marketing & Technology at Cambridge Semantics Inc., to discuss the significance of this announcement. Feigenbaum is a SPARQL expert who currently serves as Co-Chair of the W3C’s SPARQL Working Group, leading the design of SPARQL.
Feigenbaum says, “SPARQL 1.1 is a huge leap forward in providing a standard way to access and update Semantic Web data. By reaching W3C Recommendation status, Semantic Web developers, vendors, publishers and consumers have a stable, well-vetted, and interoperable set of standards they can rely on for the foreseeable future.”
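As a hedged illustration of what the “huge leap forward” buys developers, the query below (held in a Python string) uses two features standardized in SPARQL 1.1 that were absent from SPARQL 1.0: aggregation with GROUP BY and property paths. The data it would run against is hypothetical:

```python
# A SPARQL 1.1 query counting each person's direct and transitive
# contacts. "foaf:knows+" is a SPARQL 1.1 property path (one or more
# foaf:knows hops); COUNT/GROUP BY is SPARQL 1.1 aggregation.
# The FOAF vocabulary is real; the data queried is hypothetical.
query = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>

SELECT ?person (COUNT(?contact) AS ?reach)
WHERE {
  ?person foaf:knows+ ?contact .
}
GROUP BY ?person
ORDER BY DESC(?reach)
"""

print(query)
```

In SPARQL 1.0, the transitive traversal alone would have required either a fixed-depth union of patterns or application-side recursion, which is part of why Feigenbaum calls the 1.1 suite a leap forward.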