IMSC Announces Release of Integraf Integration Platform

IMSC’s Integraf Integration Platform Enables Business Analysts to Build Enterprise Integration Solutions

ARLINGTON, VA–(Marketwired – March 27, 2014) – Information Management Solutions Consultants (IMSC) announces the commercial release of Integraf, a flexible and intuitive integration and interoperability platform for combining information from disparate data sources and for building enterprise web services. Integraf is based on open standards from the W3C and strongly leverages open source technology to achieve scalability and economy.

“Integraf enables system integration done by business analysts without a single line of programming code… that is powerful,” says Rana Chegu, CEO of Peridot Solutions and former Gartner Executive. Integraf’s underlying model is both machine-readable and human-readable, which is the key to empowering business analysts and subject matter experts (SMEs) in the development of enterprise integration solutions. This makes Integraf unique in the integration engine market.

Read more

Saffron Technology Raises $7 million in Series B funding

Cary, NC – March 20, 2014 – Saffron Technology, a cognitive systems company helping Fortune 1000 businesses understand the value of transforming disconnected data into actionable knowledge, today announced that it has closed a $7 million Series B investment round. Funds are earmarked to accelerate business growth, including opening a new global headquarters in Silicon Valley.

“Data becomes infinitely more powerful when you tie together its meaning from a multiplicity of disparate sources. Our patented Natural Intelligence Platform unifies all kinds of data – structured and unstructured – in real time, from a large variety of sources and continuously learns about the things in the data without the need for pre-determined rules or models,” said Gayle Sheppard, Saffron Technology CEO. “Now you can automatically see converging and other patterns to anticipate outcomes and prepare to act. These capabilities, combined with our customers’ success with Saffron, position us well for growth. With this additional funding, we will expand customer-centric next generation service teams, build a strong brand presence, create scalability across our business, and establish a Silicon Valley headquarters in spring 2014.”

Read more

194 Million Linked Open Data Bibliographic Work Descriptions Released by OCLC

Yesterday, Richard Wallis gave a peek into some exciting new developments in OCLC’s Linked Open Data (LOD) efforts. While these have not yet been formally announced by OCLC, they represent significant advances in WorldCat LOD. Our reporting to date on LOD at WorldCat is here.

Most significantly, OCLC has now released 194 million Linked Open Data bibliographic Work descriptions. According to Wallis, “A Work is a high-level description of a resource, containing information such as author, name, descriptions, subjects etc., common to all editions of the work.” In his post, he uses the example of “Zen and the Art of Motorcycle Maintenance” as a Work.
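Wallis’s description suggests a Work record gathers edition-independent facts about a title. The sketch below is purely illustrative — the identifier, context, and property names are hypothetical, not OCLC’s actual published data — and uses Python’s standard `json` module to build a JSON-LD-style document:

```python
import json

# Illustrative sketch of a bibliographic Work description.
# The @id and property names are hypothetical, not OCLC's published data.
work = {
    "@context": {"schema": "http://schema.org/"},
    "@id": "http://example.org/work/zen-motorcycle-maintenance",
    "@type": "schema:CreativeWork",
    "schema:name": "Zen and the Art of Motorcycle Maintenance",
    "schema:author": {"schema:name": "Robert M. Pirsig"},
    "schema:about": ["Motorcycles", "Philosophy"],
}

doc = json.dumps(work, indent=2)
print(doc)
```

The point of modeling at the Work level is that the author and subjects above hold for every edition, so each edition record only needs to link back to this one description.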

Read more

RDF 1.1 is a W3C Recommendation

Almost exactly 10 years after the publication of RDF 1.0 (10 Feb 2004, http://www.w3.org/TR/rdf-concepts/), the World Wide Web Consortium (W3C) announced today that RDF 1.1 has become a “Recommendation.” In fact, the RDF Working Group has published a set of eight Resource Description Framework (RDF) Recommendations and four Working Group Notes. One of those Notes, the RDF 1.1 Primer, is a good starting place for those new to the standard.

SemanticWeb.com caught up with Markus Lanthaler, co-editor of the RDF 1.1 Concepts and Abstract Syntax document, to discuss this news.

Lanthaler said of the recommendation, “Semantic Web technologies are often criticized for their complexity–mostly because RDF is being conflated with RDF/XML. Thus, with RDF 1.1 we put a strong focus on simplicity. The new specifications are much more accessible and there’s a clear separation between RDF, the data model, and its serialization formats. Furthermore, the primer provides a great introduction for newcomers. I’m convinced that, along with the standardization of Turtle (and previously JSON-LD), this will mark an important point in the history of the Semantic Web.”
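Lanthaler’s distinction between the data model and its serialization formats can be illustrated with a small hand-rolled sketch (not a real RDF library; the URIs are hypothetical): a triple is an abstract (subject, predicate, object) statement, and each concrete syntax is just one way to write it down.

```python
# One RDF triple as an abstract (subject, predicate, object) statement.
# The URIs are hypothetical, for illustration only.
triple = (
    "http://example.org/book/zen",
    "http://purl.org/dc/terms/creator",
    "http://example.org/person/pirsig",
)

def to_ntriples(s, p, o):
    """Write the triple in N-Triples, the line-oriented syntax."""
    return f"<{s}> <{p}> <{o}> ."

def to_turtle(s, p, o):
    """Write the same triple in Turtle, abbreviating the predicate with a prefix."""
    header = "@prefix dct: <http://purl.org/dc/terms/> .\n"
    pred = p.replace("http://purl.org/dc/terms/", "dct:")
    return header + f"<{s}> {pred} <{o}> ."

print(to_ntriples(*triple))
print(to_turtle(*triple))
```

Both outputs encode the identical statement; swapping the syntax never changes the graph, which is exactly the separation RDF 1.1 makes explicit.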

Read more

First of Four Getty Vocabularies Made Available as Linked Open Data

Jim Cuno, the President and CEO of the Getty, announced yesterday that the Getty Research Institute has released the Art & Architecture Thesaurus (AAT)® as Linked Open Data. Cuno said, “The Art & Architecture Thesaurus is a reference of over 250,000 terms on art and architectural history, styles, and techniques. It’s one of the Getty Research Institute’s four Getty Vocabularies, a collection of databases that serves as the premier resource for cultural heritage terms, artists’ names, and geographical information, reflecting over 30 years of collaborative scholarship.”

The data set is available for download at vocab.getty.edu under an Open Data Commons Attribution License (ODC BY 1.0). Vocab.getty.edu offers a SPARQL endpoint, as well as links to the Getty’s Semantic Representation documentation, the Getty Ontology, links for downloading the full data sets, and more.
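A SPARQL endpoint is queried over HTTP, so a request can be built with nothing but the standard library. The sketch below only constructs the request URL (no network call); the query itself and the endpoint’s exact behavior are assumptions for illustration — consult the documentation at vocab.getty.edu for specifics.

```python
from urllib.parse import urlencode

# Sketch of a SPARQL Protocol GET request against the Getty endpoint.
# The query below is illustrative; see vocab.getty.edu for real examples.
ENDPOINT = "http://vocab.getty.edu/sparql"

query = """\
SELECT ?label WHERE {
  ?term rdfs:label ?label .
} LIMIT 10
"""

# Per the SPARQL 1.1 Protocol, a query can be sent as a 'query' URL parameter.
url = ENDPOINT + "?" + urlencode({"query": query})
print(url)
```

Fetching that URL (e.g. with `urllib.request.urlopen`) would return results in whatever format the endpoint negotiates, typically SPARQL Results XML or JSON.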

Read more

Stardog 2.1 Hits Scalability Breakthrough

Washington, DC – January 21, 2014 – The new release (2.1) of Stardog, a leading RDF database, hits new scalability heights with a 50-fold increase over previous versions. Using commodity server hardware at the $10,000 price point, Stardog can manage, query, search, and reason over datasets as large as 50 billion RDF triples.

The new scalability puts Stardog into contention for the largest semantic technology, linked data, and other enterprise graph data projects. Stardog’s unique feature set at this scale, including reasoning and integrity constraint validation, means it will increasingly serve as the basis for complex software projects.

“We’re really happy about the new scalability of Stardog,” says Mike Grove, Clark & Parsia’s Chief Software Architect, “which makes us competitive with a handful of top graph database systems. And our feature set is unmatched by any of them.”

The scalability work also required engineering to eliminate garbage-collection pauses during query evaluation, which the 2.1 release accomplishes. Along with a new hot-backup capability, this makes Stardog more mature and production-ready than ever before.

Read more

New Vocabularies Are Now W3C Recommendations

We reported yesterday on the news that JSON-LD has reached Recommendation status at the W3C. Three formal vocabularies also reached that important milestone yesterday:

The W3C documentation for the Data Catalog Vocabulary (DCAT) says that DCAT “is an RDF vocabulary designed to facilitate interoperability between data catalogs published on the Web…. By using DCAT to describe datasets in data catalogs, publishers increase discoverability and enable applications easily to consume metadata from multiple catalogs. It further enables decentralized publishing of catalogs and facilitates federated dataset search across sites. Aggregated DCAT metadata can serve as a manifest file to facilitate digital preservation.”

Meanwhile, the RDF Data Cube Vocabulary addresses the following issue: “There are many situations where it would be useful to be able to publish multi-dimensional data, such as statistics, on the web in such a way that it can be linked to related data sets and concepts. The Data Cube vocabulary provides a means to do this using the W3C RDF (Resource Description Framework) standard. The model underpinning the Data Cube vocabulary is compatible with the cube model that underlies SDMX (Statistical Data and Metadata eXchange), an ISO standard for exchanging and sharing statistical data and metadata among organizations. The Data Cube vocabulary is a core foundation which supports extension vocabularies to enable publication of other aspects of statistical data flows or other multidimensional data sets.”

Lastly, W3C now recommends use of the Organization Ontology, “a core ontology for organizational structures, aimed at supporting linked data publishing of organizational information across a number of domains. It is designed to allow domain-specific extensions to add classification of organizations and roles, as well as extensions to support neighbouring information such as organizational activities.”
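To make the DCAT idea concrete, a catalog entry for a dataset might be described along the following lines. This is a minimal sketch built with Python’s `json` module; the URIs, titles, and values are hypothetical, and the DCAT Recommendation itself defines the normative vocabulary.

```python
import json

# Minimal DCAT-style dataset description as JSON-LD.
# All identifiers and values below are hypothetical, for illustration only.
dataset = {
    "@context": {
        "dcat": "http://www.w3.org/ns/dcat#",
        "dct": "http://purl.org/dc/terms/",
    },
    "@id": "http://example.org/dataset/city-budgets-2014",
    "@type": "dcat:Dataset",
    "dct:title": "City Budgets 2014",
    "dcat:keyword": ["budget", "finance"],
    "dcat:distribution": {
        "@type": "dcat:Distribution",
        "dcat:downloadURL": "http://example.org/data/budgets-2014.csv",
        "dct:format": "text/csv",
    },
}

print(json.dumps(dataset, indent=2))
```

Because every publisher describes datasets with the same classes (`dcat:Dataset`, `dcat:Distribution`) and properties, an aggregator can harvest many catalogs and query them uniformly — which is the interoperability the specification is after.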

JSON-LD is an official Web Standard

JSON-LD has reached the status of an official “Recommendation” of the W3C. JSON-LD provides yet another way for web developers to add structured data to web pages, joining RDFa. The W3C documentation says, “JSON is a useful data serialization and messaging format. This specification defines JSON-LD, a JSON-based format to serialize Linked Data. The syntax is designed to easily integrate into deployed systems that already use JSON, and provides a smooth upgrade path from JSON to JSON-LD. It is primarily intended to be a way to use Linked Data in Web-based programming environments, to build interoperable Web services, and to store Linked Data in JSON-based storage engines.” This should be welcome news for Linked Data developers familiar with JSON and/or faced with systems based on JSON.
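The “smooth upgrade path” works because a JSON-LD document is still plain JSON: adding a `@context` block maps existing keys to Linked Data terms without breaking existing JSON consumers. A minimal sketch (the particular vocabulary mapping here is illustrative):

```python
import json

# An ordinary JSON object, as an existing system might already produce...
plain = {"name": "Manu Sporny", "homepage": "http://manu.sporny.org/"}

# ...becomes JSON-LD by adding a @context that maps each key to an IRI.
# The vocabulary terms below are illustrative.
jsonld = dict(plain)
jsonld["@context"] = {
    "name": "http://schema.org/name",
    "homepage": {"@id": "http://schema.org/url", "@type": "@id"},
}

# Existing JSON tooling still parses the document unchanged.
doc = json.dumps(jsonld)
print(doc)
```

A JSON-LD processor can now interpret `name` and `homepage` as globally identified properties, while a legacy consumer simply ignores the unfamiliar `@context` key.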

SemanticWeb.com caught up with the JSON-LD specification editors to get their comments…

Manu Sporny (Digital Bazaar) told us, “When we created JSON-LD, we wanted to make Linked Data accessible to Web developers that had not traditionally been able to keep up with the steep learning curve associated with the Semantic Web technology stack. Instead, we wanted people that were comfortable working with great solutions like JSON, MongoDB, and REST to be able to easily integrate Linked Data technologies into their day-to-day work. The adoption of JSON-LD by Google and schema.org demonstrates that we’re well on our way to achieving this goal.”

Read more

OpenCube Project Launches – Promises Opportunities for Open Statistical Data

EU initiative: OpenCube partner consortium to develop software tools for publishing and reusing Linked Open Statistical Data

Thermi, Thessaloniki, Greece, January 14th, 2014 – A consortium of partners headed by the Centre for Research and Technology – Hellas (CERTH) recently launched the OpenCube project, an EU Initiative for Publishing and Enriching Linked Open Statistical Data for the Development of Data Analytics and Enhanced Visualization Services. The project intends to make Linked Open Statistical Data (LOSD) more accessible to publishers and users, and to facilitate mining these data for interesting, previously hidden insights. As part of the project, these new technologies will be tested at four pilot sites: three government agencies from across Europe and a large financial institution.

Linked Statistical Data

Governments, organizations and companies are increasingly releasing their data for others to reuse. A major part of open data concerns statistics, such as population figures and economic and social indicators. Analysis of statistical open data can create value for citizens and businesses in areas ranging from business intelligence to epidemiological studies and evidence-based policy-making.

Recently, Linked Data has emerged as a promising paradigm for using the web as a platform for data integration. Linked Statistical Data has been proposed as the most suitable way to publish open statistical data on the web. However, publishing and mining LOSD poses particular challenges and requires appropriate tools and methods.

Read more

ODI celebrates New Year OBE for Technical Director, Jeni Tennison

The Open Data Institute has announced that Jeni Tennison has been awarded an OBE in the “Queen’s New Year Honours.”

For those not familiar, King George V created these honors on 4 June 1917, during World War I. The honor was intended to reward services to the war effort by civilians at home in the UK and servicemen in support positions. Today, they are awarded for prominent national or regional roles and to those making distinguished or notable contributions in their own specific areas of activity. The ranks of the order include Commander (CBE), Officer (OBE) and Member (MBE); Tennison is being given the OBE.

The official release reads:

Open Data Institute (ODI) founders, Sir Nigel Shadbolt and Sir Tim Berners-Lee have warmly welcomed news that the organisation’s Technical Director, Jeni Tennison has received an OBE in the Queen’s New Year Honours.

Tennison, who grew up in Cambridge, first trained as a psychologist before gaining a PhD in collaborative ontology development from the University of Nottingham.

Before joining the ODI, she was the technical architect and lead developer for legislation.gov.uk, which pioneered the use of open data APIs within the public sector, set a new standard in the publication of legislation on the web, and formed the basis of The National Archives’ strategy for bringing the UK’s legislation up to date as open, public data.

Speaking about today’s Honour, ODI Chairman, Sir Nigel Shadbolt said: “Jeni inspires affection, loyalty and admiration in all who know her. She has a special blend of deep technical know how and an intuitive sense of what works in the world of the Web. In Jeni the ODI has a fantastic CTO and the open data community a great role model. It has been a privilege to work with her for over two decades and it is wonderful to see her recognised in this way.”

Before taking up her post at the ODI, Tennison worked with Shadbolt on the early linked data work on data.gov.uk, helping to engineer new standards for the publication of statistics as linked data; building APIs for geographic, transport and education data; and supporting the publication of public sector organograms as open data.

Read more
