Posts Tagged ‘SPARQL’
The W3C has announced that the eleven specifications of SPARQL 1.1 have been published as W3C Recommendations. SPARQL is the query language of the Semantic Web. We caught up with Lee Feigenbaum, VP of Marketing & Technology at Cambridge Semantics Inc., to discuss the significance of the announcement. Feigenbaum is a SPARQL expert who currently serves as co-chair of the W3C’s SPARQL Working Group, which led the design of SPARQL.
Feigenbaum says, “SPARQL 1.1 is a huge leap forward in providing a standard way to access and update Semantic Web data. By reaching W3C Recommendation status, Semantic Web developers, vendors, publishers and consumers have a stable, well-vetted, and interoperable set of standards they can rely on for the foreseeable future.”
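Two of the headline additions in SPARQL 1.1 are property paths in queries and a standard update language. The sketch below shows both as query strings, plus the GET request shape defined by the SPARQL 1.1 Protocol; the endpoint address and all URIs are hypothetical placeholders, not real services.

```python
# Minimal sketch of two SPARQL 1.1 additions: property paths and Update.
# Endpoint and resource URIs below are illustrative assumptions.
from urllib.parse import urlencode

ENDPOINT = "http://example.org/sparql"  # hypothetical endpoint

# SPARQL 1.1 property path: foaf:knows+ follows one or more knows links.
QUERY = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
SELECT ?friend WHERE {
  <http://example.org/people/alice> foaf:knows+ ?friend .
}
"""

# SPARQL 1.1 Update: insert a triple (POSTed to an update endpoint).
UPDATE = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
INSERT DATA {
  <http://example.org/people/alice> foaf:name "Alice" .
}
"""

def query_url(endpoint, query):
    """Build the GET URL form defined by the SPARQL 1.1 Protocol."""
    return endpoint + "?" + urlencode({"query": query})

url = query_url(ENDPOINT, QUERY)
```

Under SPARQL 1.0 the transitive `foaf:knows+` traversal would have required repeated queries or vendor extensions, and there was no standard way to write data back at all.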
Seevl, the free music discovery service that uses semantic technology to let users search across combinations of facts to find new music and artist information, has launched an app for Deezer that formally goes live Monday. (See our in-depth look at Seevl here, and a screencast of how the service works here.) Deezer is a music streaming service available in more than 150 countries – not the U.S. yet, though – that claims more than 20 million users.
Seevl, which late last year updated its YouTube plug-in with more music discovery features and better integration with the YouTube user interface, models its data in RDF. In a blog post earlier this year, founder and CEO Alexandre Passant explained how the Seevl service uses Redis for simple key-value queries and SPARQL for some more complex operations, like recommendations or social network analysis, as well as provenance. As for the new Deezer app, it provides the same features as the YouTube app for easily navigating and discovering music among millions of tracks, Passant tells the Semantic Web Blog.
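The division of labor Passant describes, a key-value store for simple lookups and SPARQL for relational questions like recommendations, can be sketched as follows. This is purely illustrative and not Seevl's actual code: a plain dict stands in for Redis, and the keys, endpoint, and vocabulary URIs are invented for the example.

```python
# Illustrative split: key-value lookups for simple facts,
# SPARQL for graph-shaped questions such as recommendations.
# A dict stands in for Redis; all names here are made up.

kv_store = {
    "artist:radiohead:name": "Radiohead",
}

def artist_name(artist_id):
    """A simple fact: one key, one value -- no graph traversal needed."""
    return kv_store.get("artist:" + artist_id + ":name")

def related_artists_query(artist_uri):
    """A relational question -- artists sharing a genre -- fits SPARQL."""
    return f"""
    SELECT DISTINCT ?other WHERE {{
      <{artist_uri}> <http://example.org/vocab/genre> ?g .
      ?other <http://example.org/vocab/genre> ?g .
      FILTER(?other != <{artist_uri}>)
    }}
    """
```

The design point is that each store answers the question it is cheapest at: constant-time key lookups for display data, and a single declarative query for the join-heavy recommendation case.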
Algebraix Data Corporation today announced that its SPARQL Server(TM) RDF database successfully executed all 17 queries of the SP²Bench (SP2) benchmark at scales up to one billion triples on a single compute node. SP²Bench is among the most computationally complex benchmarks for testing SPARQL performance, and the company says no other vendor has reported results for all 17 queries at data sizes above five million triples. Read more
Gartner Names Semantic Technologies To Its Top Technology Trends Impacting Information Infrastructure in 2013
Semantic technologies have made it to Gartner’s list of the top technology trends that will impact information infrastructure this year.
The research firm yesterday released the list of nine trends that it says will play key roles in modernizing information management and in making the role of information governance increasingly important. Semantic technologies come in at No. 3 on the list – right behind the closely related trends of Big Data and modern information infrastructure.
When the Nobel Prize winners for 2013 are announced in the fall, perhaps there also will be some challenges issued to the worldwide community of data enthusiasts to see what they can do with open Linked Data about the prizes that have been awarded since the beginning of the 20th century.
Right now that’s just on the wish lists of Matthias Palmér and Hannes Ebner, co-founders of MetaSolutions AB, a spin-off from the Royal Institute of Technology in Stockholm and Uppsala University focused on semantic and scalable web apps. But a solid start has been made through their work with Nobel Media AB, which develops and manages programs, productions and media rights of the Nobel Prize within the areas of digital and broadcast media, including the Nobelprize.org domain, on the Nobel Prize Linked Data set.
It shouldn’t be surprising that Entagen, which makes the semantically-enabled Big Data analytics and collaboration engine TripleMap, has had its sights set on the life sciences space. CEO Christopher Bouton holds a Ph.D. in molecular neurobiology and has worked at a number of biotech firms, and he served as head of integrative data mining at Pfizer – a company that’s using TripleMap for visualized knowledge maps of associations between domain-specific entities (see our story here).
“We see some really compelling and exciting applications of this type of technology in the life sciences space,” says Bouton. But TripleMap can be applied to any scenario where Big Data dots must be connected so that users can collaborate around the understanding of the associations between entities – health care, legal, retail and finance all come to mind.
A new article on R Bloggers explains how to get “up and running on the Semantic Web” using SPARQL with R in under five minutes. The article states, “We’ll use data at the Data.gov endpoint for this example. Data.gov has a wide array of public data available, making this example generalizable to many other datasets. One of the key challenges of querying a Semantic Web resource is knowing what data is accessible. Sometimes the best way to find this out is to run a simple query with no filters that returns only a few results or to directly view the RDF. Fortunately, information on the data available via Data.gov has been cataloged on a wiki hosted by Rensselaer. We’ll use Dataset 1187 for this example. It’s simple and has interesting data – the total number of wildfires and acres burned per year, 1960-2008.” Read more
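The same quick-probe approach the R article takes translates directly to other languages. The sketch below is a Python analogue: send a SELECT with no filters and a small LIMIT to see what a dataset looks like, then flatten the standard SPARQL JSON results format. The endpoint URL is a placeholder assumption; substitute the actual Data.gov endpoint from the article.

```python
# Probe an unfamiliar SPARQL endpoint: tiny LIMIT, no filters,
# then parse the standard application/sparql-results+json payload.
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

ENDPOINT = "http://example.org/sparql"  # placeholder; use the real endpoint

# A cheap way to see what the triples look like before writing real queries.
PROBE = "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 5"

def parse_bindings(payload):
    """Flatten {'results': {'bindings': [...]}} rows into simple dicts."""
    return [{var: cell["value"] for var, cell in row.items()}
            for row in payload["results"]["bindings"]]

def fetch_bindings(endpoint, query):
    """Run a query over HTTP GET and return the result rows."""
    req = Request(endpoint + "?" + urlencode({"query": query}),
                  headers={"Accept": "application/sparql-results+json"})
    with urlopen(req) as resp:
        return parse_bindings(json.load(resp))
```

A call like `fetch_bindings(ENDPOINT, PROBE)` would return a short list of subject/predicate/object dicts, which is usually enough to work out which predicates to filter on next.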
EventMedia Live, Winner of ISWC Semantic Web Challenge, Starts New Project With Nokia Maps, Extends Architecture Flexibility
The winner of the Semantic Web Challenge at November’s International Semantic Web Conference (ISWC) was EventMedia Live, a web-based environment that exploits real-time connections to event and media sources to deliver rich content describing events that are associated with media, and interlinked with the Linked Data cloud.
This week, it will begin a one-year effort under a European Commission-funded project to align its work with the Nokia Maps database of places, so that mobile users of the app can quickly see user-contributed photos of those venues, surfaced with EventMedia’s help.
A project of EURECOM, a consortium combining seven European universities and nine international industrial partners, EventMedia Live has its origins in the “mismatch between those sites specializing in announcing upcoming events and those other sites where users share photos, videos and document those events,” explains Raphaël Troncy, assistant professor in EURECOM’s Multimedia Communications department and one of the project’s leaders.
Here are some final thoughts from our panel of semantic web experts on what to expect to see as the New Year rings in:
Broader deployment of the schema.org terms is likely. In the study by Mühleisen and Bizer in July this year, we saw Open Graph Protocol, DC, FOAF, RSS, SIOC and Creative Commons still topping the ranks of the most-used semantic vocabularies. In 2013 and beyond, I expect to see schema.org jump to the top of that list.
Christine Connors, Chief Ontologist, Knowledgent:
I think we will see an uptick in the job market for semantic technologists in the enterprise; primarily in the Fortune 2000. I expect to see some M&A activity as well from systems providers and integrators who recognize the desire to have a semantic component in their product suite. (No, I have no direct knowledge; it is my hunch!)
We will see increased competition from data analytics vendors who try to add RDF, OWL or graph stores to their existing platforms. I anticipate saying, at the end of 2013, that many of these immature deployments will leave some project teams disappointed. The mature vendors will need to put resources into sales and business development, with the right partners for consulting and systems integration, to be ready to respond to calls for proposals and assistance.