Technologies

Get The Scoop On The Critical ABCs of RDF

There’s a chance to learn everything you need to know about RDF to get the most value from the W3C standard model for data interchange at the 10th annual Semantic Technology & Business Conference in San Jose next month. David Booth, senior software architect at Hawaii Resource Group, will be hosting a session explaining how the standard’s unique capabilities can have a profound effect on projects that seek to connect data coming in from multiple sources.

“One of the assumptions that people make looking at RDF is that it is analogous to any other data format, like JSON or XML,” says Booth, who is working on a contract the Hawaii Resource Group has with the U.S. Department of Defense to use semantic web technologies to achieve healthcare data interoperability. “It isn’t.” RDF, he explains, isn’t just another data format – rather, it’s about the information content that is encoded in the format.

“The focus is different. It is on the meaning of data vs. the details of syntax,” he says.
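Booth’s distinction can be shown in miniature. The sketch below (plain Python, no RDF library; the names and URLs are invented for illustration) writes the same statement as an abstract RDF triple and as a JSON-LD document, then naively expands the JSON-LD back into triples – the syntax differs, the information content does not.

```python
# A minimal sketch of "meaning vs. syntax": the same triple (subject,
# predicate, object) can be carried by different formats. Here a flat
# JSON-LD document is expanded back to the abstract statement it encodes.
import json

FOAF = "http://xmlns.com/foaf/0.1/"

# The statement as an abstract RDF triple (the "information content"):
triple = ("http://example.org/timbl", FOAF + "knows", "http://example.org/danbri")

# The same statement serialized as JSON-LD; the @context maps the short
# term "knows" onto its full IRI.
jsonld_doc = json.loads("""
{
  "@context": {"knows": {"@id": "http://xmlns.com/foaf/0.1/knows", "@type": "@id"}},
  "@id": "http://example.org/timbl",
  "knows": "http://example.org/danbri"
}
""")

def jsonld_triples(doc):
    """Naively expand a flat, single-subject JSON-LD document into triples."""
    ctx = doc["@context"]
    subject = doc["@id"]
    out = set()
    for term, value in doc.items():
        if term.startswith("@"):
            continue  # skip JSON-LD keywords like @context and @id
        out.add((subject, ctx[term]["@id"], value))
    return out

assert jsonld_triples(jsonld_doc) == {triple}
print("Same information content, different syntax.")
```

A real application would use an RDF library rather than this toy expansion, but the point stands: any syntax that decodes to the same triples carries the same meaning.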

Read more

How to Build Your Own Knowledge Graph (Video – Part 1)

Straight out of Google I/O this week came some interesting announcements related to Semantic Web technologies and Linked Data. Included in the mix was a cool instructional video series about how to “Build a Small Knowledge Graph.” Part 1 was presented by Jarek Wilkiewicz, Knowledge Developer Advocate at Google (and SemTechBiz speaker).

Wilkiewicz fits a lot into the seven-and-a-half-minute piece, in which he presents a (sadly) hypothetical example of an online music store that he creates with his Google colleague Shawn Simister. During the example, he demonstrates the power and ease of leveraging multiple technologies, including the schema.org vocabulary (particularly the recently announced ‘Actions’), the JSON-LD syntax for expressing the machine-readable data, and the newly launched Cayley, an open source graph database (more on this in the next post in this series).
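To give a flavor of the markup the video walks through, here is a hedged sketch of what a hypothetical music store might publish: a schema.org MusicAlbum described in JSON-LD, with a potentialAction using the newly announced Actions vocabulary. The store, album, band, and URL below are invented for illustration, not taken from the video.

```python
# A sketch of schema.org + JSON-LD markup for a hypothetical online music
# store. On a real page this JSON would sit inside a
# <script type="application/ld+json"> tag for crawlers to read.
import json

markup = """
{
  "@context": "http://schema.org",
  "@type": "MusicAlbum",
  "name": "Example Album",
  "byArtist": {"@type": "MusicGroup", "name": "Example Band"},
  "potentialAction": {
    "@type": "ListenAction",
    "target": "http://store.example.com/listen/example-album"
  }
}
"""

album = json.loads(markup)
# The typed nodes let a consumer know this is an album by a band, and the
# ListenAction tells it what a user can *do* with the entity.
print(album["@type"], "-", album["potentialAction"]["@type"])
```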

Read more

RDF is Critical to a Successful Internet of Things

Do you still remember a time when a utility company worker came to your house to check your electric meter? For many of us, that is already a thing of the past. Smart meters send information directly to the utility company, which as a result knows our up-to-the-minute power usage patterns. And while we don’t yet talk to our ovens or refrigerators through the Internet, many people routinely control thermostats from their smartphones. The emerging Internet of Things is real, and we interact with it on a daily basis.

The term Internet of Things refers to devices we wouldn’t traditionally expect to be smart or connected, such as smoke detectors and other home appliances. These devices are made ‘smart’ by enabling them to send data to an application. From smart meters to sensors used to track goods in a supply chain, the one thing these devices have in common is that they send data – data that can then be used to create more value by doing things better, faster, cheaper, and more conveniently.

The physical infrastructure needed for these devices to work is largely in place or being put in place quickly. We get immediate first-order benefits simply by installing new equipment. For example, having a smart meter provides cost savings because there is no need for a person to come to our houses. Similarly, the ability to change settings on a thermostat remotely can lower our heating costs. However, far greater changes and benefits are projected, or are already beginning to be delivered, from interconnecting the data sent by smart devices:

  • Health: Connecting vital measurements from wearable devices to the vast body of medical information will help to improve our health, fitness and, ultimately, save lives.
  • Communities: Connecting information from embedded devices and sensors will enable more efficient transportation. When a sprinkler system meter understands weather data, it will use water more efficiently. Once utilities start connecting and correlating data from smart meters, they might deliver electricity more efficiently and be more proactive in handling infrastructure problems.
  • Environment: Connecting readings from fields, forests, oceans, and cities about pollution levels, soil moisture, and resource extraction will allow for closer monitoring of problems.
  • Goods and services: Connecting data from sensors and readers installed throughout factories and supply chains will more precisely track materials and speed up and smooth out the manufacture and distribution of goods.

Read more

RDF 1.1 and the Future of Government Transparency

Following the newly minted “recommendation” status of RDF 1.1, Michael C. Daconta of GCN has asked, “What does this mean for open data and government transparency?” Daconta writes, “First, it is important to highlight the JSON-LD serialization format. JSON is a very simple and popular data format, especially in modern Web applications. Furthermore, JSON is a concise format (much more so than XML) that is well-suited to represent the RDF data model. An example of this is Google adopting JSON-LD for marking up data in Gmail, Search and Google Now. Second, like the rebranding of RDF to ‘linked data’ in order to capitalize on the popularity of social graphs, RDF is adapting its strong semantics to other communities by separating the model from the syntax. In other words, if the mountain won’t come to Muhammad, then Muhammad must go to the mountain.” Read more

Musicians Can Now Include Official Tour Dates in Google Knowledge Graph

The Google Webmaster Central blog reports, “When music lovers search for their favorite band on Google, we often show them a Knowledge Graph panel with lots of information about the band, including the band’s upcoming concert schedule. It’s important to fans and artists alike that this schedule be accurate and complete. That’s why we’re trying a new approach to concert listings. In our new approach, all concert information for an artist comes directly from that artist’s official website when they add structured data markup.” Read more
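The “structured data markup” mentioned in the post is schema.org event markup embedded on the artist’s official site. A hedged sketch of what such markup might look like follows; the band, venue, and date are invented placeholders, and the exact fields Google requires are documented on its developer site, not assumed here.

```python
# A sketch of schema.org MusicEvent markup of the kind an artist's official
# website might embed (as JSON-LD) so a search engine can read the tour
# schedule. All values below are illustrative placeholders.
import json

events_markup = """
{
  "@context": "http://schema.org",
  "@type": "MusicEvent",
  "name": "Example Band in Concert",
  "startDate": "2014-07-04",
  "location": {
    "@type": "MusicVenue",
    "name": "Example Arena",
    "address": "San Jose, CA"
  },
  "performer": {"@type": "MusicGroup", "name": "Example Band"}
}
"""

event = json.loads(events_markup)
# A crawler reading this markup can extract a complete concert listing:
print(event["name"], "@", event["location"]["name"], "on", event["startDate"])
```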

Is A Knowledge Graph-Related Acquisition In Yahoo’s Future?

Is SindiceTech about to be acquired by Yahoo? Just last month The Semantic Web Blog reported on the formal relaunch of the company’s activities following the finalization of its separation from its university incubation setting at the former DERI institute in Ireland. Now, according to the Sunday Independent, Yahoo – which the article says had originally planned on buying the company late last year but saw negotiations collapse – may resume talks on the matter.

Yahoo, the article says, “refused to comment on the Sindice-Tech deal, calling it ‘rumour and speculation.’” SindiceTech CEO Giovanni Tummarello also says that he cannot comment on this. He did note, however, that media, search and advertising are prime sectors for employing Knowledge Graphs. “In scenarios where there is much more (semi-structured) information than one knows how to leverage right away, Big Data graph-like knowledge management and moving from search to relational and entity search is a common theme these days,” he wrote in an email to The Semantic Web Blog.

Read more

“Webize” Your Data with JSON-LD

Benjamin Young of Cloudant reports, “Data is often stored and distributed in esoteric formats… Even when the data is available in a parse-able format (CSV, XML, JSON, etc), there is often little provided with the data to explain what’s inside. If there is descriptive meta data provided, it’s often only meant for the next developer to read when implementing yet-another-parser for said data. Really, it’s all quite abysmal… Enter, JSON-LD! JSON-LD (JSON Linked Data) is a simple way of providing semantic meaning for the terms and values in a JSON document. Providing that meaning with the JSON means that the next developer’s application can parse and understand the JSON you gave them.” Read more
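Young’s point can be shown in a few lines. Below is a minimal sketch of “webizing” a plain JSON record: adding an `@context` maps each opaque key to a shared IRI (here schema.org terms, chosen for illustration) without disturbing the original data, so existing consumers keep working while linked-data consumers gain meaning.

```python
# Before and after "webizing": the same JSON record, with an @context added
# so each key resolves to a globally shared identifier.
import json

plain = {"name": "Ada Lovelace", "birthDate": "1815-12-10"}

webized = dict(plain)
webized["@context"] = {
    "name": "http://schema.org/name",
    "birthDate": "http://schema.org/birthDate",
}

# Existing consumers still read the same fields...
assert webized["name"] == plain["name"]
# ...while a JSON-LD processor can now expand each key to a full IRI,
# so "name" unambiguously means schema.org's notion of a name.
print(json.dumps(webized, indent=2))
```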

RDF 1.1 is a W3C Recommendation

Almost exactly 10 years after the publication of RDF 1.0 (10 Feb 2004, http://www.w3.org/TR/rdf-concepts/), the World Wide Web Consortium (W3C) has announced today that RDF 1.1 has become a “Recommendation.” In fact, the RDF Working Group has published a set of eight Resource Description Framework (RDF) Recommendations and four Working Group Notes. One of those notes, the RDF 1.1 Primer, is a good starting place for those new to the standard.

SemanticWeb.com caught up with Markus Lanthaler, co-editor of the RDF 1.1 Concepts and Abstract Syntax document, to discuss this news.

Lanthaler said of the recommendation, “Semantic Web technologies are often criticized for their complexity – mostly because RDF is being conflated with RDF/XML. Thus, with RDF 1.1 we put a strong focus on simplicity. The new specifications are much more accessible and there’s a clear separation between RDF, the data model, and its serialization formats. Furthermore, the primer provides a great introduction for newcomers. I’m convinced that, along with the standardization of Turtle (and previously JSON-LD), this will mark an important point in the history of the Semantic Web.”

Read more

SindiceTech Announces Freebase Distribution in the Cloud (Video)

With the support of Google Developers, SindiceTech has announced the availability of its Freebase Distribution for the cloud. According to SindiceTech, “Freebase is an amazing data resource at the core of Google’s ‘Knowledge Graph’. Freebase data is available for full download but today, using it ‘as a whole’ is all but simple. The SindiceTech Freebase distribution solves that by providing all the Freebase knowledge preloaded in an RDF specific database (also called triplestore) and equipped with a set of tools that make it much easier to compose queries and understand the data as a whole.”

Your Own Private Freebase

Read more

Meet Spaun: The Future of Artificial Intelligence

Naomi Eterman of The McGill Daily recently discussed a technology developed in 2012 by scientists at the University of Waterloo: “Spaun, short for Semantic Pointer Architecture Unified Network, is the largest computer simulation of a functioning brain to date. It is the brainchild of Chris Eliasmith, a professor in philosophy and systems design engineering at the University of Waterloo, who developed the system as a proof-of-principle supplement to his recent book: How to Build a Brain. The model is composed of 2.5 million simulated neurons and four different neurotransmitters that allow it to ‘think’ using the same kind of neural connections as the mammalian brain.” Read more
