Posts Tagged ‘Government Data’

Government Linked Data Goes With George Thomas

[EDITOR’S NOTE: Thank you to John Breslin for authoring this guest post remembering our friend and colleague, George Thomas.]

When writing about a person’s significant achievements, it would be so much better if the person themselves could hear the good things you were saying about them. Unfortunately, the person I am writing about, George Thomas, passed away last week after a long battle with cancer. However, I think it is important to note the huge impact that George had on Government Linked Data, Linked Data in general, and on his friends and colleagues in the Semantic Web space. If there’s one name that Government Linked Data ‘goes with’, it would be George Thomas.

Although I only physically met George a handful of times, I would count him as one of those who influenced me the most – through his visionary ideas, his practical nature, his inspiring talks at conferences like SemTechBiz, and his willingness to build bridges between people, communities, and of course data.

For those who may not have met him, George worked in the US Government for the past 12 years – most recently as an enterprise architect in the US Department of Health and Human Services (HHS) – and previously he held Chief Architect/CTO roles in other agencies and various private companies.

I first came across George when he was Chief Architect at the CIO’s office in the US General Services Administration. He had given a presentation about how Semantic Web technologies similar to SIOC could potentially be used to “track the dollar instead of the person” on Recovery.gov. Later on, DERI’s Owen Sacco and I collaborated with George on a system to create and enforce fine-grained access control policies (using PPO/PPM) for HHS’s Government Linked Data on IT investments and assets stored in multiple sources. (George also sang DERI’s praises in a blog post on Data.gov – “Linked Data Goes With DERI” – echoed in this article’s title.)

Read more

Driverless Cars Coming to the Streets of UK Town, Milton Keynes


A new article out of Information Daily reports, “Milton Keynes may see driverless cars on its roads in 12-18 months, says Geoff Snelson, Strategy Director of MK Smart, the innovation programme being run in the city. The driverless two-person pods are one of the outputs of the MK Smart programme, which is a collaboration between a number of organisations including the Open University (which is located in Milton Keynes) and BT. Central to the project is the creation of the ‘MK Data Hub’, which will support the acquisition and management of vast amounts of data relevant to city systems from a variety of data sources. As well as transport data, these will include data about energy and water consumption, data acquired through satellite technology, social and economic datasets, and crowd-sourced data from social media or specialised apps. Building on the capability provided by the MK Data Hub, the project will innovate in the areas of transport, energy and water management, tackling key demand issues.” Read more

RDF 1.1 and the Future of Government Transparency


Following the newly minted “recommendation” status of RDF 1.1, Michael C. Daconta of GCN has asked, “What does this mean for open data and government transparency?” Daconta writes, “First, it is important to highlight the JSON-LD serialization format. JSON is a very simple and popular data format, especially in modern Web applications. Furthermore, JSON is a concise format (much more so than XML) that is well-suited to represent the RDF data model. An example of this is Google adopting JSON-LD for marking up data in Gmail, Search and Google Now. Second, like the rebranding of RDF to ‘linked data’ in order to capitalize on the popularity of social graphs, RDF is adapting its strong semantics to other communities by separating the model from the syntax. In other words, if the mountain won’t come to Muhammad, then Muhammad must go to the mountain.” Read more
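To give a flavor of the JSON-LD serialization Daconta highlights, here is a minimal sketch using nothing but Python’s standard library. The schema.org vocabulary and all field values below are illustrative assumptions chosen for this example, not taken from the article; the point is simply that a JSON-LD document is plain JSON carrying RDF-style statements.

```python
import json

# A minimal JSON-LD document describing a hypothetical open government dataset.
# The "@context" names the vocabulary (here schema.org, an assumption for
# illustration); "@type" and the remaining keys form the RDF-style description.
doc = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Agency IT Investments",
    "publisher": {"@type": "GovernmentOrganization", "name": "Example Agency"},
    "encodingFormat": "application/ld+json",
}

# Because JSON-LD is plain JSON, ordinary JSON tooling round-trips it unchanged.
serialized = json.dumps(doc, indent=2)
parsed = json.loads(serialized)

print(parsed["@type"])              # Dataset
print(parsed["publisher"]["name"])  # Example Agency
```

This is what makes the format attractive to web developers: no special parser is needed to consume it, while RDF-aware tools can still interpret the `@context` and `@type` keys as a graph.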

Open Data: The Key To $3 Trillion In Economic Value

There’s money in that open data. A new report from the McKinsey Global Institute finds that machine-readable information that’s made available to others has the potential to generate significant economic value: $3 trillion annually in seven domains, to be exact.

The report, entitled Open Data: Unlocking Innovation And Performance With Liquid Information, sees the potential economic effect unfolding in education, transportation, consumer products, electricity, oil and gas, health care and consumer finance. Data becomes more liquid, the report authors note, when it is open, widely available and in shareable formats, and when advanced computing and analysis can yield novel insights from it, potentially in conjunction with proprietary data. It doesn’t specifically mention Linked Data, but homes in on government open data platforms, including the Linked-Data infused data.gov.uk, which it cites as having had 1.4 million page views this summer, as critical to the economic good tidings. It counts more than 40 countries with open data platforms, and up to 1 million datasets made open by governments worldwide.

Read more

MIT Lincoln Laboratory Wins R&D 100 Award for Structured Knowledge Space Software

Dorothy Ryan of the MIT Lincoln Laboratory reports, “Two technologies developed at MIT Lincoln Laboratory were among the 2013 choices for prestigious R&D 100 Awards. The Photoacoustic Sensing of Explosives system detects and discriminates trace amounts of explosives from significant standoff distances. The Structured Knowledge Space software and information system enables analysts to mine the vast store of intelligence reports available to government decision makers.” Read more

Expert System Announces Cogito Intelligence API for Government and Corporate Intelligence

MODENA, ITALY–(Marketwired – April 24, 2013) - Expert System, the semantic technology company, today introduced its newest solution, the Cogito Intelligence API, which brings advanced semantic functions that enable Government and Corporate Security analysts to access and exploit their most strategic sources of information.

Cogito Intelligence API is available for free proof of concept testing, with volume pricing and annual subscription levels. The API enables Government, Intelligence, Law Enforcement Agencies and enterprise Corporate Security functions to add semantic processing, text mining, categorization and tagging features to their analysis platforms and applications for faster evaluation of intelligence data. Read more

Open Gov Survey Looking for Participants

The University of Leeds is conducting a survey to determine the barriers to realizing the value of open government data. According to the survey website, “The University of Leeds, Socio-technical Centre and Centre for Integrated Energy Research, are conducting a research project on realising the value of open government data. This survey plays a key role in the project and focuses on developing understanding of: the potential barriers to improving the supply of open government data; the potential barriers to increasing the use of open data; and approaches to overcoming these potential barriers. By participating in this survey and providing your viewpoint you will be helping to shape policy, research and the wider dialogue on open data.” Read more

Open Data Bootcamp Heads to Tanzania

Michael Bauer of the Open Knowledge Foundation recently wrote, “I am on the Road in Tanzania and Ghana to spread the data love. Last week Tanzania’s first data journalism event happened. The Data Bootcamp, organized by the World Bank Institute and the African Media Initiative, brought together international experts, journalists, civil society organizations and technologists to work on data related projects. In 2010 Tanzania committed to release open government data as part of the open government partnership. Nevertheless, the Tanzanian government has only released two datasets so far. One goal of the data bootcamp was to spur demand by implementing small data projects.”

He goes on, “The format was tested before in South Africa, Kenya and Moldova and helped to raise awareness of Open Data. In preparation and during the workshop four more datasets were scraped and liberated. Further data was collected by the participants to work on their specific projects. Of the 40 participants only 7 were able to code – the majority were journalists and activists who never handled data before. Through the three days they received an intensive training in how to use spreadsheets and tools like Google Refine or Fusion Tables to tell stories with data. The data bootcamps not only consist of intense hands-on learning experience, they also are a small competition, where $2000 are awarded to the winner.”

Read more here.

Image: Courtesy OKF

NASA Challenge Seeks Solution to Big Data Problems

Derrick Harris of GigaOM reports that NASA has launched a series of Big Data challenges aimed at finding innovative solutions to some of the nation’s most pressing Big Data problems. He writes, “Some of the U.S. government’s most research-intensive agencies want your help to come up with better ways to analyze their expansive data sets. NASA, along with the National Science Foundation and the Department of Energy, launched a competition on TopCoder called the Big Data Challenge series. Essentially, it’s a competition to crowdsource a solution to the very big problem of fragmented and incompatible federal data.” Read more

Linked Open Government Data: Dispatch from the Second International Open Government Data Conference

“What we have found with this project is… the capacity to take value out of open data is very limited.”

With the abatement of the media buzz surrounding open data since the first International Open Government Data Conference (IOGDC) was held in November 2011, it would be easy to believe that the task of opening up government data for public consumption is a fait accompli. Most of the discussion at this year’s IOGDC conference, held July 10-12, centered on the advantages of, and roadblocks to, creating an open data ecosystem within government, and on the need to establish the right mix of policies to promote a culture of openness and sharing, both within and between government agencies and externally with journalists, civil society, and the public at large. By the numbers, the open government data movement has much to celebrate: 1,022,787 datasets from 192 catalogs, in 24 languages, representing 43 countries and international organizations.

The looming questions about the utility of open government data make it clear, however, that the movement is still in its early stages. Much remains to be done to provide usable, reliable, machine-readable and valuable government data to the public.

Read more
