Posts Tagged ‘Government Data’
A new article out of Information Daily reports, “Milton Keynes may see driverless cars on its roads in 12-18 months, says Geoff Snelson, Strategy Director of MK Smart, the innovation programme being run in the city. The driverless two-person pods are one of the outputs of the MK Smart programme, which is a collaboration between a number of organisations including the Open University (which is located in Milton Keynes) and BT. Central to the project is the creation of the ‘MK Data Hub’, which will support the acquisition and management of vast amounts of data relevant to city systems from a variety of data sources. As well as transport data, these will include data about energy and water consumption, data acquired through satellite technology, social and economic datasets, and crowd-sourced data from social media or specialised apps. Building on the capability provided by the MK Data Hub, the project will innovate in the areas of transport, energy and water management, tackling key demand issues.” Read more
Following the newly minted “recommendation” status of RDF 1.1, Michael C. Daconta of GCN has asked, “What does this mean for open data and government transparency?” Daconta writes, “First, it is important to highlight the JSON-LD serialization format. JSON is a very simple and popular data format, especially in modern Web applications. Furthermore, JSON is a concise format (much more so than XML) that is well-suited to represent the RDF data model. An example of this is Google adopting JSON-LD for marking up data in Gmail, Search and Google Now. Second, like the rebranding of RDF to ‘linked data’ in order to capitalize on the popularity of social graphs, RDF is adapting its strong semantics to other communities by separating the model from the syntax. In other words, if the mountain won’t come to Muhammad, then Muhammad must go to the mountain.” Read more
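To make the JSON-LD point concrete, here is a minimal sketch (illustrative only, not drawn from Daconta's article) of what an RDF statement looks like in JSON-LD. The `@context` maps short property names to full vocabulary IRIs, so ordinary JSON keys carry RDF semantics; the names and IRIs below are hypothetical examples using the public schema.org vocabulary.

```python
import json

# A minimal JSON-LD document. "@context" binds the plain JSON keys
# "name" and "homepage" to schema.org IRIs, and "@id" identifies the
# resource being described -- together these express RDF triples in
# ordinary JSON syntax.
doc = {
    "@context": {
        "name": "http://schema.org/name",
        "homepage": {"@id": "http://schema.org/url", "@type": "@id"},
    },
    "@id": "http://example.org/people/alice",
    "name": "Alice",
    "homepage": "http://example.org/alice",
}

# Because JSON-LD is plain JSON, any JSON tooling can produce or
# consume it; only JSON-LD-aware processors interpret the @context.
serialized = json.dumps(doc, indent=2)
print(serialized)
```

This separation of model (RDF) from syntax (JSON) is exactly the adaptation Daconta describes: web developers keep their familiar format, while the data remains linkable.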
There’s money in that open data. A new report from the McKinsey Global Institute finds that machine-readable information that’s made available to others has the potential to generate significant economic value: $3 trillion annually in seven domains, to be exact.
The report, entitled Open Data: Unlocking Innovation And Performance With Liquid Information, sees the potential economic effect unfolding in education, transportation, consumer products, electricity, oil and gas, health care and consumer finance. Data becomes more liquid, the report's authors note, when it is open, widely available and in shareable formats, and when advanced computing and analysis, potentially in conjunction with proprietary data, can yield novel insights from it. The report doesn't specifically mention Linked Data, but it homes in on government open data platforms, including the Linked Data-infused data.gov.uk (which it cites as having had 1.4 million page views this summer), as critical to the economic good tidings. It counts more than 40 countries with open data platforms and up to 1 million datasets made open by governments worldwide.
Dorothy Ryan of the MIT Lincoln Laboratory reports, “Two technologies developed at MIT Lincoln Laboratory were among the 2013 choices for prestigious R&D 100 Awards. The Photoacoustic Sensing of Explosives system detects and discriminates trace amounts of explosives from significant standoff distances. The Structured Knowledge Space software and information system enables analysts to mine the vast store of intelligence reports available to government decision makers.” Read more
MODENA, ITALY–(Marketwired – April 24, 2013) - Expert System, the semantic technology company, today introduces its newest solution, the Cogito Intelligence API, bringing advanced semantic functions to enable Government and Corporate Security analysts to access and exploit their most strategic sources of information.
Cogito Intelligence API is available for free proof of concept testing, with volume pricing and annual subscription levels. The API enables Government, Intelligence, Law Enforcement Agencies and enterprise Corporate Security functions to add semantic processing, text mining, categorization and tagging features to their analysis platforms and applications for faster evaluation of intelligence data. Read more
The University of Leeds is conducting a survey to determine the barriers to realizing the value of open government data. According to the survey website, “The University of Leeds, Socio-technical Centre and Centre for Integrated Energy Research, are conducting a research project on realising the value of open government data. This survey plays a key role in the project and focuses on developing understanding of: the potential barriers to improving the supply of open government data; the potential barriers to increasing the use of open data; and approaches to overcoming these potential barriers. By participating in this survey and providing your viewpoint you will be helping to shape policy, research and the wider dialogue on open data.” Read more
Michael Bauer of the Open Knowledge Foundation recently wrote, “I am on the Road in Tanzania and Ghana to spread the data love. Last week Tanzania’s first data journalism event happened. The Data Bootcamp, organized by the World Bank Institute and the African Media Initiative, brought together international experts, journalists, civil society organizations and technologists to work on data related projects. In 2010 Tanzania committed to release open government data as part of the open government partnership. Nevertheless, the Tanzanian government has only released two datasets so far. One goal of the data bootcamp was to spur demand by implementing small data projects.”
He goes on, “The format was tested before in South Africa, Kenya and Moldova and helped to raise awareness of Open Data. In preparation for and during the workshop, four more datasets were scraped and liberated. Further data was collected by the participants to work on their specific projects. Of the 40 participants, only 7 were able to code; the majority were journalists and activists who had never handled data before. Over the three days they received intensive training in how to use spreadsheets and tools like Google Refine or Fusion Tables to tell stories with data. The data bootcamps not only offer an intense hands-on learning experience, they are also a small competition, in which $2,000 is awarded to the winner.”
Image: Courtesy OKF
Derrick Harris of GigaOM reports that NASA has launched a series of Big Data challenges aimed at finding innovative solutions to some of the nation’s most pressing Big Data problems. He writes, “Some of the U.S. government’s most research-intensive agencies want your help to come up with better ways to analyze their expansive data sets. NASA, along with the National Science Foundation and the Department of Energy, launched a competition on TopCoder called the Big Data Challenge series. Essentially, it’s a competition to crowdsource a solution to the very big problem of fragmented and incompatible federal data.” Read more