Government

Driverless Cars Coming to the Streets of UK Town, Milton Keynes


A new article out of Information Daily reports, “Milton Keynes may see driverless cars on its roads in 12-18 months, says Geoff Snelson, Strategy Director of MK Smart, the innovation programme being run in the city. The driverless two-person pods are one of the outputs of the MK Smart programme, which is a collaboration between a number of organisations including the Open University (which is located in Milton Keynes) and BT. Central to the project is the creation of the ‘MK Data Hub’, which will support the acquisition and management of vast amounts of data relevant to city systems from a variety of data sources. As well as transport data, these will include data about energy and water consumption, data acquired through satellite technology, social and economic datasets, and crowd-sourced data from social media or specialised apps. Building on the capability provided by the MK Data Hub, the project will innovate in the areas of transport, energy and water management, tackling key demand issues.” Read more

Constitute: Explore the World’s Constitutions with RDF

Screenshot of constituteproject.org

In the video below, Dr. James Melton, a Lecturer in Comparative Politics at University College London, gives a presentation on Constitute. Constitute is a new way to explore the constitutions of the world. The origins of the project date back to 2005 with the Comparative Constitutions Project, which has the stated goal of cataloging the contents of all constitutions written in independent states since 1789. To date, that work has resulted in a collection of 900+ constitutions and 2,500+ amendments. A rigorous formal survey instrument comprising 669 questions was then applied to each of these “constitutional events,” producing the base data the team had to work with. Melton and his group wanted to create a system that allowed for open sharing of this information, not just with researchers but with anyone who wants to explore the world’s constitutions. They also needed the system to be flexible enough to handle changes, since, as Melton points out, “…roughly 15% of the countries in the world change their constitution every single year.”
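
For readers who want to poke at data like this directly, the sketch below gives a rough sense of what querying an RDF collection of constitutions looks like. It is only an illustration: the file name and the onto: vocabulary are assumptions made for the example, not the Constitute project’s actual terms.

```python
# Minimal sketch of querying an RDF export of constitutional data with rdflib.
# The file name and the onto: vocabulary are assumptions for illustration,
# not the Constitute project's actual terms.
from rdflib import Graph

g = Graph()
g.parse("constitutions.ttl", format="turtle")  # assumed local RDF export

query = """
PREFIX onto: <http://example.org/constitute/>
SELECT ?country ?year
WHERE {
    ?c a onto:Constitution ;
       onto:country ?country ;
       onto:yearEnacted ?year .
    FILTER(?year >= 1990)
}
ORDER BY ?year
"""

# Each result row holds the country and the year its constitution was enacted.
for country, year in g.query(query):
    print(country, year)
```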

Read more

6 Suggestions for Making Open Data Work in Government


Joel Gurin of InformationWeek recently asked, “Will 2014 finally become the year of open data? We’re certainly seeing evidence that open data is moving from the margins into the mainstream, with new uses for data that governments and other sources are making freely available to the public. But if we’re going to see open data’s promise fulfilled, it will be important for governments, and the federal government in particular, to make it easier for the public to access and use their open data.” Read more

A Look Ahead For Linked Data

At Swirrl, the focus continues to be bringing more users aboard the Linked Data train, in part through the work it’s doing with customers, primarily in the government sector. The company is bringing its PublishMyData platform (which The Semantic Web Blog first discussed here) to customers such as the U.K.’s Department for Communities and Local Government. The department is looking to Linked Data to help it publish statistical data, useful reference data about local government, and information about its own performance to increase transparency. That data is consumed primarily by other public sector organizations, charities, and entities that report to the department but are not part of it.

“Usually that was done in a mish-mash of technologies, and depended on individuals that do lots of hard work with spreadsheets to make it work,” says founder and CEO Bill Roberts. He characterizes the department’s move to Linked Data as a bit of a leap of faith, driven by its open data strategist Steve Peters and a vision of what can be achieved by moving in this direction. During engagements like this one, Roberts notes, Swirrl has gotten some strong insight, as well, into how to improve its solution for people who aren’t “dyed-in-the-wool Linked Data heads. That’s fed into things we’re working on,” including plans in the pipeline to build tools that make the self-service process easier.
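
Statistical data published as Linked Data is typically exposed through a SPARQL endpoint and modelled with the W3C RDF Data Cube vocabulary. The snippet below is a minimal sketch of consuming such data over HTTP; the endpoint URL is a placeholder, not an actual PublishMyData address.

```python
# Rough sketch: fetching statistical observations from a Linked Data
# publishing platform via SPARQL over HTTP. The endpoint URL is a
# placeholder, not a real PublishMyData address.
import requests

ENDPOINT = "https://example.gov.uk/sparql"  # placeholder endpoint

query = """
PREFIX qb: <http://purl.org/linked-data/cube#>
SELECT ?obs ?measure ?value
WHERE {
    ?obs a qb:Observation ;
         ?measure ?value .
}
LIMIT 10
"""

resp = requests.get(
    ENDPOINT,
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
)
resp.raise_for_status()

# Standard SPARQL JSON results: one binding set per matched observation.
for row in resp.json()["results"]["bindings"]:
    print(row["obs"]["value"], row["measure"]["value"], row["value"]["value"])
```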

Read more

HealthCare.gov: Progress Made But Back-End Struggles Continue

The media has spent the last few hours reporting on the Obama administration’s self-imposed deadline for fixing HealthCare.gov. According to these reports, the site is now working more than 90 percent of the time, up from 40 percent in October; pages on the website are loading in less than a second, down from about eight; 50,000 people can simultaneously use the site, which now supports 800,000 visitors a day; and page-load failures are down to under 1 percent.

There’s also word, however, that while the front-end may be improved, there are still problems on the back-end. Insurance companies continue to complain they aren’t getting information correctly to support signups. “The key question,” according to CBS News reporter John Dickerson this morning, “is whether that link between the information coming from the website getting to the insurance company – if that link is not strong, people are not getting what was originally promised in the entire process.” If insurance companies aren’t getting the right information for processing plan enrollments, individuals going to the doctor’s after January 1 may find that they aren’t, in fact, covered.

Jeffrey Zients, the man spearheading the website fix, did point out at the end of November that work remains to be done on the back end for tasks such as coordinating payments and application information with insurance companies. Plans call for that to be in place by mid-January.

As it turns out, according to this report in the NY Times, among the components of the site’s back-end technology is the MarkLogic Enterprise NoSQL database, whose recent Version 7 release added the ability to store and query data in RDF format using SPARQL.
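
For context on what that capability enables, the sketch below shows roughly what issuing a SPARQL query to a MarkLogic 7 server over its REST interface could look like. The host, credentials, and endpoint path are illustrative assumptions; check the MarkLogic Version 7 documentation for the exact API.

```python
# Illustrative sketch of running a SPARQL query against a MarkLogic 7+
# server over its REST interface. Host, port, credentials, and the exact
# endpoint path are assumptions for illustration; verify them against the
# MarkLogic documentation before relying on them.
import requests
from requests.auth import HTTPDigestAuth

HOST = "http://localhost:8000"                 # assumed app server
SPARQL_ENDPOINT = f"{HOST}/v1/graphs/sparql"   # assumed REST endpoint path

query = """
SELECT ?s ?p ?o
WHERE { ?s ?p ?o }
LIMIT 5
"""

resp = requests.post(
    SPARQL_ENDPOINT,
    data=query,
    auth=HTTPDigestAuth("user", "password"),   # placeholder credentials
    headers={
        "Content-Type": "application/sparql-query",
        "Accept": "application/sparql-results+json",
    },
)
resp.raise_for_status()

# Print the first few triples returned by the server.
for row in resp.json()["results"]["bindings"]:
    print(row["s"]["value"], row["p"]["value"], row["o"]["value"])
```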

Read more

Semantic Web in Emergency Response Systems – UPDATE

Coordinated emergency response, built on Linked Data.

That is the vision of Bart van Leeuwen, Amsterdam Firefighter and founder of software company, Netage. We’ve covered Bart’s work before here at SemanticWeb.com and at the Semantic Technology & Business Conference, and today, there is news that the work is advancing to a new stage.

In the Netherlands, there exist 25 “Safety Regions” (pictured on the left). These organizations coordinate disaster management, fire services, and emergency medical teams. The regions are designed to enable various first responders to work together to deal with complex and severe crises and disasters.

Additionally, the Dutch Police acts as a primary partner organization in these efforts. The police force is a national organization, separate from the safety regions and divided into its own ten regions. Read more
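
To make the idea of emergency response data as Linked Data a little more concrete, here is a toy sketch of describing a safety region and an incident as RDF. The ex: vocabulary is invented for illustration and is not Netage’s actual data model.

```python
# Toy sketch: describing a Dutch safety region and an incident as RDF with
# rdflib. The ex: vocabulary is invented for illustration and is not
# Netage's actual data model.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/emergency/")  # invented vocabulary

g = Graph()
g.bind("ex", EX)

region = EX["region/amsterdam-amstelland"]
incident = EX["incident/2013-12-03-0042"]

# A safety region and an incident it handles, expressed as triples.
g.add((region, RDF.type, EX.SafetyRegion))
g.add((region, RDFS.label, Literal("Amsterdam-Amstelland", lang="nl")))

g.add((incident, RDF.type, EX.Incident))
g.add((incident, EX.handledBy, region))
g.add((incident, RDFS.label, Literal("Reported building fire")))

print(g.serialize(format="turtle"))
```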

A Look Inside the NSA’s XKeyscore

Sean Gallagher of Ars Technica recently wrote, “The National Security Agency’s (NSA) apparatus for spying on what passes over the Internet, phone lines, and airways has long been the stuff of legend, with the public catching only brief glimpses into its Leviathan nature. Thanks to the documents leaked by former NSA contractor Edward Snowden, we now have a much bigger picture. When that picture is combined with federal contract data and other pieces of the public record—as well as information from other whistleblowers and investigators—it’s possible to deduce a great deal about what the NSA has built and what it can do.” Read more

Helping Citizen Searches For Government Services

Photo courtesy: Flickr/Arjan Richter

By 2016, ABI Research has it, as much as $114 billion could be saved worldwide through the implementation of online e-government services. It predicted that investment in these services is set to increase from $28 billion in 2010 to $57 billion in 2016, and that the number of users will nearly triple over the forecast period.

Here in the States, according to a 2012 survey by GovLoop, 83 percent of respondents say that they can access government-oriented customer service efforts via a website. And the number of people taking advantage of the ability to access information and services on government websites is significant, even going back to 2010, when the Pew Internet & American Life Project reported that 82 percent of American Internet users – 62 percent of all adults – were doing so. Among its findings at the time: 46 percent had looked up what services a government agency provides; 33 percent had renewed a driver’s license or auto registration; 23 percent had gotten information about or applied for government benefits; and 11 percent had applied for a recreational license, such as a fishing or hunting license.

Given citizens’ interest in accessing information about government services via the Internet, and in accessing the services themselves, both in the US and abroad, it makes sense for governments to put an emphasis on customer service online. The GovLoop survey finds there’s room for improvement, with the majority of respondents rating service a 3 or 4 on a scale of 1 to 5. Perhaps additional help will come from efforts in the semantic web space, like a vocabulary for describing civic services that government organizations can use to help citizens searching online home in on the service they’re actually looking for from the start.
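
One such effort is schema.org’s GovernmentService type, which lets agencies describe their services in markup that search engines can read. The sketch below generates JSON-LD along those lines; treat the property choices as illustrative and check the current schema.org definitions, since the agency, service, and URL here are made up for the example.

```python
# Illustrative sketch: generating JSON-LD markup for a civic service using
# schema.org-style terms. Property names should be checked against the
# current schema.org GovernmentService definition; the agency, service, and
# URL below are made up for the example.
import json

service = {
    "@context": "http://schema.org",
    "@type": "GovernmentService",
    "name": "Driver's license renewal",
    "description": "Renew a driver's license online or by mail.",
    "serviceType": "License renewal",
    "areaServed": "Example County",
    "provider": {
        "@type": "GovernmentOrganization",
        "name": "Example County Department of Motor Vehicles",
    },
    "url": "https://example.gov/services/license-renewal",  # placeholder URL
}

# Embed the output inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(service, indent=2))
```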

Read more

Auburn University Students Gather Intelligence for US Government

Elliot Mass of Information Management reports, “A partnership between Auburn University and Intelligent Software Solutions is adding a novel wrinkle to the old adage of learning by doing. In this case, Auburn students will hone real-world data analytics skills by gathering military intelligence for the U.S. government. Auburn, a public university in Auburn, Alabama, with more than 25,000 students, trains students in data modeling and simulation, cyber forensics and cyber risk analysis at its Cyber Research Center. ISS, based in Colorado Springs, develops software solutions for the U.S. government. Its data analysis and visualization, geo-temporal analysis and semantic data processing products are used by the Department of Defense, the Department of Homeland Security as well as foreign governments.” Read more

“Coverlet Meshing” Weighs in on PRISM

The debate about PRISM continues. One of the latest volleys was posted in InformationWeek by Coverlet Meshing (a pseudonym used by “a senior IT executive at one of the nation’s largest banks”). Meshing wrote: “Prism doesn’t scare me. On 9/11, my office was on the 39th floor of One World Trade. I was one of the many nameless people you saw on the news running from the towers as they collapsed. But the experience didn’t turn me into a hawk. In fact, I despise the talking heads who frame Prism as the price we pay for safety. And not just because they’re fear-mongering demagogues. I hate them because I’m a technologist and they’re giving technology a bad name.” Read more
