Posts Tagged ‘government’
The media has been reporting over the last few hours on the Obama administration’s self-imposed deadline for fixing HealthCare.gov. According to these reports, the site is now working more than 90 percent of the time, up from 40 percent in October; pages are loading in less than a second, down from about eight; 50,000 people can use the site simultaneously, and it supports 800,000 visitors a day; and page-load failures are down to under 1 percent.
There’s also word, however, that while the front end may be improved, there are still problems on the back end. Insurance companies continue to complain that they aren’t getting correct information to support signups. “The key question,” according to CBS News reporter John Dickerson this morning, “is whether that link between the information coming from the website getting to the insurance company – if that link is not strong, people are not getting what was originally promised in the entire process.” If insurance companies aren’t getting the right information for processing plan enrollments, individuals going to the doctor after January 1 may find that they aren’t, in fact, covered.
At the end of November, Jeffrey Zients, the man spearheading the website fix, did point out that work remains to be done on the back end for tasks such as coordinating payment and application information with insurance companies. Plans call for that to be in place by mid-January.
As it turns out, according to this report in the NY Times, the site’s back-end components include the MarkLogic Enterprise NoSQL database, whose recent Version 7 release added the ability to store and query data in RDF format using SPARQL syntax.
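To make the RDF/SPARQL connection concrete: RDF represents facts as subject–predicate–object triples, and SPARQL queries them by matching patterns with variables. The sketch below illustrates that idea in plain Python; the triples and prefixes are invented for illustration, and this is emphatically not MarkLogic’s actual API.

```python
# Minimal sketch of the triple-store idea behind SPARQL querying.
# The data and prefixes below are hypothetical, for illustration only.

# RDF represents facts as (subject, predicate, object) triples.
triples = [
    ("plan:bronze-123", "rdf:type", "hc:InsurancePlan"),
    ("plan:bronze-123", "hc:issuer", "issuer:acme-health"),
    ("issuer:acme-health", "rdf:type", "hc:Issuer"),
    ("issuer:acme-health", "hc:state", "NY"),
]

def match(pattern, store):
    """Return variable bindings (terms starting with '?') for every
    triple in the store that matches a single SPARQL-style pattern."""
    results = []
    for triple in store:
        bindings = {}
        for p, t in zip(pattern, triple):
            if p.startswith("?"):
                bindings[p] = t          # variable: bind it
            elif p != t:
                break                     # constant mismatch: skip triple
        else:
            results.append(bindings)
    return results

# Analogous to: SELECT ?plan WHERE { ?plan hc:issuer issuer:acme-health }
print(match(("?plan", "hc:issuer", "issuer:acme-health"), triples))
# → [{'?plan': 'plan:bronze-123'}]
```

A real SPARQL engine adds joins across multiple patterns, indexes, and filters, but the pattern-matching core is the same.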
There’s money in that open data. A new report from the McKinsey Global Institute finds that machine-readable information that’s made available to others has the potential to generate significant economic value: $3 trillion annually in seven domains, to be exact.
The report, entitled Open Data: Unlocking Innovation And Performance With Liquid Information, sees the potential economic effect unfolding in education, transportation, consumer products, electricity, oil and gas, health care and consumer finance. Data becomes more liquid, the report’s authors note, when it is open, widely available and in shareable formats, and when advanced computing and analysis, potentially in conjunction with proprietary data, can yield novel insights from it. The report doesn’t specifically mention Linked Data, but it homes in on government open data platforms – including the Linked Data-infused data.gov.uk, which it cites as having had 1.4 million page views this summer – as critical to the economic good tidings. It counts more than 40 countries with open data platforms, and as many as 1 million data sets made open by governments worldwide.
ABI Research estimates that as much as $114 billion could be saved worldwide by 2016 through the implementation of online e-government services. It predicts that investment in these services will increase from $28 billion in 2010 to $57 billion in 2016, and that the number of users will nearly triple over the forecast period.
Here in the States, according to a 2012 survey by GovLoop, 83 percent of respondents say that they can access government-oriented customer service efforts via a website. And the number of people taking advantage of the ability to access information and services on government websites is pretty significant, even going back to 2010, when the Pew Internet & American Life Project reported that 82 percent of American Internet users – 62 percent of adults – were doing so. Among its findings at the time: 46 percent had looked up what services a government agency provides; 33 percent had renewed a driver’s license or auto registration; 23 percent had gotten information about or applied for government benefits; and 11 percent had applied for a recreational license, such as a fishing or hunting license.
Given the citizenry’s interest in accessing information about government services via the Internet (not to mention accessing the services themselves, and not only in the US but abroad), it makes sense for governments to put an emphasis on customer service online. The GovLoop survey finds that there’s room for improvement, with the majority of respondents rating service a 3 or 4 on a scale of 1 to 5. Perhaps additional help will come from efforts in the semantic web space, like a vocabulary for describing civic services that government organizations can use to help citizens searching the web home in on the service that’s their true interest from the start.
Here are some final thoughts from our panel of semantic web experts on what to expect to see as the New Year rings in:
Broader deployment of the schema.org terms is likely. In the study by Mühleisen and Bizer in July of this year, we saw the Open Graph Protocol, DC, FOAF, RSS, SIOC and Creative Commons still topping the ranks of semantic vocabularies in use. In 2013 and beyond, I expect to see schema.org jump to the top of that list.
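Schema.org terms are typically embedded in pages as microdata, RDFa, or JSON-LD. As a hedged illustration of what such markup looks like for the government-services case discussed elsewhere in these posts, here is a minimal sketch that builds a JSON-LD block using schema.org’s GovernmentService type; the service name, provider, and property values are invented for the example.

```python
import json

# Hypothetical example of describing a government service with
# schema.org terms. The GovernmentService type is real, but the
# names and values here are illustrative placeholders.
service = {
    "@context": "https://schema.org",
    "@type": "GovernmentService",
    "name": "Driver's License Renewal",
    "provider": {
        "@type": "GovernmentOrganization",
        "name": "Department of Motor Vehicles",
    },
    "serviceType": "License renewal",
}

# A page would embed this inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(service, indent=2))
```

Search engines that consume schema.org markup can use a block like this to surface the service directly in results, which is exactly the kind of discoverability the predictions above anticipate.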
Christine Connors, Chief Ontologist, Knowledgent:
I think we will see an uptick in the job market for semantic technologists in the enterprise, primarily in the Fortune 2000. I expect to see some M&A activity as well from systems providers and integrators who recognize the desire to have a semantic component in their product suite. (No, I have no direct knowledge; it is my hunch!)
We will see increased competition from data analytics vendors who try to add RDF, OWL or graph stores to their existing platforms. I anticipate saying, at the end of 2013, that many of these immature deployments will leave some project teams disappointed. The mature vendors will need to put resources into sales and business development, with the right partners for consulting and systems integration, to be ready to respond to calls for proposals and assistance.
As we close out 2012, we’ve asked some semantic tech experts to give us their take on the year that was. Was Big Data a boon for the semantic web, or is the opportunity to capitalize on the connection still pending? Is structured data on the web not just the future but the present? What sector is taking a strong lead in the semantic web space?
We begin with Part 1, with our experts listed in alphabetical order:
John Breslin, lecturer at NUI Galway, researcher and unit leader at DERI, creator of SIOC, and co-founder of Technology Voice and StreamGlider:
I think it has been fantastic to see the schema.org initiative really gaining community support and a broader range of terms. It’s been great to see an easily understandable set of terms for describing the objects in web pages, and to see the initiative leveraging the experience of work like GoodRelations rather than ignoring what has gone before. It’s also been encouraging to see the growth of Drupal 7 (which produces RDFa data) in the government sector: Estimates are that 24 percent of .gov CMS sites are now powered by Drupal.
Martin Böhringer, CEO & Co-Founder Hojoki:
For us it was very important to see Jena, our Semantic Web framework, become an Apache top-level project in April 2012. We have seen a lot of development momentum in this project recently, and we see a chance to build an open-source Semantic Web foundation that can handle cutting-edge requirements.
Still disappointing is the missing link between the Semantic Web and the “cool” technologies and buzzwords. From what we see, the Semantic Web gives answers to some of the industry’s most challenging problems, but it still doesn’t seem to have really found its place in relation to the cloud or big data (Hadoop).
Christine Connors, Chief Ontologist, Knowledgent:
One trend that I have seen is increased interest in the broader spectrum of semantic technologies in the enterprise. Graph stores, NoSQL, schema-less and more flexible systems, ontologies (& ontologists!) and integration with legacy systems. I believe the Big Data movement has had a positive impact on this field. We are hearing more and more about “Big Data Analytics” from our clients, partners and friends. The analytical power brought to bear by the semantic technology stack is sparking curiosity – what is it really? How can these models help me mitigate risk, more accurately predict outcomes, identify hidden intellectual assets, and streamline business processes? Real questions, tough questions: fun challenges!
Every day New York City is getting closer to being the Digital City of the Future. It’s a long journey, though, and one that the semantic web community can lend a hand with.
At this week’s Semantic Technology & Business Conference in NYC, Andrew Nicklin of the Office of Strategic Technology and Development, NYC Department of Information Technology & Telecommunications (DoITT) provided a look at what has been accomplished so far, and what’s on the to-do roadmap. Recent months have seen accomplishments including the passage of Local Law 11 of 2012 – the “most progressive legislation in the U.S. as far as cities being mandated to open data,” Nicklin said in an interview with The Semantic Web Blog before his keynote address at SemTech. “It ensures permanency for our program past any administrative changes….The whole notion of open data doesn’t go away because it is written into law.”
Six months ago, Ontodia’s NYCFacets walked away with the win at New York City’s BigApps 3.0 competition. In the months since, the Smart Open Data Exchange that catalogs all the NYC-related data sources (which we first covered here) has been busy expanding its team, moving into the NYU-Poly hosted incubator, and getting ready to launch its Smart City platform for general use next year.
A preview of that platform will take place at the upcoming Semantic Technology & Business Conference in NYC. “We are going back to our original mission of really creating that data exchange using semantic technology,” says Ontodia co-founder Joel Natividad. It’s putting the focus not on raw data or learning new technologies, but on being a linked answers marketplace – converting raw data to answers rather than just linking raw data.
Helen Walters reports that Beth Noveck recently gave a TED talk regarding open-source government. Walters writes, “As the US’s first Deputy CTO, Beth Noveck founded the White House Open Government Initiative, which developed administration policy on transparency, participation and collaboration. She starts her talk by reminding us that in the old days, the White House was literally an open house. At the beginning of the 19th century, John Quincy Adams met a local dentist who happened in to shake his hand. Adams promptly dismissed the Secretary of State, with whom he was meeting, and asked the dentist to remove an aching tooth. ‘When I got to the White House in 2009, the White House was anything but open,’ she says. Bomb blast curtains covered the windows; they were running Windows 2000. Social media was verboten. Noveck’s mandate: to change this system.”