Posts Tagged ‘open data’

New World War I Encyclopedia Relies on Semantic MediaWiki

Heritage Daily recently reported, “A comprehensive, English-language, open access encyclopedia of what was deemed the ‘Great War’ was introduced and released on Wednesday 8th October, in Brussels. The project ‘1914-1918-online. International Encyclopedia of the First World War’ is managed by researchers at Freie Universität Berlin in cooperation with the Bavarian State Library. It is funded by the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG). The encyclopaedia combines the latest historical research with the many advantages of the Semantic Web. The content was written and compiled by 1,000 experts from 54 countries, and is continuously being updated and expanded.” The encyclopedia can be accessed here. Read more

NSF Awards $15M to DataONE Environmental Linked Data Project

Mary Martialay of RPI News reports, “The National Science Foundation (NSF) has awarded $15 million to a team of environmental and earth science data researchers, including researchers at Rensselaer Polytechnic Institute, who are providing tools and infrastructure that improve access to vast amounts of scientific data. As part of the project, Rensselaer will provide semantic technology leadership to help improve scientific discovery, said Deborah McGuinness, director of the Rensselaer Web Science Center, and Tetherless World Senior Constellation Chair and professor of computer science and cognitive science at Rensselaer.” Read more

Modern Science Must Be Open Science

Peter Murray-Rust of OpenSource.com recently wrote, “Open is about sharing and collaboration. It’s the idea that ‘we’ is more powerful, more rewarding and fulfilling than ‘I’. I can’t promise jobs, but I do know that open is becoming very big. Governments and funders are pushing the open agenda, even though academics are generally uninterested or seriously self-interested. Some governments and some companies recognize the value of teams; academia and academics generally don’t. The false values of impact factor and the false values of academic publishing mean that open access is a poor reflection of open, or what you may recognize as the open source way.” Read more

Symplectic Becomes the First DuraSpace Registered Service Provider for the VIVO Project

Research Information recently reported, “Symplectic Limited, a software company specialising in developing, implementing, and integrating research information systems, has become the first DuraSpace Registered Service Provider (RSP) for the VIVO Project. VIVO is an open-source, open-ontology, open-process platform for hosting information about the interests, activities and accomplishments of scientists and scholars. VIVO aims to support open development and integration of science and scholarship through simple, standard semantic web technologies.” Read more

Setting Government Data Free

As July 4 approaches, the subject of open government data can’t help but be on many U.S. citizens’ minds. That includes the citizens who are responsible for opening up that data to their fellow Americans. They might want to take a look at NuCivic Data Enterprise, the recently unveiled cloud-based, open source, open data platform for government from NuCivic, in partnership with Acquia and Carahsoft. It’s providing agencies an OpenSaaS approach to meeting open data mandates to publish and share datasets online, based on the Drupal open source content management system.

NuCivic’s open source DKAN Drupal distribution provides the core data management components for the NuCivic Data platform; it was recognized last week as a grand prize winner for Amazon Web Services’ Global City on a Cloud Innovation Challenge in the Partner in Innovation category. Projects in this category had to demonstrate that the application solves a particular challenge faced by local government entities. As part of the award, the NuCivic team gets $25,000 in AWS services to further support its open data efforts.

Read more

HTTPA Will Let You Track How Your Private Data is Used

Larry Hardesty of the MIT News Office reports, “By now, most people feel comfortable conducting financial transactions on the Web. The cryptographic schemes that protect online banking and credit card purchases have proven their reliability over decades. As more of our data moves online, a more pressing concern may be its inadvertent misuse by people authorized to access it. Every month seems to bring another story of private information accidentally leaked by governmental agencies or vendors of digital products or services. At the same time, tighter restrictions on access could undermine the whole point of sharing data. Coordination across agencies and providers could be the key to quality medical care; you may want your family to be able to share the pictures you post on a social-networking site.” Read more

Data.gov Turns Five

Nextgov reports, “When government technology leaders first described a public repository for government data sets more than five years ago, the vision wasn’t totally clear. ‘I just didn’t understand what they were talking about,’ said Marion Royal of the General Services Administration, describing his first introduction to the project. ‘I was thinking, “this is not going to work for a number of reasons.”’ A few minutes later, he was the project’s program director. He caught onto that vision, helped clarify it, and since then has worked with a small team to shepherd online and aggregate more than 100,000 data sets compiled and hosted by agencies across federal, state and local governments.” Read more

The Debate Over Net Neutrality

Jeff Sommer of the New York Times recently wrote, “The future of the Internet — which means the future of communications, culture, free speech and innovation — is up for grabs. The Federal Communications Commission is making decisions that may determine how open the Internet will be, who will profit most from it and whether start-ups will face new barriers that will make it harder for ideas to flourish. Tim Wu, 41, a law professor at Columbia University, isn’t a direct participant in the rule making, but he is influencing it. A dozen years ago, building on the work of more senior scholars, Mr. Wu developed a concept that is now a generally accepted norm. Called ‘net neutrality,’ short for network neutrality, it is essentially this: The cable and telephone companies that control important parts of the plumbing of the Internet shouldn’t restrict how the rest of us use it.” Read more

Tax Time And The IRS Is On Our Minds

Have you checked out the IRS Tax Map this year? If not, what better way to spend April 15 (aside from actually filing those returns, of course)?

The IRS Tax Map, as explained here, began in 2002 as a prototype to address the business need of the agency’s call center workers for improved access to technical tax law information. These days, Tax Map is available to taxpayers as well, offering them topic-oriented access to the IRS’s diverse information products. It aims to deliver semantic integration via the Topic Maps international standard (ISO/IEC 13250), grouping information about subjects, including those referred to by diverse names, in a single place.

It was created for the IRS by Infoloom in cooperation with Plexus Scientific and Coolheads Consulting. Infoloom explains on its web site that its topic map approach lets customers control what search queries return: they can extract information on the topics they need from existing content, without having to build a taxonomy of terms, and add specific knowledge to that information as part of the extraction process.
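The core idea of the Topic Maps approach described above is that one subject may be referred to by several names, and all information about it should merge into a single topic entry. A minimal sketch in Python illustrates this merging; the tax-topic names and publication titles here are hypothetical examples, not the actual IRS Tax Map data or implementation:

```python
from collections import defaultdict

# A subject may have several name variants; topic-map merging resolves
# each variant to one canonical topic id (illustrative mapping only).
NAME_VARIANTS = {
    "Earned Income Credit": "earned-income-credit",
    "EIC": "earned-income-credit",
    "Standard Deduction": "standard-deduction",
}

def build_topic_map(occurrences):
    """Group (name, resource) pairs into topics keyed by canonical id."""
    topics = defaultdict(lambda: {"names": set(), "occurrences": []})
    for name, resource in occurrences:
        topic_id = NAME_VARIANTS.get(name, name.lower().replace(" ", "-"))
        topics[topic_id]["names"].add(name)
        topics[topic_id]["occurrences"].append(resource)
    return dict(topics)

# Documents filed under different names for the same subject end up
# grouped in a single topic entry.
docs = [
    ("EIC", "Publication 596"),
    ("Earned Income Credit", "Form 1040 instructions"),
    ("Standard Deduction", "Publication 501"),
]
tmap = build_topic_map(docs)
```

Here both “EIC” and “Earned Income Credit” resolve to one topic holding two occurrences, which is the single-place grouping the standard is after.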

Read more

Rewarding Improved Access to Linked Data

A new paper in the Semantic Web journal proposes a rating system, Five Stars of Linked Data Vocabulary Use. The paper was written by Krzysztof Janowicz, Pascal Hitzler, Benjamin Adams, Dave Kolas, and Charles Vardeman II. The abstract states, “In 2010 Tim Berners-Lee introduced a 5 star rating to his Linked Data design issues page to encourage data publishers along the road to good Linked Data. What makes the star rating so effective is its simplicity, clarity, and a pinch of psychology — is your data 5 star?” Read more
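For context, Berners-Lee’s original scheme awards cumulative stars: one for data on the web under an open license, two for machine-readable structured data, three for a non-proprietary format, four for using URIs to identify things, and five for linking to other data. A minimal sketch of that original rating (not the vocabulary-use extension the paper proposes; the flag names are made up for illustration):

```python
def star_rating(dataset):
    """Return the 5-star Linked Data rating for a dataset described by flags.

    Stars are cumulative: a dataset cannot earn a star without also
    meeting every lower star's criterion.
    """
    checks = [
        dataset.get("open_license"),         # 1: on the web, open license
        dataset.get("machine_readable"),     # 2: structured data, not a scan
        dataset.get("non_proprietary"),      # 3: open format, e.g. CSV
        dataset.get("uses_uris"),            # 4: URIs identify things
        dataset.get("links_to_other_data"),  # 5: linked to other data
    ]
    stars = 0
    for passed in checks:
        if not passed:
            break
        stars += 1
    return stars
```

A spreadsheet published under an open license, for example, would stop at two stars until it is also offered in a non-proprietary format.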

NEXT PAGE >>