Nicole Laskowski of SearchCIO recently wrote, “When Brett Goldstein was appointed as Chicago’s first chief data officer (CDO) in May 2011, he found himself in the middle of a classic IT struggle. The city’s data was spread across the municipality and mired in silos, making it difficult to get a holistic view… That needed to change — in a hurry. The city was set to host the North Atlantic Treaty Organization (NATO) Summit in May 2012. The event would bring in heads of state — and throngs of protesters — to Chicago. Goldstein wanted to provide public safety officials with better ‘situational awareness,’ or the ability to understand what was happening in any given place at any given time. To do so, Goldstein, who became Chicago’s CDO/CIO in 2012, needed to break data out of silos in a cost-effective manner that didn’t require overhauling the city’s infrastructure.” Read more
A recent press release states, “Transforming our cities into the Smart Cities of the future will encompass incorporating technologies and key digital developments all linked by machine-to-machine (M2M) solutions and real-time data analytics which sit under the umbrella term of the Internet of Things. Smart cities however must be underpinned by the appropriate ICT infrastructure based on fibre optic and high-speed wireless technologies, which is well underway in many developed cities around the world. This infrastructure allows for the development of smart communities; supporting connected homes; intelligent transport systems; e-health; e-government and e-education; smart grids and smart energy solutions – just to name a few of the exciting solutions smart cities will incorporate. Many of the technological advancements emerging around the world today can, and will be, applied to smart cities. Artificial Intelligence; Electric Vehicles; Autonomous Vehicles; Mobile applications; Drones; Wearable and Smart devices and so on are just some of the key developments to watch.” Read more
Kat Megas of NSTIC recently wrote, “Among the questions we’re asked most frequently about NSTIC is: why are trusted identities good for business? The NSTIC pilots have collectively started to answer that question, highlighting how better privacy, security and convenience are enabling new online business models, and driving higher sales and profits. One of the better examples of this has been the work done by NSTIC pilot awardee ID.me. In 2013, ID.me received a $2.8M cooperative agreement from NIST to pilot its trusted identity solution, which enables members of the military community and their families, First Responders, and students to access exclusive benefits and services online both securely and efficiently without having to share sensitive information with the brands directly. While this easy-to-use and interoperable solution aligns with the NSTIC guidelines, it also benefits partner companies’ bottom line.”
As July 4 approaches, the subject of open government data can’t help but be on many U.S. citizens’ minds. That includes the citizens who are responsible for opening up that data to their fellow Americans. They might want to take a look at NuCivic Data Enterprise, the recently unveiled cloud-based, open source, open data platform for government from NuCivic, in partnership with Acquia and Carahsoft. It gives agencies an OpenSaaS approach to meeting open data mandates, publishing and sharing datasets online on top of the Drupal open source content management system.
NuCivic’s open source DKAN Drupal distribution provides the core data management components for the NuCivic Data platform; it was recognized last week as a grand prize winner in the Partner in Innovation category of Amazon Web Services’ Global City on a Cloud Innovation Challenge. Projects in this category had to demonstrate that their applications solve a particular challenge faced by local government entities. As part of the award, the NuCivic team gets $25,000 in AWS services to further support its open data efforts.
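Platforms like DKAN publish datasets through a JSON API so that developers can discover and reuse them programmatically. As an illustrative sketch only (the portal URL is a placeholder, and the response shape assumes a CKAN-style `package_list` action, a common convention on open data portals), listing a portal’s datasets might look like:

```python
import json
from urllib.request import urlopen

def parse_package_list(payload: str) -> list[str]:
    """Extract dataset names from a CKAN-style package_list JSON response."""
    body = json.loads(payload)
    if not body.get("success"):
        raise RuntimeError("API call failed: %s" % body.get("error"))
    return body["result"]

def list_datasets(portal_url: str) -> list[str]:
    """Fetch the names of every dataset published on a CKAN-compatible portal."""
    with urlopen(portal_url.rstrip("/") + "/api/3/action/package_list") as resp:
        return parse_package_list(resp.read().decode("utf-8"))

# Offline sample of the expected response shape (not live data):
sample = '{"success": true, "result": ["bike-lanes", "crime-reports-2014"]}'
print(parse_package_list(sample))  # → ['bike-lanes', 'crime-reports-2014']
```

Separating the parsing step from the network call keeps the response handling testable without a live portal.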
Shaunacy Ferro of Fast Company reports, “In 2011, researchers at the MIT Media Lab debuted Place Pulse, a website that served as a kind of ‘hot or not’ for cities. Given two Google Street View images culled from a select few cities including New York City and Boston, the site asked users to click on the one that seemed safer, more affluent, or more unique. The result was an empirical way to measure urban aesthetics. Now, that data is being used to predict what parts of cities feel the safest. StreetScore, a collaboration between the MIT Media Lab’s Macro Connections and Camera Culture groups, uses an algorithm to create a super high-resolution map of urban perceptions. The algorithmically generated data could one day be used to research the connection between urban perception and crime, as well as informing urban design decisions.” Read more
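Place Pulse’s raw data is a set of pairwise votes: users pick which of two street images looks safer. One simple way to convert such comparisons into a per-image score is a smoothed win rate. This is only an illustrative sketch, not the actual StreetScore method, which trains a predictive model on image features; the image names below are invented:

```python
from collections import defaultdict

def win_rate_scores(votes, smoothing=1.0):
    """Score items from pairwise votes, where each vote is (winner, loser).

    Returns a Laplace-smoothed win rate per item, in the range [0, 1],
    so items with few comparisons are pulled toward a neutral 0.5."""
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return {
        item: (wins[item] + smoothing) / (appearances[item] + 2 * smoothing)
        for item in appearances
    }

# Hypothetical votes: "img_a looks safer than img_b", etc.
votes = [("img_a", "img_b"), ("img_a", "img_c"), ("img_c", "img_b")]
scores = win_rate_scores(votes)
# img_a won both of its comparisons; img_b lost both.
assert scores["img_a"] > scores["img_c"] > scores["img_b"]
```

The smoothing term matters with crowdsourced data, since many images receive only a handful of votes.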
Jessica Leber of Co.Exist recently wrote, “Since opening in 2003, New York City’s pioneering 311 center for non-emergency questions and complaints has become a massive operation, handling an average of around 60,000 questions a day via phone, text message, website, and mobile app. That’s added up to more than 180 million queries to date, processed by the hundreds of real, live humans that staff a call center in Manhattan 24 hours a day, seven days a week. Yet it all seems rather antiquated at a time when a quick query to Siri or Google can almost instantly provide answers in other realms of life.” Read more
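The contrast the article draws, between a human-staffed call center and an instant automated answer, can be sketched with a minimal keyword matcher that routes a free-text question to a known FAQ topic. The topics and keywords below are invented for illustration; a real 311 knowledge base is far larger and would use more robust language processing:

```python
# Hypothetical FAQ entries; real 311 knowledge bases cover thousands of topics.
FAQ = {
    "trash pickup schedule": {"trash", "garbage", "pickup", "collection"},
    "pothole repair request": {"pothole", "road", "street", "damage"},
    "noise complaint": {"noise", "loud", "music", "party"},
}

def route_query(question: str) -> str:
    """Match a free-text question to the FAQ topic sharing the most keywords,
    falling back to a human operator when nothing matches."""
    words = set(question.lower().split())
    topic, overlap = max(
        ((t, len(words & kw)) for t, kw in FAQ.items()), key=lambda p: p[1]
    )
    return topic if overlap else "escalate to human operator"

print(route_query("when is garbage pickup on my street"))
# → trash pickup schedule
```

Queries that match no topic fall through to a human, mirroring how automated front ends typically sit in front of, rather than replace, a live call center.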
A new article out of Information Daily reports, “Milton Keynes may see driverless cars on its roads in 12-18 months, says Geoff Snelson, Strategy Director of MK Smart, the innovation programme being run in the city. The driverless two-person pods are one of the outputs of the MK Smart programme, which is a collaboration between a number of organisations including the Open University (which is located in Milton Keynes) and BT. Central to the project is the creation of the ‘MK Data Hub’, which will support the acquisition and management of vast amounts of data relevant to city systems from a variety of data sources. As well as transport data, these will include data about energy and water consumption, data acquired through satellite technology, social and economic datasets, and crowd-sourced data from social media or specialised apps. Building on the capability provided by the MK Data Hub, the project will innovate in the areas of transport, energy and water management, tackling key demand issues.” Read more
In the video below, Dr. James Melton, a Lecturer in Comparative Politics at University College London, gives a presentation on Constitute. Constitute is a new way to explore the constitutions of the world. The origins of the project date back to 2005 with the Comparative Constitutions Project, which has the stated goal of cataloging the contents of all constitutions written in independent states since 1789. To date, that work has resulted in a collection of 900+ constitutions and 2500+ amendments. A rigorous formal survey instrument comprising 669 questions was then applied to each of these “constitutional events,” producing the base data the team had to work with. Melton and his group wanted to create a system that allowed for open sharing of this information, not just with researchers but with anyone who wants to explore the world’s constitutions. They also needed the system to be flexible enough to handle changes, since, as Melton points out, “…roughly 15% of the countries in the world change their constitution every single year.”
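The survey instrument described above effectively turns each constitutional event into a structured record of coded answers, which is what makes the collection explorable by anyone, not just researchers. As a toy illustration of querying such records (the field names and values here are invented, not actual CCP codings; the real instrument covers 669 questions per document):

```python
# Toy records standing in for coded survey results; the values are made up
# for illustration and do not reflect actual CCP codings.
constitutions = [
    {"country": "Chile", "year": 1980, "judicial_review": True},
    {"country": "Japan", "year": 1947, "judicial_review": True},
    {"country": "France", "year": 1958, "judicial_review": False},
]

def with_provision(records, field):
    """Return, sorted, the countries whose record codes the given provision as present."""
    return sorted(r["country"] for r in records if r.get(field))

print(with_provision(constitutions, "judicial_review"))  # → ['Chile', 'Japan']
```

Coding every document against the same question set is what allows this kind of cross-country comparison at all.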
Joel Gurin of InformationWeek recently asked, “Will 2014 finally become the year of open data? We’re certainly seeing evidence that open data is moving from the margins into the mainstream, with new uses for data that governments and other sources are making freely available to the public. But if we’re going to see open data’s promise fulfilled, it will be important for governments, and the federal government in particular, to make it easier for the public to access and use their open data.” Read more