Does data spell relief for congested cities?
Can data kill your pain? The city of Los Angeles is hoping it will, at least where some data sources are concerned.
In May, the city launched a new DataLA site that features data downloads on topics such as crime statistics and budget information, as well as easy-to-understand visualizations of key metrics at a separate portal called PerformanceLAcity.
A June hackathon encouraged developers to take these datasets and create solutions that improve city life. Projects focused on affordable housing, public transit, and — spurred by a devastating statewide drought — apps to report water waste.
Code for America pursues similar goals on a national scale, having organized hackathons in more than 130 U.S. cities so far; its fourth annual Summit took place in late September in San Francisco. The non-profit organization pairs software developers, user interface designers and data enthusiasts with projects that rethink and redesign existing civic processes to improve productivity, experiences and satisfaction.
Traffic hacks with ATSAC
For many cities around the world, one of the most intractable problems is traffic congestion. It's certainly one of the biggest problems for L.A., where 65 percent of commuters are solo travelers. This sprawling metropolis, which installed the world's first traffic lights in 1924, has ambitious hopes for innovative solutions based on its traffic data.
The data is collected by the Automated Traffic Surveillance and Control (ATSAC) system and city parking management systems. ATSAC, first rolled out to manage signal timing on the streets surrounding venues for the 1984 Olympic Games, is now deployed citywide at more than 4,400 signalized intersections. Street sensors monitor vehicle passage, speed and congestion in one-second increments, and this real-time data gives the ATSAC operations center the situational awareness to adjust traffic signal timings and reduce congestion. The system has delivered measurable benefits, most notably in travel times, CO2 emissions and fuel use. Any concomitant reduction in road rage hasn't been tracked; that's not as easy to measure.
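The control loop described above — sample detectors at one-second resolution, estimate congestion, then nudge signal timing — can be sketched in a few lines of Python. Everything below (function names, occupancy thresholds, the adjustment rule) is a hypothetical illustration, not the city's actual algorithm, which is far more sophisticated.

```python
def occupancy(samples):
    """Fraction of one-second samples in which the loop detector was
    occupied by a vehicle -- a standard congestion proxy."""
    return sum(samples) / len(samples)

def adjust_green_time(current_green, samples, min_green=15, max_green=60):
    """Lengthen the green phase when the approach is congested,
    shorten it when traffic is light (hypothetical thresholds)."""
    occ = occupancy(samples)
    if occ > 0.6:       # heavy congestion: add green time
        current_green += 5
    elif occ < 0.2:     # light traffic: give time back to the cross street
        current_green -= 5
    return max(min_green, min(max_green, current_green))

# One minute of one-second detector readings (1 = vehicle present).
busy = [1] * 45 + [0] * 15    # 75 percent occupancy
quiet = [1] * 6 + [0] * 54    # 10 percent occupancy

print(adjust_green_time(30, busy))   # congested approach gets more green
print(adjust_green_time(30, quiet))  # quiet approach gives time back
```

The interesting engineering is in everything this sketch omits: coordinating adjustments across thousands of intersections so that fixing one queue doesn't simply move it down the block.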
On Sept. 22, the city published a Request for Information focused on that real-time ATSAC data. The objective is to learn who is interested in the data and what new information and valuable services can be derived from it.
Smarter grids, smarter decisions
Imagine if electric and water utilities operated this way. If meter data, properly anonymized and aggregated into data sets to protect privacy, were available for hackathons, more feasible solutions for residential rentals and multi-family housing might pop up — two markets sorely underserved by existing home energy management applications.
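One way to "properly anonymize and aggregate" meter data before handing it to a hackathon is to publish only group-level statistics and suppress any group smaller than a minimum size, a simple k-anonymity-style rule. The sketch below is a hypothetical illustration of that idea, not any utility's actual release process; the ZIP codes and the group-size threshold are invented for the example.

```python
from collections import defaultdict

def aggregate_meter_data(readings, min_group_size=5):
    """Roll per-meter kWh readings up to ZIP-code level, suppressing any
    ZIP with fewer than `min_group_size` meters so that small groups
    cannot be re-identified. A simplified, hypothetical rule."""
    groups = defaultdict(list)
    for zip_code, kwh in readings:
        groups[zip_code].append(kwh)
    return {
        z: {"meters": len(vals), "avg_kwh": sum(vals) / len(vals)}
        for z, vals in groups.items()
        if len(vals) >= min_group_size
    }

readings = [("90012", 30.0), ("90012", 34.0), ("90012", 26.0),
            ("90012", 40.0), ("90012", 20.0),   # five meters: published
            ("90013", 55.0)]                    # one meter: suppressed
print(aggregate_meter_data(readings))
```

Real releases would need more care than this (time-of-use data can re-identify households even in aggregate), but even a crude rule like suppression of small groups would let utilities put usable datasets in front of outside developers.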
The federal Green Button initiative has sponsored and participated in hackathons, most recently in August in San Francisco and in September at the KTH Royal Institute of Technology in Stockholm, Sweden. Kudos to the organizers, sponsors and participants of these events, which take existing energy datasets and build new applications to address the challenges posed. It would be interesting to see utilities themselves get engaged in hackathons; one starting point would be to consider what types of data and datasets could be made available to address a wide range of their challenges.
Leaders engaged in smart city initiatives acknowledge that they don't have all the answers when it comes to data manipulation and analysis, and welcome outside help via hackathons to optimize infrastructure, enhance services and improve civic life. Could similar activities help utilities engaged in smart grid initiatives ensure that they are getting the most from their data? Plenty of pain points could be addressed with intelligent data visualizations and analytics. Maybe expanding the pool of solution contributors could accelerate development and deployment of painkillers.