How citizens have become sensors
This piece was originally featured in "City 2.0: The Habitat of the Future and How to Get There." “Citizens as sensors” has been reprinted with permission from TED Books (TED Books and the Atlantic Cities, 2013).
Our cities are talking, and we’re talking back.
Today, the data of a city is flowing constantly — across intersections, over power lines, aboard buses and through the water pipes beneath our streets — and it’s being quantified and scrutinized more than ever. The data is endless. And although it has always been there, only recently have we figured out ways to use it. Cities now examine data (or, more accurately, a computer-aided interpretation of it) to anticipate spikes in electricity demand or to synchronize traffic signals to prevent traffic snarls. In the urban context, information feedback loops are making it easier for us to optimize the ways our cities work. The so-called smart city we all keep hearing about is being built on a fundamental premise: With enough data and powerful analytical tools, we can make our cities better.
Smartphones put this process into overdrive; they’re already collecting and sharing the sort of information that will underpin the future smart city. We can get directions, learn transit times or find parking thanks to our Internet-equipped mobile phones. As smartphones advance technologically, their owners will become — knowingly or not — integral nodes in this vast and dynamic network.
Indeed, the citizen sensor is already reality. Through check-in applications like Foursquare and geotagging photo tools like Instagram, we’re able to provide fairly precise data about ourselves, what we’re doing and where we are. The utility here may be more for social purposes than anything else, but it’s important to recognize that we apparently have no problem actively providing information for others to consume. We’re comfortable being the data source.
Perhaps more important, we’re also comfortable being passive data sources — sharing our whereabouts, for instance, without knowing we’re doing so. The most ubiquitous example is the traffic layer on Google Maps. This tool allows users to overlay real-time information about the traffic volumes on major roads. It works by passively — and anonymously — collecting GPS and motion data from phones running the Android operating system, enabled by those terms and conditions people agree to when first starting up service. The result is surprisingly accurate and can help you avoid a gnarly traffic jam en route to work. Yet passive data collection can do so much more.
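The mechanics behind a tool like the traffic layer can be pictured simply: many phones anonymously report their speed on a stretch of road, and a server averages those reports to classify congestion. The sketch below is a hypothetical illustration of that aggregation step — the segment names, thresholds and function are assumptions for clarity, not Google's actual pipeline.

```python
# Hypothetical sketch of crowd-sourced traffic sensing: each phone
# anonymously contributes (road_segment, speed); the server averages
# speeds per segment and classifies congestion. Names and thresholds
# are illustrative assumptions.

from collections import defaultdict

def classify_congestion(reports):
    """reports: iterable of (segment_id, speed_mph) tuples from many phones."""
    speeds = defaultdict(list)
    for segment, speed in reports:
        speeds[segment].append(speed)
    status = {}
    for segment, vals in speeds.items():
        avg = sum(vals) / len(vals)
        if avg >= 45:
            status[segment] = "free-flowing"
        elif avg >= 20:
            status[segment] = "slow"
        else:
            status[segment] = "jammed"
    return status

# Three anonymous phones, two road segments:
reports = [("I-90-exit-18", 12), ("I-90-exit-18", 9), ("elm-st", 33)]
print(classify_congestion(reports))  # {'I-90-exit-18': 'jammed', 'elm-st': 'slow'}
```

Because only aggregated averages are kept, no individual driver's trace needs to survive the computation — which is roughly how such systems stay anonymous.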
Drivers in Boston are employing the GPS and accelerometer sensors in their smartphones to detect and map potholes throughout the city — and doing so with hardly any explicit human action. The application is called Street Bump, and when installed and run on phones in traveling vehicles, it automatically monitors jolts and bumps to identify possible problem areas on city streets. Once the app is running, the driver does nothing. The phone is the one paying attention, mapping the location of likely potholes and road bumps, and sharing that data with the city.
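The core idea — watch the accelerometer for sharp vertical jolts and geotag them — can be sketched in a few lines. Street Bump's real algorithm and thresholds aren't reproduced here; the function name, sample format and 1.5 g cutoff below are assumptions for illustration.

```python
# Illustrative pothole detection from accelerometer samples, in the
# spirit of Street Bump. The threshold and data format are assumptions,
# not the app's actual implementation.

def detect_bumps(samples, threshold_g=1.5):
    """samples: list of (lat, lon, vertical_accel_g) readings.
    Flags locations where vertical acceleration spikes past the
    threshold, which may indicate a pothole worth reporting."""
    return [(lat, lon) for lat, lon, g in samples if abs(g) > threshold_g]

samples = [
    (42.3601, -71.0589, 0.2),   # smooth pavement
    (42.3605, -71.0592, 2.1),   # sharp jolt: likely pothole
    (42.3610, -71.0601, 0.4),
]
print(detect_bumps(samples))  # [(42.3605, -71.0592)]
```

In practice an app would also filter out false positives — speed bumps, train tracks, a phone dropped on the seat — typically by requiring multiple drivers to register a jolt at the same spot before reporting it.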
Moreover, as the phones we carry become mobile data collection units, they’ll provide not only commonplace information — like potholes and traffic jams — but also data that could save lives. Consider the mobile air-quality monitoring systems that researchers at the University of California, San Diego are developing. Once the technology is scaled down to fit inside a cellphone, it could provide localized reports on air quality to help people suffering from allergies and asthma.
Sometimes we’ll have to do very little to share this information with the world. Other times we’ll eagerly go out of our way to make it available, as users do with Hollaback, a smartphone app for tracking street harassment. Hollaback lets people instantly submit geotagged reports of harassment, with maps and a description and even photos of the perpetrators. Now, when a leering pervert catcalls a woman passing by, she can quickly turn around, snap a photo, and within half a minute post the man’s mugshot and misdeed online for the world to see.
The cameras, gyroscopes, microphones and accelerometers in phones are documenting the wonders and horrors of urban life while also parsing data to paint a vivid picture of the city we couldn’t otherwise see. The possibilities abound. A sensor in your phone could detect elevated carbon monoxide readings at a friend’s house you are visiting and automatically notify the fire department. An app might warn you when you’re driving into an area that has had a spate of carjackings and advise a safer route, or detect the sound of gunshots in your neighborhood and alert authorities.
Ultimately, it won’t be just our phones collecting and interfacing with the information around us. Data sensors will be embedded in streets, buildings and even in cars and transit vehicles. Building such a network could bring about what many have called the Internet of Things — a concept that foresees the ability of physical objects and people to communicate and share information. A water pipe could tell a central computer that it’s about to fracture. A road could communicate with a streetlight to tell it that, after hours of sitting dark, it will need to illuminate for a car heading in its direction. An apartment building could determine that an oven is still on after the resident leaves for work and could switch it off.
Much of the progress being made toward the Internet of Things has occurred in transportation. In the very near future, cars will communicate with other cars to improve the safety and flow of traffic. Google’s self-driving car is one high-profile example, but another project, run by the U.S. National Highway Traffic Safety Administration, is a more likely predictor of where this concept could go. The NHTSA recently launched a yearlong test enabling 2,800 cars in Ann Arbor, Mich., to communicate directly with one another. They have short-range radio devices installed that constantly let each other know exactly where they are. Through visuals in the dashboard and physical cues like vibrating seats, the test cars are able to notify their drivers when they’re veering out of a lane too close to other cars or driving too fast toward stopped cars ahead. The goal of the test is to determine how well this technology can increase road safety, eventually paving the way for a requirement that all automakers install this type of car-to-car communication capability.
For now, the sensors of choice for the smart city are our smartphones. Researchers at the Massachusetts Institute of Technology’s Senseable City Lab have been exploring the potential of various mobile devices for nearly a decade. One effort tapped into anonymized cellphone data to map the movement of people throughout Rome on a given night. Another used mobile phones and other urban sensor data to create a real-time map of Singapore that was able to tell people detailed information about their surroundings, such as which nearby restaurants were busiest and which transportation routes were more congested than usual. These projects provide a rare and comprehensive look at the ways people interact with their city. This type of data could fuel all kinds of analyses, from identifying traffic bottlenecks to determining a good place to open a bar.
There’s an undeniable Big Brother element in a lot of these concepts. Do we really want our phones to spy on everything we do and everything that happens around us? Probably not. There’s also the question of who has access to this data, how much they can know about each data point, and whether the data can be owned — all questions that must be resolved in the near future. But for all the worrisome prospects, there’s much to gain from the ability of our phones to collect, analyze and share the data points that surround us.
That power can translate into dramatic changes in the way we live in urban areas and the way governments improve cities and engage with residents. While some cities have adopted so-called Government 2.0 principles by putting records online, open for all to see, much of the information is banal and not often real-time: building permits, crime reports and the like. There’s currently no central database that collects live information from all the mobile phones and other sensors within a city. Yet that reality may not be far off. Several large-scale technology and computer companies, including IBM and Cisco, are hoping to develop just such a tool. This central hub, rich with data, could enhance how we understand our cities and ourselves.
Meanwhile, smartphones — both actively and passively — will continue to engage with the urban environment to foster a conversation between city and citizens that is mutually beneficial. Such citizen sensors will help build intelligence into almost all facets of life in the city. In this looming future, your phone will know how fast you’re walking and where you’re going. It will know that your bus is late. It will notify the local utility about your neighbor’s gas leak. Your city will learn from the data you provide and will adapt accordingly. And you will learn as well, tapping into the data you and your city are working together to collect. This collaboration will weave an intelligence into the urban experience that improves life in remarkable ways. The city will be not only smarter but also better.