Tech, climate change, big data, and making a difference

A while ago I wrote about the challenges of writing a tech blog about apps and gadgets while world-altering events are going on. This came into focus this week after the president’s withdrawal from the Paris Accord and the ensuing conversation. Then, surprisingly, commitments to support the Accord poured in from cities, states, universities and companies around the US. Michael Bloomberg pledged to make up some of the $2 billion in lost funds toward climate action programs. He’s also “leading a coalition, made up of three states, dozens of cities, and 80 university presidents, that vows to uphold the Paris Agreement.”

Listening to a talk with Paul Hawken on this topic this week educated me a bit more about what the Paris Accord really means and what Princeton’s Carbon Mitigation Initiative set out to do. By adopting 15 different strategies aimed at reducing carbon emissions, and meeting the goals set out in them, we could avoid some of the more disastrous consequences of global warming. Yet of the 15, says Mr. Hawken, 11 are aimed at larger corporations and utilities. The only actions relevant to individuals were to drive less and install solar power. This is what he set out to change, and he came up with a way to “map, measure and model the 100 most substantive ways to reduce global warming.”

Mapping air quality at a block-by-block level. (Source: Google)

The interesting takeaway from this for me is that maybe there is more that tech can do with the “map, measure, and model” part of the equation. After all, collecting and analyzing data is their bread and butter. Google’s new pollution mapping initiative seems to be a step in the right direction. By attaching relatively cheap sensors to its Street View cars, which were out and about on city streets anyway, Google was able to create a street-by-street, block-by-block map of pollution levels in three cities, including Oakland. They then took a closer look at what the data points showed them on the map. In Oakland their analysis revealed quieter residential streets exposed to higher levels of pollution because of wind direction, and spots where vehicles accelerate. This gives the City of Oakland a way to understand how to prioritize public works projects if reducing pollution levels for residents is a priority. Says Google: “With nearly 3 million measurements and 14,000 miles captured in the course of a year, this is one of the largest air quality datasets ever published, and demonstrates the potential of neighborhood-level air quality mapping. This map makes the invisible, visible, so that we can breathe better and live healthier. It helps us understand how clean (or not clean) our air is, so that we can make changes to improve it.”
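The core idea behind a map like this is simple to sketch: take GPS-tagged readings from moving sensors, snap each one to a stretch of road, and summarize the readings per stretch. Here is a minimal toy version in Python. The sample readings, the NO2 values, and the ~100 m grid cells standing in for road segments are all my own illustrative assumptions, not Google’s actual method or data.

```python
from collections import defaultdict
from statistics import median

# Hypothetical mobile readings: (latitude, longitude, NO2 in ppb).
# Real data would come from car-mounted sensors logging continuously.
readings = [
    (37.8049, -122.2711, 18.2),
    (37.8050, -122.2712, 22.5),
    (37.8049, -122.2710, 19.9),
    (37.8101, -122.2650, 8.4),
    (37.8102, -122.2651, 9.1),
]

def segment_key(lat, lon, cell=0.001):
    """Snap a GPS fix to a roughly 100 m grid cell, a crude stand-in
    for matching the fix to a road segment."""
    return (round(lat / cell), round(lon / cell))

def aggregate(readings, cell=0.001):
    """Group readings by grid cell and take the median per cell; the
    median resists outliers from, say, one idling truck."""
    cells = defaultdict(list)
    for lat, lon, no2 in readings:
        cells[segment_key(lat, lon, cell)].append(no2)
    return {key: median(vals) for key, vals in cells.items()}

pollution_map = aggregate(readings)
for (lat_idx, lon_idx), no2 in sorted(pollution_map.items()):
    print(f"cell ({lat_idx}, {lon_idx}): median NO2 {no2:.2f} ppb")
```

With these sample points, the first three readings fall into one cell and the last two into another, so the map collapses five raw measurements into two street-level medians. Scaled up, the same group-and-summarize step is what turns millions of drive-by measurements into a block-by-block picture.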

In light of political change, it will be up to local entities, not the federal government, to take action on global warming. To do so they’ll need to collect and analyze many data points. Google’s mapping initiative shows that tech companies, especially those driven by location and mapping data, can help with this component relatively inexpensively, proving, perhaps, that change is possible from the bottom up after all.
