Sarah Dimick
Technology and Innovation
On June 18, Prime Minister Stephen Harper adopted the G8 Open Data Charter at the Leaders Summit and announced the accompanying launch of the next-generation open data portal at data.gc.ca.
You might assume that as a researcher, especially a researcher focused on technology and innovation, I’d be excited about this announcement … and I am, at least in principle. More access to more data has the potential to make my job not only easier but better: if I can access and analyze more data, I’ll have more accurate results. The possibilities of open data seem endless, with myriad applications.
The use of open data can help in a natural disaster, such as the real-time collaborative map development undertaken for Port-au-Prince, Haiti, after the earthquake in January 2010. Before the earthquake, maps of the city depicted major highways, the outlines of cities, and very little else, so crisis responders had little to go on when planning their response. Within two days of the earthquake, maps complete with roads, buildings, and even refugee camps had been developed.
Open data is also being applied closer to home, harnessed in applications that make information about our communities more accessible. Ottawa recently announced the winners in its 2013 Apps4Ottawa contest, which saw a range of apps developed using at least one data set from the City of Ottawa Open Data catalogue. Winning submissions included Ottawa’s Heart, which maps the city’s defibrillators, and multiple transit-focused apps.
The vast potential of applications that harness open data and distill the information most important to users can leave one’s head swimming. It’s exciting, and it’s easy to get swept up in that excitement. As a researcher I’m always looking for more data, more inputs for my work, and open data promises just that.
Can all of this open data be too much of a good thing? Does the sheer quantity of data available make finding useful information akin to finding a needle in a haystack? Sifting through and assessing this vast quantity of open data takes work, a lot of work. If the general population, the non-developers among us, can’t do that with ease, then are the data really open?
Perhaps the prevalence of city-based open data applications owes a lot to developers being able to work with smaller data sets that don’t need to be aligned across different collection or sorting methods. One of the key principles of the G8 Open Data Charter is that the data be usable by all. Usable by all implies that the data must be accessible, and this goes further than just putting the numbers out there somewhere on a portal. Data need to be organized, streamlined, and sorted to be usable. Additionally, data from different sources need to be organized along the same principles if they are to be integrated into the same applications. This “usability” principle puts significant onus on open data providers to coordinate their efforts amongst themselves and to do a fair bit of upfront data “clean up” work.
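To make the “clean up” problem concrete, here is a minimal sketch, with entirely hypothetical field names and values, of what a developer faces when two cities publish the same kind of information under different schemas. Only after both sources are mapped onto one common structure can an application treat them as a single data set:

```python
# Hypothetical example: two cities publish bicycle-parking locations,
# but under different field names and coordinate conventions.
city_a = [
    {"Name": "Rack 1", "Lat": 45.4215, "Long": -75.6972},
    {"Name": "Rack 2", "Lat": 45.4100, "Long": -75.7000},
]
city_b = [
    # City B flattens coordinates into a single "location" string.
    {"site": "Stand A", "location": "43.6532,-79.3832"},
]

def normalize_a(record):
    # City A already uses separate numeric fields; just rename them.
    return {"name": record["Name"], "lat": record["Lat"], "lon": record["Long"]}

def normalize_b(record):
    # City B's coordinates must be split apart and converted to numbers.
    lat, lon = record["location"].split(",")
    return {"name": record["site"], "lat": float(lat), "lon": float(lon)}

# One shared schema, usable by a single application.
combined = [normalize_a(r) for r in city_a] + [normalize_b(r) for r in city_b]
print(len(combined))  # 3 records
```

Multiply this small exercise across dozens of data sets and jurisdictions, and the scale of the coordination work the charter implies becomes clear.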
Governments around the world are busy implementing their second- or third-generation open data environments. If they fully apply the “usability” principle to this enormous volume of data, the potential of open data truly will be extensive. Looking forward, the really impressive open data will be that which is not only online but also accessible to everyone, and the furthest-reaching applications will be the ones able to synthesize data from multiple jurisdictions, move seamlessly across jurisdictional boundaries, and provide all of us “non-developer” types with the information we need.
Office of the Prime Minister, “Open Data,” June 18, 2013. http://pm.gc.ca/eng/media.asp?id=5545
Martin Lucas-Smith, “Using Open Data and Crowdsourcing to Develop CycleStreets.” http://www.slideshare.net/cyclestreets/talk-using-open-data-and-crowdsourcing-to-develop-cyclestreets
City of Ottawa, Apps4Ottawa. http://apps4ottawa.ca/en