Crowdsourcing, Citizen Science & INSPIRE
Muki Haklay & Claire Ellul
Extreme Citizen Science (ExCiteS) research group, UCL
@mhaklay / @UCL_ExCiteS
Outline
Three eras of environmental information:
By experts, for experts (1969-1992)
By experts, for experts & the public (1992-2012)
By experts & the public, for experts & the public (2012 on)
Crowdsourced geographic information & citizen science
Challenges within the INSPIRE framework: case study
First era: 1969 [1987-92]
[Diagram: information flows between Experts, the Public and Decision Makers]
http://wp.me/p7dnf-gx
First era: 1969 [1987-92]
Experts responsible for creating environmental information and using it to advise government
Top-down attitude to environmental decision making
"Information Deficit" model towards the public
Environmental information by experts, for experts
http://wp.me/p7dnf-gx
Second era: 1992 [2005-12]
http://wp.me/p7dnf-gx
Second era: 1992 [2005-12]
Rio Principle 10, Aarhus Convention
Public access to environmental information is a prerequisite to participation; civil society organisations act as intermediaries
The Web as the dissemination medium
Information by experts, for experts and the public (but in expert form)
http://wp.me/p7dnf-gx
Third era: Since 2012
http://wp.me/p7dnf-gx
Third era & INSPIRE
2006 Crowdsourcing/VGI Nick Black
Mapping parties Nick Black
2014
Billy Brown
2008: More than maps
Prof. Jacquie McGlade, head of the European Environment Agency, 2008 (Aarhus + 10): "Often the best information comes from those who are closest to it, and it is important we harness this local knowledge if we are to tackle climate change adequately ... people are encouraged to give their own opinion on the quality of the beach and water, to supplement the official information."
2008 EEA WaterWatch
Citizen science
While citizen science has a long history, new forms emerged in the past decades, facilitated by the web ("citizen cyberscience")
Types: biodiversity/conservation observation recording; volunteer computing; volunteer thinking; Do-It-Yourself (DIY) science; community/civic science
See Haklay, M., 2013, Citizen Science and Volunteered Geographic Information: overview and typology of participation, in Crowdsourcing Geographic Knowledge
More information at http://publiclaboratory.org
2008 Air Quality Source: West Wiltshire
2008
2012 Mapping for Change
June 2012
June 2013
EEA Work Programme 2014-18
As part of Strategic Area 3 activities: "to widen and deepen the European knowledge base by developing communities of practice and engaging in partnerships with stakeholders beyond Eionet, such as business and research communities, Civil Society Organisations (CSO), and initiatives concerning lay, local and traditional knowledge and citizen science"
Data Quality Assurance
Crowdsourcing: the number of people that edited the information
Social: gatekeepers and moderators
Geographic: broader geographic knowledge
Domain knowledge: the knowledge domain of the information
Instrumental observation: technology-based calibration
Process oriented: following a procedure
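None of the code below comes from the talk; it is a toy sketch of how these six signals could be folded into a single fitness indicator. Every field name, weight and threshold is a hypothetical assumption for illustration only:

```python
# Toy fitness score combining the QA approaches above. All field names,
# weights and the 0-1 normalisation are hypothetical, for illustration only.

def quality_score(record):
    """Combine simple proxies for each quality-assurance approach."""
    score = 0.0
    # Crowdsourcing: more independent editors means more eyes on the data
    score += min(record.get("editor_count", 0), 10) / 10
    # Social: has a gatekeeper or moderator approved the record?
    score += 1.0 if record.get("moderated") else 0.0
    # Geographic: does the contributor have local knowledge of the area?
    score += 1.0 if record.get("local_contributor") else 0.0
    # Domain knowledge: self-declared expertise of the contributor
    score += {"novice": 0.2, "amateur": 0.5, "expert": 1.0}.get(
        record.get("contributor_level", "novice"), 0.2)
    # Instrumental observation: was a calibrated instrument used?
    score += 1.0 if record.get("calibrated_sensor") else 0.0
    # Process oriented: did the contributor follow the survey protocol?
    score += 1.0 if record.get("protocol_followed") else 0.0
    return score / 6  # normalise to the 0..1 range

observation = {"editor_count": 4, "moderated": True, "local_contributor": True,
               "contributor_level": "amateur", "protocol_followed": True}
print(f"{quality_score(observation):.2f}")  # 0.65
```

In practice the weights would need calibrating against ground truth, and a single number hides which signal is weak; the point is only that each approach yields a measurable proxy.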
Citizen Science & Metadata: The Challenge
Increasing creation and user base of spatial data: open data movement, FOSS4G software
Lack of expertise: users may come from a variety of backgrounds, and don't have GIS training or an understanding of spatial data quality aspects
Limitations (production): metadata standards are producer-centric: complex; no guidelines as to the amount of detail required; difficult to understand; held in non-standard formats, e.g. PDF, website, wiki
Citizen Science & Metadata
Limitations (data and metadata de-coupled): metadata not updated automatically when data changes; metadata capture not integrated into the workflow
Some citizen science projects do capture metadata, by accident rather than design, to meet a specific research aim
Metadata not used by end users of data, with a consequent lack of understanding of data quality & fitness for purpose
Limitations (use): metadata presentation is not use-focused: what do people need to know to re-use data and combine it with other sources?
Does not keep track of derived data, or of record-, attribute- or geometry-level updates. But is this level of detail useful?
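One way to address the de-coupling noted above is to regenerate a small, user-focused metadata record every time the data is written. The sketch below assumes a simple CSV workflow and a JSON side-car file; the schema and all field names are hypothetical, not ISO 19115 or INSPIRE metadata:

```python
# Sketch of metadata capture built into the save step, so data and metadata
# cannot drift apart. The side-car schema is a hypothetical, user-focused
# subset, not ISO 19115 / INSPIRE metadata.

import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def save_with_metadata(path, rows, collection_method, contact):
    """Write observations as CSV and regenerate the metadata side-car."""
    data_file = Path(path)
    data_file.write_text("\n".join(",".join(map(str, r)) for r in rows))
    sidecar = {
        "updated": datetime.now(timezone.utc).isoformat(),
        "record_count": len(rows),
        "collection_method": collection_method,  # aids fitness-for-purpose judgements
        "contact": contact,
        "sha256": hashlib.sha256(data_file.read_bytes()).hexdigest(),
    }
    data_file.with_suffix(".meta.json").write_text(json.dumps(sidecar, indent=2))

# Hypothetical usage: one smartphone noise reading near Deptford
save_with_metadata("noise_deptford.csv", [(51.480, -0.026, 72.5)],
                   collection_method="smartphone app, uncalibrated microphone",
                   contact="project@example.org")
```

Because the side-car is rewritten on every save, the update date, record count and checksum can never go stale, and the "collection method" field answers the re-use question a consultant would actually ask.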
Case Study: Noise Data
Scenario: you are an environmental consultant wanting noise information about Deptford in South London for a project in the area. You have no specific GIS expertise or training.
You identify three maps of noise, all online; ideally you'd like to re-use the data to save capturing more.
Do you choose one, or take information from many? How do you make the choice?
Dataset A
Dataset B
Dataset C
Integrating Noise Data
Just by looking at the maps:
Is there data for the areas I am interested in? There is some data for Deptford in all three maps.
Does the data cover the entire area of interest, or are there gaps in the data? Maps C and A both have gaps.
What is the dB(A) range covered by the maps? Does it cover really loud noise? Yes, all datasets cover noise over 70 dB(A), although the resolution differs (but what about the underlying data?)
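The same three questions could be scripted rather than eyeballed. The sketch below assumes each dataset was exported as GeoJSON with a noise attribute column "db_a"; the file names, the column name and the Deptford bounding box are all illustrative, not from the case study:

```python
# The "just by looking" checks, done programmatically. File names, the
# "db_a" column and the bounding box are illustrative assumptions.

import geopandas as gpd
from shapely.geometry import box

deptford = box(-0.045, 51.47, -0.01, 51.49)  # rough WGS84 box around Deptford

for name in ("dataset_a.geojson", "dataset_b.geojson", "dataset_c.geojson"):
    gdf = gpd.read_file(name).to_crs(epsg=4326)
    local = gdf[gdf.intersects(deptford)]
    if local.empty:
        print(f"{name}: no data for Deptford")
        continue
    # Crude gap check: what fraction of the box do the features cover?
    covered = local.unary_union.intersection(deptford).area / deptford.area
    print(f"{name}: {len(local)} features, ~{covered:.0%} of the box covered, "
          f"{local['db_a'].min():.0f}-{local['db_a'].max():.0f} dB(A)")
```

Even this crude script answers coverage, gaps and dB(A) range; what it cannot answer is how the data were collected, which is where the websites (and metadata) come in.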
Integrating Noise Data: things you can learn from the websites (Dataset B)
Integrating Noise Data: things you can learn from the websites (Dataset C)
Hidden Information
Crowdsourced Geographic Information in Government
29 case studies from across the world
Success factors, challenges and lessons
crowdgov.wordpress.com
Jo Somerfield / Kathmandu Living Labs
Summary
Citizen-produced environmental information is on the rise and will continue to increase
Characteristics: heterogeneity, temporal & spatial variability, diverse sources
Different procedures, organisational structures and practices demand consideration of data management and curation