Land Surface Processes and Land Use Change Lex Comber ajc36@le.ac.uk
Land Surface Processes and Land Use Change
- Geographic objects in GIS databases
- Detecting land use change using multitemporal imaging techniques
- Uncertainty and land use change
- Land surface modelling and feedbacks to climate
- Land Cover
Remote sensing, land cover and uncertainty Lex Comber ajc36@le.ac.uk
Why bother with uncertainty?
- Uncertainty originates from many sources, so it is important to consider it
- When we map we want to make the best map possible, BUT there are other possible maps
- We may have to defend our assumptions and our maps, so we need to know what the uncertainties are!
Themes
- The mapping paradigm: what is it we are trying to do? Represent the real world; turn data into information. But there are choices in mapping
- What is really measured? Remotely sensed data; the pixel
- Uncertainty & error in remotely sensed data
- Semantics is also important: what is a Forest? A Bog? A Beach?
The Mapping Paradigm
- The real world is infinitely complex; all representations involve abstraction, aggregation, simplification, etc.
- Choices about representation depend on: commissioning variation (who paid for it?), observer variation (what do you see?), institutional variation (why do you see it?), representational variation (how do you record it?)
- Almost everything in geography is a matter of interpretation; the same processes may be recorded in different ways
- User awareness of these data aspects matters: a backdrop of methodological variation & change is the norm in GI. So how does this relate to remote sensing?
The Mapping Paradigm
Sensor > Raw data > Processed data > Land cover map
- Sensor characteristics: number of bands, swath width, return times, age, resolving power (scale)
- Corrections: atmospheric, geometric, calibration
- Classification: supervised vs. unsupervised, number of classes, statistics/biology, objectives of study, algorithm (MLC, Fuzzy, etc.)
- Choices at every stage, shaped by technological developments, changes in policy, environmental agencies and scientific understanding
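One of the algorithm choices listed above is the maximum likelihood classifier (MLC). A minimal sketch on synthetic data might look like the following; the class names, band values and training samples are all invented for illustration, not taken from any real study:

```python
# Minimal sketch of supervised maximum-likelihood classification (MLC).
# All spectra are synthetic; a real workflow would start from calibrated,
# corrected imagery and field-checked training areas.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training samples: two classes, two spectral bands (red, NIR)
water = rng.normal([0.05, 0.03], 0.01, size=(50, 2))
vegetation = rng.normal([0.08, 0.45], 0.02, size=(50, 2))

def gaussian_log_likelihood(x, samples):
    """Log-likelihood of pixel x under a Gaussian fitted to the samples."""
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    d = x - mu
    # Constant terms are dropped: only relative likelihoods decide the label
    return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.inv(cov) @ d)

pixel = np.array([0.07, 0.40])  # an unclassified pixel
scores = [gaussian_log_likelihood(pixel, s) for s in (water, vegetation)]
label = ["water", "vegetation"][int(np.argmax(scores))]
print(label)
```

Every choice here (number of classes, training data, the Gaussian assumption itself) is one of the sources of variation the slide lists.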
Big Points 1. Many choices when moving from Data to Information
What is really measured? What about the atmosphere? Projection? What do these mean for radiative transfer? For the pixel? http://www.discover-aai.com/technology/spectral.htm
What is really measured?
- Corrections: geometric, atmospheric. Different types of filter, seed points and projections produce variation in the data (before it's classified)
- The remote sensing assumption: radiative data Z collected on board remote sensing platforms in space can be interpreted quantitatively in terms of the variables of interest Y
- Verstraete et al. (1996) argued a very different point
What is really measured? Verstraete et al. (1996) argued that measurements Z in space are not controlled exclusively, or even directly, by the variables of interest Y at the surface, but by the state variables S of the radiative transfer: "A physical interpretation of electromagnetic measurements Z obtained from remote sensing can provide reliable quantitative information only on the radiative state variables S that control the emission of radiation from its source and its interaction with all intervening media and the detector." When did you see this written anywhere in the remote sensing literature?
Big Points 1. Many choices when moving from Data to Information. Choices result in variation. 2. But what does the data represent? The earth's surface, or the efficiency of transfer?
What is really measured? What is a pixel? The basic unit of remotely sensed data. Raw data: a true, objective fact?
What is really measured? What is a pixel? It imposes a division of space; the landscape in which we live may not fit it. Pixel classifications assume small areas are homogeneous: each pixel belongs to one land cover only. Fisher, P. (1997). The pixel: a snare and a delusion. International Journal of Remote Sensing, 18 (3): 679-685.
What is really measured? What does a pixel record?
What is really measured? Mixture of several surface types The resulting reflectance spectrum is an area weighted average of the constituent parts
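The area-weighted averaging of a mixed pixel can be written as a linear mixture model. The endmember spectra and cover fractions below are invented purely for illustration:

```python
# A mixed pixel's reflectance as the area-weighted average of its parts
# (linear spectral mixing). Endmember spectra and fractions are made up.
import numpy as np

# Hypothetical endmember reflectances in three bands (red, NIR, SWIR)
grass = np.array([0.08, 0.45, 0.25])
soil = np.array([0.20, 0.30, 0.35])
water = np.array([0.05, 0.03, 0.01])

# Fractions of the pixel area covered by each surface type (sum to 1)
fractions = np.array([0.5, 0.3, 0.2])

# The recorded spectrum is the fraction-weighted sum of the endmembers
pixel = fractions @ np.vstack([grass, soil, water])
print(pixel)
```

If a neighbouring patch changes, the fractions change and so does the recorded spectrum, even though each constituent surface is unchanged.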
What is really measured? Sensors degrade over time, which affects pixel values, so they must be calibrated. Example: forest in Mexico. Barsi et al. 2003; Markham et al. 2004 (Landsat).
What is really measured?
Sensor > Raw data > Processed data > Land cover map
- Generally most remote sensing work concerns what goes on at the classification stage: classes, training data, algorithms etc.
- But actually there is a lot more going on than we like to think before you process the data: corrections, sensor and platform issues
- What is our raw remotely sensed data? The sensor has been calibrated (recently?); the pixel records an area-weighted average; images are corrected, with pixel values adjusted
Big Points 1. Many choices when moving from Data to Information. Choices result in variation. 2. But what does the data represent? The earth's surface, or the efficiency of transfer? 3. The pixel is heavily processed before YOU get your hands on it.
Thus far: variation & uncertainty
- Assumptions of homogeneity & space
- Sensors degrade and have to be calibrated
- Images have to be corrected; there are many ways of doing this
- Images have to be cleaned for cloud etc.; there are many ways of doing this
- Pixels record an area-weighted average; and if the neighbours change?
- State variables of radiative transfer: geometric properties of the media (atmosphere, vegetation, soil); position, size, shape, orientation or density of the objects constituting these media; the physical properties of the feature of interest (leaf reflectance, pigment concentration)
So what is it that the pixel records?
Uncertainty & Error: You've got some data and you're happy with it. You've got some training data, got ERDAS Imagine (or some other software), done some classification and made some maps. How good are they?
Uncertainty & Error: How do we measure error in remote sensing analyses?
- Compare predicted against observed in a table or matrix (called a correspondence, confusion, error or validation matrix, etc.)
- What measures do we get out? Type I and Type II errors; user and producer accuracies; false positives and false negatives; correspondence (kappa); conditional kappa (k-hat)
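The measures listed above can all be read off a confusion matrix. The counts below are invented for illustration:

```python
# Accuracy measures from a confusion matrix: rows are classified (predicted)
# pixels, columns are reference (observed) pixels. Counts are made up.
import numpy as np

m = np.array([[50,  4,  2],   # classified as A
              [ 6, 40,  3],   # classified as B
              [ 1,  5, 39]])  # classified as C

total = m.sum()
correct = np.diag(m)

overall = correct.sum() / total        # overall accuracy
user = correct / m.sum(axis=1)         # user's accuracy (1 - commission error)
producer = correct / m.sum(axis=0)     # producer's accuracy (1 - omission error)

# Kappa: agreement corrected for chance, using the marginal totals
chance = (m.sum(axis=1) * m.sum(axis=0)).sum() / total**2
kappa = (overall - chance) / (1 - chance)
```

The matrix itself says nothing about whether the predicted and observed data are really comparable, which is exactly the like-with-like problem raised below.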
Uncertainty & Error: Compare the classified (predicted) to an alternative (observed): field data, manual interpretation of aerial photography, etc. http://www.storm.uni.edu/rs/2001/ars/accuracy/
Uncertainty & Error: Congalton (1991), "A review of assessing the accuracy of classifications of remotely sensed data". The standard text, cited in every report and paper in the remote sensing literature.
Uncertainty & Error: But hang on, what are we comparing? Are we comparing like with like? Aerial photography, field survey, random visits: often two fundamentally different things. http://landcover.usgs.gov/accuracy/figure2.asp
Summary so far
- Looked at the data and what it is: variation from lots of sources (pre-processing, the nature of the sensor, the nature of the pixel)
- Looked at how some of that variation is managed: error matrices, and the issue of what is being compared
- Next, consider some more sources of uncertainty and variation: semantics
The Mapping Paradigm (recap): the real world is infinitely complex, and all representations involve choices, shaped by commissioning, observer, institutional and representational variation. So how does this relate to semantics?
Semantics Classes, labels & meanings We all have prototypes in our heads We match a class label with that prototype There may be a mis-match Many examples in land cover (and every other type of geographic information), e.g. What is a forest? What is a bog? Where is the beach?
Semantics What is a forest? We all have a prototype in our heads We match a data label with that prototype This may be a mis-match Interesting website: http://home.comcast.net/~gyde/defpaper.htm
What is a forest? [Figure: minimum physical requirements of a "forest", plotted as tree height (m) against canopy cover (%), across national and international definitions (Zimbabwe, Sudan, Turkey, Malaysia, UNESCO, UN FRA-2000 and many others), from http://home.comcast.net/~gyde/defpaper.htm.] The definitions vary widely, and do not even include species, area or strip width.
Semantics: in the FAO Forest Resource Assessments, categories change their meaning over time
Semantics Spatial characteristics also changed
Semantics Where is the beach?
Semantics: Where is the beach? For Northern Europeans (UK & Netherlands) the beach is between the high and low tide levels. In CORINE (Coordination of Information on the Environment) the beach is above the high water mark.
Semantics Where is the beach?
Semantics: What is a Bog?
- In LCMGB (1990) Bog was defined as permanent waterlogging, permanent or temporary standing water; Myrica gale and Eriophorum spp.; water-logging, perhaps with surface water
- In LCM2000 Bog was defined as areas with peat >0.5 m deep
- Consequences in one 100 x 100 km square: in 1990, 12 pixels of bog (<1 ha); in 2000, 120728 pixels of bog (~75 km2)
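The bog figures above can be checked by converting pixel counts to areas. The 25 m pixel size used here is an assumption (it is not stated on the slide):

```python
# Converting the bog pixel counts to areas, assuming 25 m pixels
# (the pixel size is an assumption, not given in the slide above).
pixel_area_m2 = 25 * 25                # one pixel covers 625 m^2

bog_1990_m2 = 12 * pixel_area_m2       # 7,500 m^2: well under 1 ha
bog_2000_m2 = 120728 * pixel_area_m2   # about 75.5 km^2

print(bog_1990_m2 / 10_000)      # hectares of bog in 1990
print(bog_2000_m2 / 1_000_000)   # km^2 of bog in 2000
```

A change of four orders of magnitude in mapped bog, driven entirely by a change of definition.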
Implications: RS uncertainty is reported in terms of error (mis-classification), so much of the variation is hidden. Producers know all this, but don't communicate it. Awareness of the wider issues is important for robust decision making and for acknowledging the uncertainties.
Sources of variation They all have an impact on the end result Comber, A.J., Fisher, P.F., Wadsworth, R.A., (2005). You know what land cover is but does anyone else? International Journal of Remote Sensing, 26 (1): 223-228
Concluding remarks Remotely sensed data Recording of radiation as determined by surface or transfer medium? Is it objective fact? Constructed and subjective? You must understand the data you use in your analyses Look for the undeclared assumptions Pixel? Error? Bog? Forest?
Concluding remarks Remotely sensed data Recording of radiation as determined by surface or transfer medium? Is it objective fact? Constructed and subjective? You must communicate the issues in your data Declare your assumptions Pixel? Error? Bog? Forest? Tell people what your data means
Concluding remarks: the General Problem Real world infinitely complex all representations involve Abstraction, Aggregation, Simplification etc. Choices about representation depend on Commissioning variation (who paid for it?) Observer variation (what do you see?) Institutional variation (why do you see it?) Representational variation (how do you record it?) A backdrop of methodological variation & change is the norm in GI
Big Points: Many choices when moving from Data to Information; choices result in variation. But what does the data represent: the earth's surface, or the efficiency of transfer? The pixel is heavily processed before YOU get your hands on it. There is uncertainty and error in remotely sensed data, BUT semantics is most important.
Further reading
Verstraete, M.M., Pinty, B. and Myneni, R.B. (1996). Potential and limitations of information extraction on the terrestrial biosphere from satellite remote sensing. Remote Sensing of Environment, 58 (2): 201-214.
Fisher, P. (1997). The pixel: a snare and a delusion. International Journal of Remote Sensing, 18 (3): 679-685.
Barsi, J.A., Schott, J.R., Palluconi, F.D., Helder, D.L., Hook, S.J., Markham, B.L., Chander, G. and O'Donnell, E.M. (2003). Landsat TM and ETM+ thermal band calibration. Canadian Journal of Remote Sensing, 29: 141-153.
Congalton, R.G. (1991). A review of assessing the accuracy of classifications of remotely sensed data. Remote Sensing of Environment, 37 (1): 35-46.
Comber, A.J., Fisher, P.F. and Wadsworth, R.A. (2005). You know what land cover is but does anyone else? An investigation into semantic and ontological confusion. International Journal of Remote Sensing, 26 (1): 223-228.
Comber, A.J., Fisher, P.F. and Wadsworth, R.A. (2005). What is land cover? Environment and Planning B: Planning and Design, 32: 199-209.