This document discusses Eddy covariance measurements and the need for standardization. It notes that Eddy covariance stations have grown worldwide, and processing can vary between stations and networks. The document examines the impact of differences in setup and processing on flux comparisons. It finds that differences in setup have a stronger impact than processing, and that centralized processing by the ICOS ETC produces comparable results to processing by principal investigators. While variability between processing options is minor, proper documentation of setup and processing in metadata remains important.
Standardizing Eddy Covariance Measurements
1. Eddy covariance measurements:
a (not so) standard method
Sundas Shaukat, Simone Sabbatini, Giacomo Nicolini, Domenico Vitale, Dario Papale
Aurore Brut, Christophe Chipeaux, Radek Czerny, Anne De Ligne, Thomas Gruenwald, Pasi Kolari,
Sebastien Lafont, Virginie Moreaux, Frederik Schrader
2. The history of Eddy Covariance measurements is probably reaching a
turning point: from “local campaign” to “global assessment”
Growth of EC stations worldwide is continuous: today AmeriFlux and Europe alone have together more than 1000 registered sites.
EC moved from local campaigns to continental networks to global collaboration in FLUXNET.
Growth of EC stations worldwide
FLUXNET status in 2015 - Chu et al. 2017
The increasing interest of sensor manufacturers in this sector also fosters the process: different technologies, “better” sensors, a wider choice, more possibilities…
Growth of EC sensors availability
In addition to the setup, the processing chain can also follow different paths (different corrections, filtering, software, versions…).
This heterogeneity in setup and processing among stations and networks could challenge across-site comparisons.
Is a higher degree of standardisation needed?
Growth of (the need of) standardization?
5. Standardization or not?
Note: raw data processing for fluxes calculation and QC
Different networks followed different strategies:
• ICOS setup: PI processing / ETC processing
• Historical setup: PI processing / ETC processing
ICOS ETC applied four processing schemes
(combinations of 2 options for coordinate
rotation (2D/PF) and trend removal (BA/LD))
PI flux calculations of the two systems with
their standard procedure
Total of 10 runs (2 systems, 5 versions each)
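The combinatorics of these schemes can be sketched in a few lines. Below is a minimal, hypothetical illustration (not the ETC code): 2D double rotation plus the two trend-removal options (BA = block average, LD = linear detrend), yielding the raw covariance ⟨w′c′⟩ for one averaging period. The planar-fit (PF) rotation is omitted because it requires a fit over many averaging periods; all function names are invented for illustration.

```python
import numpy as np

def double_rotation(u, v, w):
    """2D double rotation: align x with the mean wind so mean v and mean w become zero."""
    alpha = np.arctan2(np.mean(v), np.mean(u))      # first rotation (yaw)
    u1 = u * np.cos(alpha) + v * np.sin(alpha)
    v1 = -u * np.sin(alpha) + v * np.cos(alpha)
    beta = np.arctan2(np.mean(w), np.mean(u1))      # second rotation (pitch)
    u2 = u1 * np.cos(beta) + w * np.sin(beta)
    w2 = -u1 * np.sin(beta) + w * np.cos(beta)
    return u2, v1, w2

def fluctuations(x, method="BA"):
    """Trend removal: BA = subtract the block mean, LD = subtract a least-squares line."""
    if method == "BA":
        return x - np.mean(x)
    if method == "LD":
        t = np.arange(len(x))
        slope, intercept = np.polyfit(t, x, 1)
        return x - (slope * t + intercept)
    raise ValueError(f"unknown method: {method}")

def raw_flux(u, v, w, c, detrend="BA"):
    """Uncorrected covariance flux <w'c'> for one averaging period."""
    _, _, w_rot = double_rotation(u, v, w)
    return np.mean(fluctuations(w_rot, detrend) * fluctuations(c, detrend))
```

A full chain would add despiking, time-lag compensation, spectral corrections and WPL terms; the point here is only that each binary choice (2D/PF × BA/LD) produces a slightly different flux from the same raw data.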
10. • Heterogeneity can be an issue if we want to detect small changes
• The difference in the setup has a stronger impact; this supports the ICOS decision for standardization
• Centralized processing is possible: ICOS ETC fluxes are comparable with the PI versions, even with different software
• Variability among the 4 processing options applied by the ETC is relatively
minor, probably smaller than the other uncertainty sources
• A proper documentation of setup and processing in the metadata is crucial
Parallel measurements: lessons learned
11. • Nine ICOS stations contributed long time series (up to 15 years) of high-frequency data, metadata and processed data (PI method, including both a different calculation and a different filtering)
• Different setups (sonic anemometer and IRGA), switched to the ICOS standard at one point
• All raw data were processed by the ETC with the standard “four
options” method
Legacy data and reprocessing
12. Examples of results (FC, yearly averages)
• Difference in calculation and filtering:
➢ PI: less interannual variability, same pattern
➢ Differences in the ICOS processing in a few years
• Difference only in calculation:
➢ ICOS processing options overlapping
➢ 2014 (full year with ICOS setup) almost identical
[Figure annotations: ICOS setup; 2015: only half year]
BE-Lon, PI software: EddyFlux
13. Examples of results (FC, yearly averages)
• Difference in calculation and filtering:
➢ Similar IAV, PI version lower fluxes
➢ ICOS options overlapping
• Difference only in calculation:
➢ Increased consistency PI-ETC
➢ Still differences even with the same software (role of the processing options)
[Figure annotation: ICOS setup]
DE-Geb, PI software: EddyPro
14. Examples of results (LE, yearly averages)
• Difference in calculation and filtering:
➢ Large difference until 2011 (setup or software)
➢ ICOS: larger IAV
➢ Overlap between ICOS methods
• Difference only in calculation:
➢ Differences reduced but still present
➢ Period with ICOS setup and same software consistent
[Figure annotation: ICOS setup]
CZ-BK1, PI software: EdiRe until 2011, EddyPro from 2012
15. A good data management strategy is crucial for quality, traceability and reproducibility. Main issues found:
• Different formats (ASCII, several types of binary files), even within the same site and year
• Analog data without the parameters for conversion to physical units
• Sensor diagnostics not recorded
• Incomplete or missing metadata (setup, maintenance and disturbance periods, processing applied); in the past, often only in a paper notebook at the site…
• Unclear timestamps (beginning or end of the averaging period? DST shift? Timezone?)
Legacy data: lessons learned
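The timestamp ambiguity above is mechanical to fix once the convention of each logger is documented. A minimal sketch, assuming Python's standard zoneinfo database; the function name and the "label the period start, in UTC" target convention are our own illustrative choices, not an ICOS prescription:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def normalize_timestamp(ts_local, tz_name, stamp="end", period_minutes=30):
    """Turn a naive logger timestamp into an unambiguous UTC period-START label.

    ts_local       -- naive datetime exactly as written in the raw file
    tz_name        -- IANA zone of the station clock, e.g. "Europe/Brussels"
    stamp          -- whether the logger labels the "start" or "end" of the period
    period_minutes -- averaging period length (30 min is typical for EC)
    """
    aware = ts_local.replace(tzinfo=ZoneInfo(tz_name))  # attach zone; handles DST offsets
    ts_utc = aware.astimezone(ZoneInfo("UTC"))
    if stamp == "end":
        ts_utc -= timedelta(minutes=period_minutes)     # shift an end label to the start
    return ts_utc
```

For example, a Belgian logger writing end-of-period local time 2014-07-01 12:00 (CEST, UTC+2) maps to the period starting 09:30 UTC. The ambiguous repeated hour at the autumn DST change still needs a site-specific rule, which is exactly why the convention belongs in the metadata.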
16. • Standardization of the setup helped to reduce inter-site biases
• Differences due to processing are less important than differences in the setup
• Proper data and metadata management is crucial; ICOS is taking important steps forward on this
• Reprocessing of long time series (10 years) requires large efforts (find and organize data and metadata, convert formats and finally process)
• Standardization may not always be needed (it depends on the aim); however, in the case of heterogeneous setups and processing, full metadata are crucial for a correct interpretation of the fluxes
Conclusions