Eddy Flux Measurements, Pleistocene Park, Cherskii, Russia - 2011


As a contribution to the Arctic Observing Network (AON), the researchers have established two observatories of landscape-level carbon, water and energy balances: one at Imnavait Creek, Alaska, and one at Pleistocene Park near Cherskii, Russia. Together with Abisko (Sweden), Zackenberg (Greenland) and a location in the Canadian High Arctic, which will provide further data points as part of the International Polar Year, these will form a network of observatories. This part of the project focuses on simultaneous measurements of carbon, water and energy fluxes of the terrestrial landscape at hourly, daily, seasonal and multi-year time scales. These fluxes are major regulatory drivers of the Arctic climate system and form key linkages and feedbacks between the land surface, the atmosphere and the oceans.
In support of these objectives, a new 32 m tower was deployed in Pleistocene Park, about 20 km south of the Northeast Science Station in Cherskii, Russia. This station is currently measuring fluxes of carbon dioxide, methane, water vapor and energy, in addition to other meteorological variables.

Project Keywords: 

Data set ID: 


EML revision ID: 

Published on EDI/LTER Data Portal


Bret-Harte, M., Zimov, S., Euskirchen, E., Shaver, G. 2011. Eddy Flux Measurements, Pleistocene Park, Cherskii, Russia - 2011. Environmental Data Initiative. http://dx.doi.org/10.6073/pasta/afb6900e4d0d15aeb15c92279200199f

Date Range: 

Thursday, September 22, 2011 to Saturday, December 31, 2011

Publication Date: 



Similar to the other flux stations involved in the AON project, the Pleistocene Park station generates two types of data: high-frequency eddy covariance (EC) data and low-frequency means of meteorological and subsurface data. On a daily basis, approximately 75 MB of high-frequency binary data and 16 KB of low-frequency ASCII data are collected.

This station uses a CR3000 data logger and a laptop to collect and store data. The CR3000 measures the open-path EC equipment, which is sampled at 10 Hz. Micrometeorological data are scanned at 0.33 Hz, and all data points are averaged every half hour. The CRBASIC program that controls the data logger is written so that only the most basic corrections and filtering are applied to the raw data; these include shifting the CSAT3 and LI-7500 data arrays by 2 and 3 scans, respectively, to account for the inherent processing delays of these sensors. The high-frequency data are processed to yield mass and energy fluxes using a Reynolds decomposition, after which the following corrections are applied: the WPL correction, a coordinate rotation, a spectral correction and the 'Burba' correction. Further quality control flags are also generated, such as a stationarity test and a footprint analysis. The high-frequency processed tables are then combined with the low-frequency micrometeorological data, after which the data are both filtered and gap-filled.

The following procedures are used to filter the data:

1) Parameters based on the engineering specifications of each instrument. This normally involves filtering data based on the operating temperature range.
2) Parameters based on the Automatic Gain Control (AGC, a lens transmissivity flag) of the LI-7500. If this value exceeds a given threshold, the lens of the LI-7500 is assumed to be obstructed by ice or snow. All measurements from this sensor and all radiation sensors are then assumed to be similarly obstructed, and these data are filtered.
3) Wind rejection angles. Sources of air flow distortion are identified at each site, and all EC measurements from those azimuth directions are filtered; any wind originating between the rejection angles causes the corresponding EC data to be filtered out.
4) Parameters for physically impossible measurements (e.g., negative values from a precipitation gauge or pyranometer).
5) Parameters for previously flagged data, including '-9999'. Data loggers, or other processed datasets for which raw data are unavailable, sometimes use various flag strings; these are all standardized to 'NaN'.
6) A three-standard-deviation filter to remove extreme outliers.
7) A similarity/cluster filter. With certain instruments, a string of identical values in a time series usually indicates measurement error; any cluster of 5 identical consecutive values is filtered.

The following procedures are used to gap-fill the data:

1) A pth-order autoregressive model. Two values are predicted for each missing element in a time series, one with a forward-looking model and one with a backward-looking model. The models currently look 168 elements in each direction in the time series, and the two predictions are averaged to produce a final estimate. This model can still produce impossible values (e.g., negative measurements from a pyranometer), which are filtered out; thus some gaps may remain in the time series, but they are minimized.
2) Filtered values at the beginning and end of a time series are predicted using a variant of the mean diurnal variation (MDV) method, in which values are estimated from binned half-hourly averages over the following or the previous seven days.
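The simpler screening steps above (flag standardization, operating-range checks, the three-standard-deviation filter, and the identical-value cluster filter) can be sketched in Python for one half-hourly series. This is an illustrative sketch, not the project's processing code; the function name, the flag values, and the operating-temperature range are assumptions chosen for demonstration.

```python
import numpy as np

def filter_series(x, t_air=None, t_range=(-40.0, 50.0),
                  flags=(-9999.0,), sd_mult=3.0, cluster_len=5):
    """Apply simple screening steps to one half-hourly series.
    Thresholds and flag values here are illustrative only."""
    x = np.asarray(x, dtype=float).copy()

    # Standardize previously flagged values to NaN.
    for f in flags:
        x[x == f] = np.nan

    # Engineering operating-temperature range (optional companion series).
    if t_air is not None:
        t_air = np.asarray(t_air, dtype=float)
        x[(t_air < t_range[0]) | (t_air > t_range[1])] = np.nan

    # Three-standard-deviation outlier filter.
    mu, sd = np.nanmean(x), np.nanstd(x)
    x[np.abs(x - mu) > sd_mult * sd] = np.nan

    # Cluster filter: runs of `cluster_len` identical consecutive values.
    run = 1
    for i in range(1, len(x)):
        run = run + 1 if x[i] == x[i - 1] else 1
        if run >= cluster_len:
            x[i - cluster_len + 1 : i + 1] = np.nan
    return x
```

In a real pipeline each step would typically also record a quality flag rather than silently replace values, so that filtered and missing data remain distinguishable downstream.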
See the file CH_2044_Metadata_2011.csv for data collection statistics.
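The MDV variant described above, in which a missing half hour is estimated from binned half-hourly averages over the surrounding seven days, can be sketched as follows. This is a minimal illustration under the assumption of a continuous half-hourly record (48 values per day); the function name and window handling are not taken from the project's code.

```python
import numpy as np

HH_PER_DAY = 48  # half-hourly records per day

def mdv_fill(x, window_days=7):
    """Mean-diurnal-variation gap filling: replace each missing value
    with the mean of the same half-hour slot over the previous and
    following `window_days` days. Illustrative sketch only."""
    x = np.asarray(x, dtype=float)
    filled = x.copy()
    n = len(x)
    for i in np.where(np.isnan(x))[0]:
        # Indices at the same time of day, +/- window_days around i.
        idx = i + HH_PER_DAY * np.arange(-window_days, window_days + 1)
        idx = idx[(idx >= 0) & (idx < n) & (idx != i)]
        vals = x[idx]
        vals = vals[np.isfinite(vals)]
        if vals.size:
            filled[i] = vals.mean()
    return filled
```

Because the window is clipped at the ends of the record, this naturally handles the beginning and end of a time series, where the autoregressive model has too little history to look back or forward.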

AON Data Processing, Filtering and Gap-Filling


Version Changes: 

Jan 2011: Colin Edgar - updated quality control information and corrected units
Oct 2014: Bonnie Kwiatkowski updated metadata.
June 2019-Version 7: Updated metadata, removing abbreviations. Added detail to site descriptions. BK

Sites sampled.

Full metadata and data files (either comma-delimited (csv) or Excel) are available in the Environmental Data Initiative repository.

Use of the data requires acceptance of the data use policy --> Arctic LTER Data Use Policy