This post builds on the techniques discussed in my last blog, which focused on using Python and Matplotlib to plot ERA5 temperature data in West Africa. The plots here illustrate the effect of Gaussian filter smoothing on hourly ERA5 temperature data obtained from the Climate Data Store. Gaussian filters are linear filters typically used to reduce noise.
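The smoothing itself is a one-liner with SciPy. A minimal sketch, using a synthetic hourly series in place of the real ERA5 data (which would be read from NetCDF):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic stand-in for hourly 2m temperature (K): a daily cycle
# plus random noise.
rng = np.random.default_rng(42)
hours = np.arange(24 * 30)  # one month of hourly timesteps
t2m = 300 + 5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)

# Apply a 1-D Gaussian filter along the time axis; sigma is in units
# of timesteps (hours here), so a larger sigma gives a smoother curve.
smooth = gaussian_filter1d(t2m, sigma=6)

# The smoothed series has noticeably lower variance than the raw one.
print(t2m.std(), smooth.std())
```

Plotting `t2m` and `smooth` on the same axes makes the effect of the filter obvious.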
A quick-and-dirty tech blog today, getting to know some of Matplotlib's extra features for generating attractive plots!
Yearly ERA5 temperature data from 1979 to the present, obtained from the Copernicus Climate Data Store, was masked by country using shapefiles from Natural Earth, and an average was then taken for the area (see previous blogs on area averaging for details on how to do this).
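The key detail in the area-averaging step is that cells on a regular lat-lon grid shrink towards the poles, so a plain mean over-weights high latitudes. A minimal NumPy sketch of a cosine-latitude weighted mean (the country masking itself, covered in the earlier posts, is represented here by NaNs outside the region):

```python
import numpy as np

# A placeholder uniform temperature field on a 1-degree global grid;
# the real field would come from the masked ERA5 data.
lats = np.linspace(-89.5, 89.5, 180)
lons = np.linspace(-179.5, 179.5, 360)
temps = np.full((lats.size, lons.size), 288.0)

# Weight each row by cos(latitude) to account for shrinking cell area.
weights = np.broadcast_to(np.cos(np.deg2rad(lats))[:, np.newaxis], temps.shape)

# NaNs mark cells outside the country mask; exclude them from both sums.
valid = ~np.isnan(temps)
area_mean = (temps * weights)[valid].sum() / weights[valid].sum()
print(area_mean)  # 288.0 for this uniform field
```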
For any data produced with the intention of being downloaded and used by others, it is important to include information about the dataset. For example, details of the data's origins should be provided, such as who produced the dataset, who can be contacted about it, and when and where it was produced. In addition, properties of the dataset itself, such as variable names and units of measurement, all help the end user to understand the data.
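In NetCDF, this information travels with the file as global and per-variable attributes. A self-contained sketch using SciPy's NetCDF3 writer (in practice you would more likely use netCDF4 or xarray; the attribute names here loosely follow CF conventions, and the contact details are hypothetical):

```python
import numpy as np
from scipy.io import netcdf_file

# Write a tiny NetCDF file carrying provenance and variable metadata.
f = netcdf_file("t2m_sample.nc", "w")
f.institution = "Example Org"                 # who produced the dataset
f.contact = "data@example.org"                # who to contact (hypothetical)
f.history = "2020-01-01: derived from ERA5 hourly data"

f.createDimension("time", 3)
t2m = f.createVariable("t2m", "f4", ("time",))
t2m[:] = np.array([285.2, 286.1, 287.0], dtype="f4")
t2m.units = "K"                               # units of measurement
t2m.long_name = "2 metre temperature"
f.close()

# Read it back: the attributes travel with the data.
g = netcdf_file("t2m_sample.nc", "r")
print(g.variables["t2m"].units)
g.close()
```

Tools like `ncdump -h` will then show this metadata to anyone who downloads the file, without them needing to read the data itself.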
When building and maintaining a robust operational system that relies on many datasets, file naming becomes an important factor in identifying and describing the data each file contains.
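One way to keep names consistent is to generate them from a single helper. The scheme below (`source_variable_domain_start-end_version.nc`) is a hypothetical example; the point is that every component needed to identify the contents is encoded in the name itself:

```python
from datetime import date

def build_filename(source, variable, domain, start, end, version="v1"):
    """Build a descriptive filename for a processed dataset.

    Illustrative naming scheme only: adapt the components to
    whatever convention your system standardises on.
    """
    return (f"{source}_{variable}_{domain}_"
            f"{start:%Y%m%d}-{end:%Y%m%d}_{version}.nc")

name = build_filename("era5", "t2m", "west-africa",
                      date(1979, 1, 1), date(2019, 12, 31))
print(name)  # era5_t2m_west-africa_19790101-20191231_v1.nc
```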
This post serves as a continuation of the techniques described in my last blog post, so please familiarise yourself with those steps beforehand.
As before, load in the NUTS shapefiles, this time selecting NUTS2.
NUTS2 is higher resolution and as a result contains many more shapes: it defines polygons at a regional (sub-country) level, with 332 shapes in total for the Eurostat EU region.
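In the Eurostat NUTS shapefile the level is recorded in a `LEVL_CODE` attribute column (0 = country, 1 = major region, 2 = basic/NUTS2 region), so selecting NUTS2 is a simple boolean filter. A mocked-up attribute table keeps the sketch runnable; a real workflow would load the shapefile with `geopandas.read_file()`, which returns a GeoDataFrame supporting the same pandas-style filtering:

```python
import pandas as pd

# Mock of (a few columns of) the NUTS shapefile attribute table.
nuts = pd.DataFrame({
    "NUTS_ID": ["DE", "DE2", "DE21", "FR", "FR1"],
    "LEVL_CODE": [0, 1, 2, 0, 1],
})

# Keep only the NUTS2 (sub-country regional) polygons.
nuts2 = nuts[nuts["LEVL_CODE"] == 2]
print(nuts2["NUTS_ID"].tolist())  # ['DE21']
```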
This example will look at how to mask ERA5 data for the European region, downloaded from the ECMWF Climate Data Store (CDS). The procedure relies heavily on functions provided by the SciTools Iris package, although similar results could be obtained using a combination of packages like xarray and regionmask.
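To show the underlying idea without pulling in Iris or regionmask, here is a minimal sketch of the masking step itself: test which grid points fall inside a polygon (a simple square stands in for a country outline that would normally come from a shapefile), using Matplotlib's `Path.contains_points`:

```python
import numpy as np
from matplotlib.path import Path

# A toy polygon standing in for a shapefile geometry.
polygon = Path([(0, 0), (10, 0), (10, 10), (0, 10)])

# A regular lon/lat grid covering and surrounding the polygon.
lons, lats = np.meshgrid(np.arange(-5, 16), np.arange(-5, 16))
points = np.column_stack([lons.ravel(), lats.ravel()])
inside = polygon.contains_points(points).reshape(lons.shape)

# Blank out data falling outside the region of interest.
data = np.random.rand(*lons.shape)
masked = np.where(inside, data, np.nan)
```

Iris and regionmask wrap this point-in-polygon test (plus coordinate handling and edge cases) behind much friendlier interfaces, which is why the full example uses them instead.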
I wanted to see if I could improve on the standard power law coefficient (1.389495) for calculating 100m wind speed from 10m data. The ten years of ERA5 U and V NetCDF wind components currently available from the CDS were concatenated, and wind speed was then calculated using CDO, a handy collection of command-line operators for manipulating and analysing climate and numerical weather prediction data.
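For context, that coefficient comes from the wind profile power law, v(z2) = v(z1) * (z2/z1)^alpha: with the commonly assumed exponent alpha = 1/7, the 10m-to-100m factor is (100/10)^(1/7) ≈ 1.389495. A NumPy sketch of the arithmetic, with made-up component values (the concatenation and speed calculation in the post itself were done with CDO):

```python
import numpy as np

# Wind profile power law: v(z2) = v(z1) * (z2 / z1) ** alpha.
# The common neutral-stability exponent alpha = 1/7 gives the
# standard 10 m -> 100 m coefficient.
alpha = 1 / 7
coefficient = (100 / 10) ** alpha
print(round(coefficient, 6))  # 1.389495

# Wind speed from U and V components (illustrative values), then
# extrapolated from 10 m to 100 m.
u10 = np.array([3.0, -2.5, 6.1])
v10 = np.array([4.0, 1.5, -2.0])
ws10 = np.hypot(u10, v10)
ws100 = ws10 * coefficient
```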
In this series of technical blog posts, I will be looking at using open source tools to examine energy and climate data.
I am currently working for a non-profit organisation, enhancing the interaction between the energy industry and the weather, climate and broader environmental sciences community.
Not coming from a climate science background, I had to quickly become accustomed to the terminology and technologies associated with this field of research. A commonly used file format for representing weather and climate data is Network Common Data Form (NetCDF).