There was cause for celebration earlier this week for those working on the Ecostress project in LIST's ERIN (Environmental Research and Innovation) department, as an important milestone of their "Phase 1", which began in March last year, was reached. Around 30 terabytes of ecosystem data for Europe and Africa are now available for downloading and further processing on the Food Security Thematic Exploitation Platform (TEP), as a fast-track data access catalogue service.
But in order to understand the relevance of this news, we need to take a step back and explain what Ecostress actually is. Kaniska Mallick, head of the project at LIST, explained: "It is a scientific mission launched by NASA using the International Space Station to understand some of the important properties of the earth's ecosystem. Specifically, it is to understand how the ecosystem responds to different levels of water stress and water availability, how plants photosynthesise, how plants modulate their water loss, and how they strategise their resource capture and use during drought periods."
Surface temperature is very sensitive to evaporative cooling. Therefore, if evaporative cooling or warming occurs because of variations in soil moisture, it is reflected in the thermal signature. This is one of the preconditions that can later be used for diagnostic modelling of how plants transfer water, how ecosystems evaporate, and what the patterns of ecosystem water use are under different magnitudes of water availability.
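The link between surface temperature and evaporation can be sketched with the standard land surface energy balance (this equation is textbook background, not taken from the article itself):

```latex
R_n = H + \lambda E + G
```

Here \(R_n\) is net radiation, \(H\) is sensible heat flux, \(\lambda E\) is latent heat flux (the energy used by evaporation), and \(G\) is ground heat flux. When soil moisture is plentiful, a larger share of \(R_n\) goes into \(\lambda E\), cooling the surface; under water stress, \(\lambda E\) shrinks and the surface warms. This is why the thermal signature carries information about water availability.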
"We have this project with the European Space Agency called the European Ecostress Hub. What ESA wanted us to do was develop global maps of land surface temperature and evaporation over Africa and Europe, because NASA is mostly looking at the North and South American sectors," stated Kaniska. "It was a major agreement between ESA and NASA, so that's how we got this proposal. One of the biggest challenges in this kind of project is the huge data volume — how to manage multiple algorithms with such a large data volume. One of ESA's requirements is that everything should be done on a centralised cloud platform, but of course to do that we need all the data on a cloud server."
ESA wanted the project to be completed in two phases. In the first phase, all data was to be transformed into a searchable format on the TEP cloud platform so that anyone can access it, which is what the Ecostress team has just achieved.
"In phase 1 what we've done is produce all the data for one year for the African and European sectors, and it is now searchable via a fast-track data access catalogue service on the Food Security Thematic Exploitation Platform, as this is one of the requirements for ESA's future LSTM (Land Surface Temperature Monitoring) mission," explained Kaniska. "There are mission advisory groups at the European Space Agency, and they would like to see whether, for future missions, they could obtain the data in such a searchable format, so this was the purpose of phase 1."
The data is now openly available and can be adapted to users' needs. Different surface temperature and evaporation models can be selected and run on the centralised cloud server. "Instead of us producing information from a particular model and requesting that it be used, the user has been given the freedom to apply a complete scientific analysis of different algorithms, and help us get to the point where we know which algorithms work optimally under a certain set of environmental conditions," elaborated Kaniska.
With about 30 terabytes of data for one year alone, and the Ecostress mission lasting just over three years, Kaniska reckoned that "input data would go up to about 100 terabytes, and then we process that huge volume on the cloud server".
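That figure is a simple extrapolation, which can be checked with a quick back-of-the-envelope calculation (the 30 TB/year figure is from the article; the mission length of 3.4 years stands in for "just over three years" and is an illustrative assumption, not an official number):

```python
# Back-of-the-envelope estimate of the total Ecostress input data volume.
# Figures: ~30 TB per year (from the article); mission length of 3.4 years
# is an illustrative assumption for "just over three years".
tb_per_year = 30
mission_years = 3.4  # assumed, not an official mission duration

total_tb = tb_per_year * mission_years
print(f"Estimated input data volume: ~{total_tb:.0f} TB")  # ~102 TB
```

The result lands close to the "about 100 terabytes" quoted above.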
The Ecostress mission is already moving on to the second phase, in which the project injects its own algorithms and runs them on the cloud server.
"We have started phase 2 and are already testing algorithms. The surface temperature coding algorithm is already done, and now we are testing its implementation on the platform, especially how much memory it needs!" concluded Kaniska.