Although scientists need to follow standardized methodologies, designing effective protocols is not without its challenges.
It was another morning on the way to the field, and we checked the weather forecast for the final day of this sampling session. Would we finally get the 48-hour dry period needed to deploy the invertebrate traps in our project’s 24 cages? Setting the traps during a period with a high probability of rainfall not only risks reduced insect activity, and therefore unrepresentative catches, but also flooded traps and unusable data in the event of heavy rain. The previous sampling session was conducted in good weather, when insect activity is high. Over the two years of the project, sampling takes place at fixed points in the season each year, allowing changes to be compared over time. What matters most, therefore, is that the data remain comparable with previous sampling sessions and with other studies pursuing the same objective.

Obtaining data relevant to the research question requires researchers to think critically during protocol design. For example, subtle differences such as the color and material of invertebrate traps can influence which invertebrates are attracted to them. Beyond that, every decision, from the orientation of the trap relative to the sun to the time of day it is set and the prevailing weather conditions, affects the final result. Projects such as RECODYN (ERC project #101043548) make this even more complex by sampling multiple components of the ecosystem. To understand how ecosystem recovery dynamics play out at the community scale, the project studies both plant and invertebrate species at multiple trophic levels and collects data on environmental and biological factors such as temperature and soil respiration. Each element requires its own protocol, and standardization is essential.
Standardization across studies
One of the challenges of fieldwork is producing results that other researchers can replicate. During the design process, researchers should survey the current research in their field so that their study can be compared with similar ones; this makes it possible to see scientific trends at a larger scale, across studies. This project employs established sampling methods such as quadrats, which delimit the plants to be sampled, and randomization of sampling order and location, which eliminates selection bias. The RECODYN team scoured the literature on sampling methods and contacted researchers directly when methodological details were sparse.
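The randomization mentioned above can be sketched in a few lines of code. This is a minimal illustration, not the project's actual procedure: the function name, the idea of numbering quadrats within cages, and the seeded generator are all assumptions made for the example.

```python
import random

def randomized_sampling_plan(cage_ids, quadrats_per_cage, seed=None):
    """Build a randomized visiting order of (cage, quadrat) pairs.

    The inputs are illustrative; the RECODYN project may organize
    its sampling units differently.
    """
    rng = random.Random(seed)  # seeding makes the plan reproducible
    plan = [(cage, q)
            for cage in cage_ids
            for q in range(1, quadrats_per_cage + 1)]
    # Shuffling the order avoids systematic bias, e.g. always visiting
    # the same cages at the same time of day.
    rng.shuffle(plan)
    return plan

# Example: 24 cages with 3 quadrats each, fixed seed for a shareable plan
plan = randomized_sampling_plan(range(1, 25), 3, seed=42)
print(plan[:3])
```

Fixing the seed means the randomized order can be written into the protocol and regenerated by any team member, which is itself a small act of standardization.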

This process led, for example, to the decision to follow the advice of Brown & Matthews (2016), who called on those studying invertebrates to design traps consistently, using transparent materials of specific dimensions. In addition to these ground-set pitfall traps, each cage on site is equipped with a suspended Malaise trap sized to fit its dimensions. This well-known trap, named after its designer, the Swedish entomologist René Malaise, is used to capture airborne invertebrates. Choosing these two methods means not focusing on nocturnal insects, whose behavior requires entirely different traps. Next, we needed to decide which feeding patterns to look for and quantify in order to study the relationships between herbivores and the plants growing in situ.

Standardization within a project
But how is data collected consistently when up to 10 team members are involved? In this sense, data standardization matters not only at the level of the broader scientific community but also within a project. The project chose to focus on three feeding guilds, not only because they are widely covered in the literature on plant-herbivore interactions, but also because they are the easiest for an interdisciplinary team to identify consistently. Beyond recording whether plants were chewed, mined by larvae, or had their sap extracted by insects such as aphids, the way we quantify the proportion of each plant that is eaten also influences our accuracy as a team. Here, we followed the recommendations of Johnson et al. (2016), who showed that visually estimating damage percentages within categories, or “bins,” is more accurate and time-efficient than estimating them with software. However, Cornelissen et al. (2026) found that while visual estimation is about 10 times faster, it tends to overestimate damage. All team members underwent training and practice sessions to minimize observer effects in the data. Additionally, once regular data collection began, the identity of the person estimating feeding damage was recorded so that it could be treated as a random factor during data analysis. Such measures have also been shown to improve data quality among the wide range of public volunteers who support scientific progress through citizen/community science initiatives (Kosmala et al., 2016). With clear instructions and appropriate training, anyone can contribute to the generation of scientific knowledge.
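The binning and observer-tracking described above can be sketched as follows. This is a hypothetical illustration only: the bin boundaries, field names, and the `record_observation` helper are assumptions for the example, since the article does not specify the categories the project actually uses.

```python
# Hypothetical damage bins; the actual category boundaries used by
# the project are not specified in the article.
DAMAGE_BINS = [(0, 0, "0%"), (1, 5, "1-5%"), (6, 25, "6-25%"),
               (26, 50, "26-50%"), (51, 75, "51-75%"), (76, 100, "76-100%")]

def bin_damage(percent):
    """Map a visual estimate of leaf damage (0-100%) to a category label."""
    for low, high, label in DAMAGE_BINS:
        if low <= percent <= high:
            return label
    raise ValueError("estimate must be between 0 and 100")

def record_observation(plant_id, guild, percent, observer):
    """Store the observer's identity alongside each estimate, so that
    observer can later be included as a random factor in the analysis."""
    return {"plant": plant_id, "guild": guild,
            "damage_bin": bin_damage(percent), "observer": observer}

print(record_observation("P07", "chewing", 12, "observer_3"))
```

Recording the observer with every row costs nothing in the field, but it is what makes it possible to separate genuine biological variation from between-observer differences later in a mixed model.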
But how do we deal with the mistakes and inconsistencies that surface despite all these procedures and training sessions? First, team members review each other’s work on the spot. However, long hours in the field leave researchers fatigued, and fatigue breeds mistakes. For this project, it was important to document everything in multiple ways. Forgot to note the feeding damage? Does the datasheet not say whether a plant had aphids? We can often go back to the photographic records of the plants and of the invertebrates identified in the laboratory. In some cases, these photos reveal mistakes that are easily corrected; if questions remain, a meeting is held to discuss and resolve them.
Making a decision
Once on site, we have to judge how reliable the weather forecast is. The best bet seems to be to set the traps this morning and retrieve them within 48 hours. If we put it off any longer, rain could arrive over the weekend, and conditions in the foothills of the Pyrenees can change in an instant. If the current forecast holds, the soil respiration equipment can be used tomorrow. According to the literature, it is best to avoid measuring after heavy rain, when the soil becomes too saturated and respiration rates are no longer comparable with other samples. A project of this complexity highlights the importance of being able to improvise and adapt in the field while keeping protocols standardized.

It is thanks to fellow scientists that researchers can decide how best to sample multiple components of ecological communities. Sharing best practices for data collection helps standardize methods, and the scientific community must work together to obtain robust, reliable data. This exchange happens before, during, and after data collection. Sharing our methods transparently ensures that our results are reproducible and useful to the scientific community and to society. Through this knowledge, science reveals trends in the natural world and helps inform decision-making on ecosystem restoration and conservation.
References
Brown, G. R., & Matthews, I. M. (2016). A review of the wide variations in pitfall trap design and a proposal for a standard pitfall trap design for monitoring terrestrial arthropod biodiversity. Ecology and Evolution, 6(12), 3953-3964.
Cornelissen, T., Mendez, G. M., Silveira, F. A., Dattilo, W., Guevara, R., Aguilar, R., … & Wetzel, W. C. (2026). Quantifying foliar herbivory: A guide to methodological trade-offs and best practices. Ecology, 107(2), e70308.
Johnson, M. T. J., Bertrand, J. A., & Turcotte, M. M. (2016). Precision and accuracy in quantifying herbivory. Ecological Entomology, 41(1), 112-121.
Kosmala, M., Wiggins, A., Swanson, A., & Simmons, B. (2016). Assessing data quality in citizen science. Frontiers in Ecology and the Environment, 14(10), 551-560.
Disclaimer

Funded by the European Union (ERC, RECODYN, 101043548). Views and opinions expressed are, however, those of the authors only and do not necessarily reflect those of the European Union or the European Research Council Executive Agency. Neither the European Union nor the granting authority can be held responsible for them.
This research benefited from state aid managed by the French National Research Agency under the Future Investments Program, reference ANR-11-INBS-0001 (AnaEE-Services).

Claudia Christensen Garcia
Adaia Cid Alarcon
Tania D. Costa
Andrea Llora-Jiménez
Project Engineer
Basque Climate Change Center
Please note: This is a commercial profile
This article will also be published in the quarterly magazine issue 26.
