Artificial intelligence can dramatically reduce the time required to analyze wildlife camera-trap data, a process that has traditionally taken conservation groups months, or even a year.
A new study published in the Journal of Applied Ecology and led by researchers at Washington State University (WSU) and Google has demonstrated that AI can perform wildlife tracking analysis in just a few days, while drawing scientific conclusions that closely match those drawn by human experts.
Camera traps, motion-activated cameras placed in natural habitats, are widely used to monitor wildlife populations and generate vast amounts of image data. Projects can produce hundreds of thousands to millions of images that must be reviewed to identify species and their behaviors. Even with a dedicated team of assistants, this review can stretch to six or seven months or more, delaying conservation efforts.
This study tested whether a fully automated AI system could replace human analysts in processing this vast amount of data. The researchers used SpeciesNet, a general-purpose AI model developed by Google, to analyze images collected from ecosystems as diverse as Washington state, Glacier National Park in Montana, and the Maya Biosphere Reserve in Guatemala. The results produced by the AI were then compared to traditional datasets labeled by human experts.
“We’re not trying to replace people,” said Daniel Thornton, lead author of the study and a wildlife ecologist at WSU. “The goal is to help researchers get answers faster and make better decisions about wildlife management and conservation.”
Dan Morris, a senior researcher at Google and co-creator of SpeciesNet, emphasized, “The key question wasn’t whether the AI got every image right. What mattered was whether the ecological conclusions of interest ended up being essentially the same.”
They found that for most species, AI-generated models matched human-derived models approximately 85 to 90 percent of the time. Key ecological indicators, such as species occupancy and the environmental factors influencing species presence, were consistent between the AI and human analyses. Even when the AI made errors, such as misidentifying species or missing detections, the overall occupancy model remained robust, because it relies on repeated observations over time, which dilutes the impact of individual mistakes.
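The buffering effect described above can be illustrated with a toy simulation. All numbers here (site counts, detection and error rates, the "detected on at least two occasions" rule) are illustrative assumptions, not figures from the study: the point is only that when each site is sampled repeatedly, occasional per-image labeling errors rarely flip a site's overall occupancy status.

```python
import random

random.seed(0)

N_SITES = 200       # camera-trap sites (illustrative)
N_OCCASIONS = 10    # repeated sampling occasions per site
P_OCCUPIED = 0.5    # true probability a site is occupied
P_DETECT = 0.3      # per-occasion detection probability at occupied sites
P_ERROR = 0.05      # chance the classifier mislabels any single occasion

def survey(occupied):
    """One site's detection history: True where the species was photographed."""
    return [occupied and random.random() < P_DETECT for _ in range(N_OCCASIONS)]

def classify(history):
    """AI labels: each occasion's label independently flips with prob P_ERROR."""
    return [d != (random.random() < P_ERROR) for d in history]

truth = [random.random() < P_OCCUPIED for _ in range(N_SITES)]
histories = [survey(t) for t in truth]

def occupied_sites(labelled, k=2):
    """Score a site 'occupied' if the species is detected on at least k
    occasions; requiring repeats suppresses isolated false positives."""
    return [sum(h) >= k for h in labelled]

human = occupied_sites(histories)                    # error-free labels
ai = occupied_sites([classify(h) for h in histories])  # noisy AI labels

agreement = sum(h == a for h, a in zip(human, ai)) / N_SITES
print(f"site-level agreement between AI and human scoring: {agreement:.2f}")
```

Despite a 5% per-occasion labeling error, site-level conclusions from the noisy labels closely track the error-free ones, mirroring the article's point that repeated observations make the occupancy model robust to individual mistakes.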
Traditionally, early AI tools helped by filtering out blank images, which often made up 60-70% of camera trap data, but still required human experts to review tens of thousands of photos containing animals. This study went further by completely eliminating the final human review step, demonstrating that fully automated analysis is feasible for many species.
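The earlier blank-filtering workflow can be sketched as follows. The file names, labels, confidences, and threshold are made up for illustration, and this is not the SpeciesNet API; it only shows the general idea of discarding confident "blank" predictions while routing everything else to human review.

```python
# Hypothetical classifier output: one prediction per camera-trap image.
predictions = [
    {"file": "img_001.jpg", "label": "blank", "confidence": 0.98},
    {"file": "img_002.jpg", "label": "puma",  "confidence": 0.91},
    {"file": "img_003.jpg", "label": "blank", "confidence": 0.55},
    {"file": "img_004.jpg", "label": "wolf",  "confidence": 0.87},
]

BLANK_THRESHOLD = 0.90  # only discard blanks the model is confident about

def needs_review(pred):
    # Keep anything with an animal label, plus low-confidence blanks,
    # since a hesitant "blank" call may actually contain an animal.
    return pred["label"] != "blank" or pred["confidence"] < BLANK_THRESHOLD

to_review = [p["file"] for p in predictions if needs_review(p)]
print(to_review)  # img_001.jpg is confidently blank, so it is discarded
```

With 60-70% of images blank, even this simple step removes most of the workload; the study's contribution was showing that the remaining human-review step can also be dropped for many species.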
Eliminating analysis bottlenecks for small research teams
The time savings are dramatic. What once took six to 12 months can now be completed in just a few days or about a week, eliminating a major bottleneck in wildlife monitoring. This speed will allow conservationists and wildlife managers to move from data collection to actionable decisions more quickly, potentially enabling near real-time monitoring of species such as jaguars, wolves, and grizzly bears.
This efficiency is especially transformative for small or underfunded conservation organizations, which often lack the resources to process large datasets quickly. Faster analysis also allows monitoring programs to scale without being limited by processing capacity, increasing the scope and scale of conservation efforts.
The research team also contributed to the broader AI-for-conservation community by making part of the dataset publicly available. Shared data gives tools such as SpeciesNet access to more diverse and extensive training datasets, helping them improve.
“We weren’t trying to invent a new model,” Morris said. “We were asking, given the state of technology today, could we trust that technology to do the kinds of analysis that people are already doing?”
AI analysis may be less effective for rare species
However, limitations still remain. This study focused on a subset of species commonly captured on camera. Rare and easily confused species continue to pose challenges for AI detection, and many other uses of camera trap data still require human review. Despite these caveats, the results suggest that image processing no longer needs to be a major constraint in large-scale camera trapping studies.
“The big takeaway is that this doesn’t have to be a bottleneck,” Thornton said. “The faster we can process data, the faster we can respond. That’s what’s really important for conservation.”
Other co-authors of the study include Travis King and Lucy Perera Romero of Washington State University; Alyssa Anderson of WSU and Montana Fish, Wildlife and Parks; Ronnie García Unruh of the Wildlife Conservation Society’s Guatemala Program; Scott Fitkin of the Washington Department of Fish and Wildlife; and Carly Vynne of RESOLVE. All contributed to data collection, analysis, and manuscript preparation across research sites in Washington, Montana, and Guatemala.
This research represents an important step forward in leveraging AI technology to accelerate wildlife monitoring and conservation decision-making, promising a future where technology and ecology work together to protect biodiversity.
