Digital twins to foster peaceful human-wildlife coexistence

Quick-response team in action. Photo by Keith Hellyer

All over the world, human-wildlife coexistence is under pressure. There are many different reasons, but almost all are related to competition for land. People use more and more space, and when natural areas become too small, fragmented, or depleted, wildlife is forced to enter human-dominated landscapes in search of food, water, mates, and shelter. The consequences are sometimes grave: crops are eaten, livestock and people are attacked, and people and wildlife even get killed.

To reduce this problem, many nature conservation organisations engage with local communities to mitigate the damage and establish conditions for peaceful coexistence between people and wildlife. Known solutions include shepherds and guard dogs, financial schemes to compensate farmers, the funding and placement of fences to keep wildlife at bay, and early-warning systems to detect wildlife.

All these measures are very costly and not always effective. Hence, innovations are needed to drive down the costs and boost the effectiveness of existing measures.

The Digital Twin innovation that we are working on is aimed at boosting the effectiveness of existing measures. This is possible because Digital Twins are sophisticated simulation models that do three things:

  1. they predict where (groups of) animals are right now,

  2. they continuously update those predictions with real-time observations from the field, and

  3. their prediction algorithms automatically improve over time as they learn from every new observation.
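To make the predict-update-learn loop concrete, here is a minimal Python sketch of the idea behind such a model, using a toy particle filter. All names and numbers are illustrative, this is not our actual implementation, and it omits step 3, the re-training of the movement model itself:

```python
import random

class AnimalTwin:
    """Toy digital twin: tracks a cloud of candidate positions (particles)
    for a group of animals and refines it with field observations."""

    def __init__(self, n_particles=1000, area=(0.0, 100.0)):
        lo, hi = area
        # Start with no knowledge: particles spread uniformly over the area.
        self.particles = [(random.uniform(lo, hi), random.uniform(lo, hi))
                          for _ in range(n_particles)]

    def predict(self, step=1.0):
        # Step 1: predict where the animals are now by moving every particle
        # according to a (here trivially random) movement model.
        self.particles = [(x + random.gauss(0, step), y + random.gauss(0, step))
                          for x, y in self.particles]

    def update(self, observation, tolerance=5.0):
        # Step 2: update with a real-time observation by keeping only
        # particles that are consistent with the sighting.
        ox, oy = observation
        near = [(x, y) for x, y in self.particles
                if abs(x - ox) < tolerance and abs(y - oy) < tolerance]
        if near:  # resample around the consistent particles
            self.particles = [random.choice(near) for _ in self.particles]

    def estimate(self):
        # Best guess: the mean position of the particle cloud.
        xs, ys = zip(*self.particles)
        return sum(xs) / len(xs), sum(ys) / len(ys)

twin = AnimalTwin()
twin.predict()
twin.update(observation=(42.0, 58.0))  # e.g. a ranger sighting or camera-trap hit
print(twin.estimate())
```

The essence is that every field observation, whether from a ranger, a camera trap, or a sensor, narrows down the cloud of possible animal positions.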

Together with Wageningen University and others, we are developing Digital Twins for cranes, bears, and elephants (more species will follow!).

Stay tuned if you want to use our Digital Twins to foster peaceful human-wildlife coexistence!

Intro to CAIMAN

Camera traps are very handy tools for nature conservation professionals. Among other things, they are used to

  • spot rare species,

  • record nocturnal species,

  • assess biodiversity,

  • recognise individuals,

  • conduct a structured census,

  • monitor places of interest, such as water holes or bird nests,

  • detect poachers and other intruders.

Each use case brings its own challenges and implications for the camera setup and the processing of the recorded images. To cater for each of the above use cases, our CAIMAN solution consists of four services that can be fine-tuned:

  1. a connection service,

  2. a process configuration service,

  3. a human-in-the-loop service, and

  4. reporting services.

Step 1: Connection service

If real-time detection is of the essence, for example when you want to intercept poachers, the cameras need to be connected to the internet. If they are, they can stream their images directly to CAIMAN through an API (for the technicians among us).

Most cameras, though, store their images on an SD card, which is collected every now and then. After the images have been collected from the field, they can be uploaded to the data upload service of Sensing Clues.
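For the technicians: a batch upload from a collected SD card could look something like the sketch below. The endpoint, token, and field names are hypothetical placeholders, not the actual CAIMAN or Sensing Clues API:

```python
import pathlib
import requests

UPLOAD_URL = "https://example.org/caiman/upload"  # hypothetical endpoint
API_TOKEN = "your-api-token"                      # hypothetical credential

def upload_images(folder):
    """Upload every JPEG collected from an SD card to the upload service."""
    for image in pathlib.Path(folder).glob("*.jpg"):
        with open(image, "rb") as f:
            response = requests.post(
                UPLOAD_URL,
                headers={"Authorization": f"Bearer {API_TOKEN}"},
                files={"image": (image.name, f, "image/jpeg")},
                data={"camera_id": "waterhole-03"},  # hypothetical metadata field
            )
        response.raise_for_status()  # stop early if an upload fails

upload_images("/media/sdcard/DCIM")
```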

Step 2: Process configuration service

In essence, AI-driven image classification is a statistical exercise: species are identified with a certain level of probability. As some species are easy to recognise while others are very hard to distinguish from similar species, the quality of the AI model varies per species. A confidence of 100% is very hard, if not impossible, to reach; above 80 to 90% is often more realistic.

AI models are trained and made available per geographic region and per use case, as illustrated above. The first solution that we are currently testing is aimed at identifying over 200 species that live in Southern Africa.

To tune the classification process to your needs, minimise mistakes, and minimise the time you spend behind the computer, thresholds can be set per species. If a species is very important to you, you can set the threshold for automatically accepting the outcome of the algorithm very high. If a species is more abundant and classification mistakes are less costly, you can lower the threshold for that species.
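As a sketch of how such per-species thresholds work (the species names and numbers are illustrative, not our defaults):

```python
# Hypothetical per-species acceptance thresholds: rare or high-stakes
# species get a high bar, abundant species a lower one.
THRESHOLDS = {
    "african_wild_dog": 0.95,  # rare and important: only auto-accept when very sure
    "impala": 0.70,            # abundant, so mistakes are cheap
}
DEFAULT_THRESHOLD = 0.85

def route(prediction):
    """Decide whether a classification is auto-accepted or sent to a human."""
    species, confidence = prediction
    if confidence >= THRESHOLDS.get(species, DEFAULT_THRESHOLD):
        return "auto-accept"
    return "human review"

print(route(("impala", 0.82)))            # -> auto-accept
print(route(("african_wild_dog", 0.82)))  # -> human review
```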

Step 3: Human-in-the-loop validation service

Classifications with probabilities below the threshold are treated in a separate process. In this process we select the images from which the model can learn most. As soon as you have verified the images and confirmed or corrected their classes, the AI model is re-trained. This speeds up the learning process and decreases the number of images that need to be sifted through manually.
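A common technique for selecting the images from which a model learns most is uncertainty sampling. The sketch below is illustrative, not necessarily how CAIMAN selects images; it ranks images by the margin between the two most likely species:

```python
def most_informative(predictions, k=10):
    """predictions: list of (image_id, [class probabilities]).
    Rank images by the margin between the top two class probabilities:
    a small margin means the model is torn between two species, so a
    human label for that image teaches the model the most."""
    def margin(probs):
        top2 = sorted(probs, reverse=True)[:2]
        return top2[0] - top2[1]
    return sorted(predictions, key=lambda p: margin(p[1]))[:k]

batch = [("img_001", [0.48, 0.46, 0.06]),   # very uncertain: review first
         ("img_002", [0.97, 0.02, 0.01])]   # confident: can wait
for image_id, _ in most_informative(batch, k=1):
    print(image_id)  # -> img_001
```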

PS: We are still working on the human-in-the-loop app. The picture above shows an experiment to quickly verify series of images and find oddities, potentially saving you hundreds of hours.

Step 4: Reporting services

The classified images are stored in the WITS data platform. Like Cluey observations, classified camera-trap images are treated as observations. Hence, they are organised in a Group and are made by an Agent (in this case, the name of the camera). And like Cluey observations, you can visualise and analyse them with Focus, WildCAT, ArcGIS Online, or any other tool of your preference (e.g. RStudio, Python, Jupyter Notebooks).
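As an illustration of the Python route, assuming the observations have been exported to a CSV file (the file name, column names, and species label below are made up), a first analysis takes only a few lines:

```python
import pandas as pd

# Hypothetical export of classified camera-trap observations.
obs = pd.read_csv("observations.csv")  # assumed columns: timestamp, agent, species

# Sightings per species per camera ("agent").
counts = obs.groupby(["agent", "species"]).size().unstack(fill_value=0)
print(counts)

# Daily activity pattern for one species of interest.
obs["hour"] = pd.to_datetime(obs["timestamp"]).dt.hour
print(obs[obs["species"] == "leopard"].groupby("hour").size())  # hypothetical label
```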

Sound recognition - live in Amsterdam!

Below is a translated article about our sound event recognition sensor SERVAL. To speed up development we open-sourced it, resulting in a great collaboration with the IoT Sensemakers AMS.

Enjoy the read!

How does the Marineterrein sound?

10 AUGUST 2021

Noise pollution is a big problem in the city. But to tackle it, you first need to know what exactly causes the noise. At the Marineterrein Amsterdam, tests are now being conducted with a sensor that can classify sounds and thus pinpoint their source. And it does so with technology that originated in the jungle to stop poachers.

It all started with a gunshot in the jungle of Laos, followed by the sound of a boat sailing away. It made Jan Kees Schakel of Sensing Clues realise that it is easy for poachers to make off with their 'loot' under cover of night. But he also realised that the sound the poachers produce is the way to stop them.

Sound sensor

'Humans are noisy creatures,' says Jan Kees. 'Basically any sound we make - talking, driving a vehicle and certainly gunshots - carries far and can therefore be picked up well by a sensor. That is a big advantage over cameras, which are limited by what their lens can "see". A sound sensor can help conservationists map out what is happening, and where, at any given time. After all, if voices can be heard in the jungle in the middle of the night, you can be reasonably sure that something is wrong.'

Complex

That sounds good, but it is easier said than done. Until about five years ago, only the number of decibels of a sound could be detected; the sound could not be classified. In other words, a sensor could indicate that a loud sound was being produced somewhere, but not whether it came from a slamming car door or an elephant. 'And it is extremely difficult to make that distinction,' says Jan Kees. 'You need an enormous database of sounds, which then serves as a frame of reference for a complex algorithm to classify sounds.'

Practical challenges

'But since 2016, artificial intelligence and machine learning have taken off,' he continues. 'The technology is getting better and better, and we can now recognise multiple sounds with one algorithm. But it is still important to have a large database of sounds, and there are also practical challenges. There has to be power, a good internet connection, and the hardware around the sensor has to be able to withstand the elements. To get everything working optimally we need to do a lot of testing.'

Test location: urban jungle

Since testing sound sensors in the jungle is expensive and complicated, Jan Kees decided to do it closer to home; in the urban jungle, to be precise. The Marineterrein has recently been equipped with a sound sensor that maps out city noise and noise pollution. Jan Kees joined forces with Sensemakers AMS, old friends of ours who have been active at the Marineterrein for years, measuring and interpreting the water quality in the inner harbour, for example. With their combined knowledge, they developed a test set-up that will provide insight into the sound of the Marineterrein, and the extent to which there is nuisance.

The sensor of Sensing Clues and the IoT Sensemakers AMS

Training algorithms

'With this project, we can gain a wealth of experience in training our algorithms,' says Jan Kees. 'That is useful for conservationists, because in the jungle we look for the same kinds of sounds as here: voices, laughter, the sound of car engines and doors slamming. But with what we do here now, we can also tackle nuisances in the city. Noise pollution can reduce residents' enjoyment of living and even cause stress. But it is often difficult for people to say exactly what is bothering them. By classifying urban noise, we can pinpoint the exact time and source of various noises and use this information to tackle the problem in a targeted way.'

Alarming noises

According to Jan Kees, this mainly concerns intrusive, loud noises. 'As a resident, you often no longer hear overflying aircraft or trams; you get used to them. But sudden, loud noises trigger us. It is still deeply ingrained in our cognition to be alarmed by such sounds, as a reaction to possible danger. Think of accelerating scooters and alarms, but also of bicycle bells. By classifying these types of sounds, we can pinpoint exactly where the nuisance comes from. It is important that we do this while preserving privacy. The sound is processed on the sensor and not stored. Only the sound labels, such as "moped" or "scooter alarm", are stored. So only labels remain, and no sound.'
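In code terms, that privacy-by-design idea boils down to something like the sketch below, where the microphone, classifier, and publish parameters stand in for the device's recording, sound recognition, and messaging layers. This is illustrative only, not SERVAL's actual implementation:

```python
import time

def process_audio_stream(microphone, classifier, publish, min_confidence=0.8):
    """Classify sound on the device itself; only labels ever leave the sensor."""
    while True:
        clip = microphone.record(seconds=1)   # raw audio stays in memory
        label, confidence = classifier(clip)  # e.g. ("moped", 0.91)
        del clip                              # raw sound is never stored or sent
        if confidence >= min_confidence:
            publish({"label": label, "time": time.time()})
```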

Jungletech also for a liveable city

With the knowledge gained from this test at the Marineterrein, the technology can be used on a larger scale in the city in the future: in squares, in entertainment areas, and even in individual cafés, to make visitors aware of the nuisance they may be causing. In this way, Jan Kees hopes that his 'jungle technology' can contribute to a liveable city. 'But of course we eventually want to bring the lessons we learn here back to the jungle. That will require some further development: in the bush there are no power outlets or wifi, so we have to find good and cheap solutions for that. The most important thing is that the technology remains affordable for conservationists, so that they can always stay a step ahead of poachers.'

Sensing Clues

Sensing Clues is a non-profit foundation and largely relies on volunteers. You can support Jan Kees' fight against poachers by making a donation. Do you have the knowledge and skills to contribute to this project? Then get in touch with Sensemakers AMS. Electronics engineers and people who like to get started with data analysis and visualisation are particularly welcome.

Text: Sjoerd Ponstein

Translated by Deepl.com

Sponsor our open source projects!

CAIMAN

In the Kaggle iWildCam 2021 competition, the species recognition and counting algorithm of the Sensing Clues team finished second by a split second. To increase its spread and impact, we recently open-sourced the project. With the funds we receive from sponsors, we integrate the results of CAIMAN into our WITS platform, which increases the value of these images tremendously.

OpenEars & SERVAL

With our volunteer friends of IoT Sensemakers Amsterdam we have developed a highly sophisticated sound recognition sensor. It recognises many different sounds related to the presence of people, ranging from gunshots, motorbikes, trucks and chainsaws to music, barking dogs, and cattle. The sensor is currently being tested in the city jungle of Amsterdam. To be ready for monitoring and protecting the jungles of Africa, Asia, and the Amazon, a few more development steps have to be taken. Your sponsorship brings that reality closer!

Oh yes, OpenEars is the name of the sensor, hardware and all; SERVAL is the name of the sound recognition algorithm that we created. And both are open source.

Human-Wildlife Conflict Mitigation in Tanzania

As of March this year, PAMS Foundation will be using Cluey when inspecting their chili fences. These fences help farmers living adjacent to wildlife areas to mitigate human-wildlife conflicts, as elephants avoid these fences.

PAMS uses both Cluey and the analytical Focus application to better protect the enormous biodiversity in the remote southern areas of Tanzania.


Cheetah monitoring in Kenya

Cheetahs are the fastest mammals on land. At top speed they may reach 100 km per hour, which they can sustain for about 20 seconds; enough to outrun their prey. In the wild, only about 7,000 individuals are left.

Sensing Clues supports CSI Wildlife in the Cheetah Research Project in Kenya, aimed at identifying, counting, and monitoring Cheetah, and at determining their range and territory.

The information is critical for taking informed protection measures and to conserve their habitat.

Please hit the Donate button if you want to contribute to the protection of cheetahs. Every gift, big or small, counts!

Release of the year: completely renewed back-end and Focus!

We are very proud to announce a completely renewed back-end and a brand-new Focus, with a clean interface, improved interaction, and optimised performance. Thanks to this, analysing and exploring your area will be much more intuitive and efficient. We also enhanced security, so you can feel even safer creating your safe havens.

Training and support in Burkina Faso

A few months ago, Sensing Clues partnered up with Chengeta Wildlife and the CWTI to tackle incredibly hard wildlife conservation problems in an area riddled with conflict.

Read more about Chengeta Wildlife and CWTI in a blog post from ESRI, another valued partner of ours.

“When we deal with a problem in a given area, analytics and actionable intelligence inform our planning, coordination, and execution of missions,” Young said. “Improving the technology allows for improved command and control.”

Services that Sensing Clues is providing include:

  • Conducting Foundational Analysis to provide Anti-Poaching Units with a head-start

  • Training and mentoring analysts to conduct such analytics on their own

  • Training and mentoring in law enforcement and criminal investigation principles and (judicial) processes

  • Training APU-members and analysts in data collection methods and tools

  • Supporting them with the tools to conduct their work efficiently and effectively


Webinar: Using computer vision to keep track of animals in the wild

On 29 October, Jan Kees Schakel will join Mike Kraus of Vantage AI for the webinar ‘Using computer vision to keep track of animals in the wild’, to show how our Solution Partner Vantage AI helped us develop an image recognition solution.

Camera trapping for wildlife insights

Last year we announced Vantage AI as our newest Solution Partner, and we are proud to announce our first close collaboration: creating an algorithm that can detect species captured by camera traps in different regions across the globe.

Optimising Ranger Patrols

In this blog post we illustrate how our tools and engineers created models that estimate risks and suggest patrol routes to support our Field Partners.

Protecting the lion in Africa, starts at Leidseplein in Amsterdam

Dutch newspaper ‘Het Parool’ interviewed Sensing Clues founder Jan Kees Schakel about our Serval Sensor, the Amsterdam Sounds project and Sensing Clues. Read the full article right here.