
Communications of the ACM

ACM News

Using AI to Improve Farm Animal Welfare



Facial recognition technology and machine learning can be used to determine the emotional states of individual farm animals.

Credit: sensiblevision.com

The ability to assess the emotions of farm animals could help improve their welfare.

When animals are stressed or distressed, for example, the result can be a variety of harmful behaviors: pigs may bite each other's tails, which can cause infections, while chickens may engage in cannibalism, pecking at their flockmates' feathers, combs, or toes, which can kill them.

Farmers typically focus on checking their animals' physical health rather than their mental state. "If we knew how to measure their emotions, we could bring down that unwanted behavior and enhance their quality of life," says Suresh Neethirajan, an associate professor at Wageningen University in the Netherlands.

Researchers are now working on automated approaches that could help detect how animals are feeling. Experienced farmers often can gauge the emotional states of their livestock by observing cues such as tail wagging and an animal's gait, but their assessments are subjective. Blood tests can measure hormones such as oxytocin, elevated levels of which correlate with positive feelings, but such testing is intrusive and can cause additional stress.

Further, it is difficult to continuously monitor a large number of animals manually and identify problematic behavior early on. "There is a need to move from a reactive to a predictive approach," says Neethirajan. "Artificial intelligence and sensor technologies that help us to measure and quantify emotions can enable proactiveness."   

Neethirajan and his colleagues have been exploring how facial recognition technology and machine learning can be used to uncover the emotional states of farm animals. In recent work focusing on cows and pigs, the researchers collected thousands of videos and still images of the faces of different breeds from farms in Canada, the U.S., and India. The data was annotated by animal behavior experts to map different facial features to three mental states in cows (excited, relaxed, or frustrated) and six in pigs (such as feeling aggressive, neutral, or positive). A cow is considered to be feeling relaxed when its ears hang down loosely and less of the white of its eyes is visible, for example.
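To make that annotation step concrete, here is a toy sketch in Python of how an expert-defined cue-to-label mapping might be encoded. The cue names, the threshold, and the function itself are illustrative assumptions; the article describes only one example cue pairing, and the researchers' actual annotation protocol is not detailed here.

    COW_STATES = ("excited", "relaxed", "frustrated")

    def annotate_cow_frame(ears_hang_loosely: bool, eye_white_fraction: float) -> str:
        """Toy annotation rule based on the article's example: loose ears and
        little visible eye white suggest a relaxed cow. The 0.2 threshold is
        a made-up placeholder; real labels come from trained experts."""
        if ears_hang_loosely and eye_white_fraction < 0.2:
            return "relaxed"
        return "undetermined"  # other cue combinations are deferred to an expert

    print(annotate_cow_frame(ears_hang_loosely=True, eye_white_fraction=0.1))  # relaxed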

Using the visuals captured by the team, three different deep learning models were trained to recognize the facial features of interest and infer an animal's emotions. When tested on previously unseen images, the models predicted the emotional states of a cow or pig with an average accuracy of 85%, and up to 90% for certain emotions. Such a system could be developed into a real-time tool for farmers that uses cameras to monitor livestock, paired with an app that decodes what an animal is feeling. "It would be a way to communicate with the animals and give them what they need in a particular moment," says Neethirajan.
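The article does not specify which architectures the three models used, but a common recipe for this kind of image classification task is to fine-tune a pretrained convolutional network. A minimal sketch in Python with PyTorch follows; the directory name, model choice, and training settings are all assumptions, not the researchers' published pipeline.

    import torch
    import torch.nn as nn
    from torchvision import models, transforms
    from torchvision.datasets import ImageFolder
    from torch.utils.data import DataLoader

    # Expects a directory tree like cow_faces/excited/..., cow_faces/relaxed/...
    # (hypothetical path and layout).
    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])
    dataset = ImageFolder("cow_faces", transform=preprocess)
    loader = DataLoader(dataset, batch_size=32, shuffle=True)

    # Fine-tune a pretrained network; the final layer outputs the three
    # cow states named in the article (excited, relaxed, frustrated).
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, 3)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    model.train()
    for images, labels in loader:  # one epoch, for brevity
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()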

Peter Gloor, a research scientist at the Massachusetts Institute of Technology's Center for Collective Intelligence who is building an emotion-tracking system for cows, considers the system developed by Neethirajan's team valuable work. Machine learning has often been used to classify human emotions from facial expressions, but such work is in its infancy for animals. Gloor thinks including body posture cues could reveal more about how an animal is feeling, a capability Neethirajan says he would like to incorporate into future versions of the system.

Neethirajan and his colleagues also are working on a similar system to determine the emotional states of chickens based on their movements, sounds, and body temperatures as measured by thermal cameras. If a chicken is anxious, for example, the temperature of its throat and comb can change faster than that of the rest of its body. However, capturing data on individual chickens is more difficult, since they are smaller and kept in large groups. "When we visit a poultry farm, there are hundreds or thousands of egg-laying hens on the rearing floor," says Neethirajan. "That's the biggest challenge."
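As a rough illustration of that thermal cue, the sketch below flags moments when a bird's comb temperature is changing much faster than its body temperature. The region extraction, the sampling scheme, and the 0.5 °C-per-minute threshold are invented for illustration and are not from the researchers' system.

    import numpy as np

    def rate_of_change(temps, timestamps):
        """Per-minute temperature change, estimated by a least-squares slope."""
        return np.polyfit(timestamps, temps, 1)[0] * 60.0

    def anxiety_flag(comb_temps, body_temps, timestamps, threshold=0.5):
        comb_rate = rate_of_change(np.asarray(comb_temps), np.asarray(timestamps))
        body_rate = rate_of_change(np.asarray(body_temps), np.asarray(timestamps))
        # Flag when the comb changes much faster (in either direction) than the body.
        return abs(comb_rate - body_rate) > threshold

    # Example: six temperature samples (degrees C) taken 10 seconds apart.
    t = [0, 10, 20, 30, 40, 50]
    print(anxiety_flag([33.0, 33.2, 33.5, 33.9, 34.2, 34.6],
                       [39.0, 39.0, 39.1, 39.0, 39.1, 39.1], t))  # True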

Another group of researchers is focusing on automatically monitoring the emotional states of pigs by analyzing their vocalizations. Research has shown that pigs produce high-pitched squeals in negative situations, such as when they are fighting or waiting at the slaughterhouse. Low-frequency sounds like grunts typically are uttered in both positive and negative contexts, but their calls last longer in unpleasant scenarios. "Vocalizations are good indicators of emotions," says Elodie Briefer, an associate professor at the University of Copenhagen in Denmark.

Briefer collaborated with colleagues from several European countries to build a dataset of 7,414 pig sounds recorded over the course of the animals' lives, from birth to slaughter, in different situations such as nursing or fighting. The team then trained a convolutional neural network (CNN), a type of deep learning model, to distinguish positive from negative calls using visual representations of the sounds called spectrograms. When tested on spectrograms it hadn't seen before, the system classified vocalizations with up to 92% accuracy. It also could identify which of 19 situations prompted a call, with 82% accuracy.
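The published model's exact design is not described here, but the general pipeline (a log-mel spectrogram in, class scores out) can be sketched in Python as follows. The network layout, input size, and file name are illustrative assumptions, and the network below is untrained; in practice it would be trained on the team's labeled calls.

    import librosa
    import numpy as np
    import torch
    import torch.nn as nn

    def call_to_spectrogram(path, sr=22050, n_mels=64, frames=128):
        """Convert one recorded call into a fixed-size log-mel spectrogram."""
        y, sr = librosa.load(path, sr=sr)
        S = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
        S_db = librosa.power_to_db(S, ref=np.max)
        # Pad or crop to a fixed width so every call yields the same shape.
        if S_db.shape[1] < frames:
            S_db = np.pad(S_db, ((0, 0), (0, frames - S_db.shape[1])))
        return torch.tensor(S_db[:, :frames], dtype=torch.float32)[None, None]

    classifier = nn.Sequential(            # input: (batch, 1, 64, 128)
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 16 * 32, 2),        # two classes: positive, negative
    )

    spec = call_to_spectrogram("pig_call.wav")   # hypothetical file
    logits = classifier(spec)
    print(["positive", "negative"][logits.argmax().item()])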

The team would like to build an app or a tool for farmers that uses their model. A microphone could be hung above a group of pigs, for example, and the system could detect how many positive and negative calls are made each day by the animals. It could then send the farmer a daily message with the percentage of each type of call recorded. "If there are too many negative ones, then (the farmer) can intervene," says Briefer. "(He or she) could check if the pigs are not getting along well, maybe change the group or give them some enrichment or a bit more space."
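The counting-and-alerting logic Briefer describes is straightforward. A minimal sketch, assuming a classifier like the one above emits a "positive" or "negative" label per detected call; the 30% alert threshold and the message wording are invented for illustration.

    from collections import Counter

    def daily_summary(call_labels, alert_threshold=0.30):
        """Summarize one day of classified calls and flag high negative shares."""
        counts = Counter(call_labels)
        total = sum(counts.values())
        if total == 0:
            return "No calls detected today."
        negative_share = counts["negative"] / total
        message = (f"{total} calls: {counts['positive'] / total:.0%} positive, "
                   f"{negative_share:.0%} negative.")
        if negative_share > alert_threshold:
            message += " Check the pen: consider regrouping, enrichment, or more space."
        return message

    print(daily_summary(["positive"] * 120 + ["negative"] * 80))
    # -> "200 calls: 60% positive, 40% negative. Check the pen: ..."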

Briefer thinks combining vocalizations with other cues, such as body posture, could produce more accurate models. She adds that similar models could be developed for other farm animals such as chickens, which are very vocal and are known to express emotions through the sounds they make.

There is potential to monitor less-vocal species too, by keeping track of the number of calls they make. "A decrease in vocalisations might be information that an animal is either calm or actually the opposite, depressed," says Briefer. "It is more limited in other species compared to pigs and chickens but it should also work."

 

Sandrine Ceurstemont is a freelance science writer based in London, U.K.


 
