Will AI Enable Us to Understand and Communicate with Animals?
Summary
TL;DR: The Earth Species Project aims to decode animal communication using artificial intelligence, helping us understand and connect with nature beyond human perception. By leveraging AI, the project seeks to analyze multimodal data, denoise recordings, and classify vocalizations, ultimately enhancing conservation efforts and offering new scientific perspectives on our relationship with the planet.
Takeaways
- 🌐 The speaker thanks the World Government Summit for the opportunity to discuss decoding animal communication using AI.
- 🐧 A bearded seal's mating call is used as an example to illustrate the foreignness of nature and the limitations of human perception.
- 🌺 Research shows that plants can 'hear' and 'see', emitting sounds at frequencies inaudible to humans, like a distress signal at 70 kHz.
- 🐠 Coral larvae intentionally navigate to suitable settlement locations, guided by the sounds of healthy versus unhealthy reefs.
- 🐢 The Amazonian River turtle communicates with more than 200 distinct vocalizations, and mothers talk to their offspring in the eggs before hatching.
- 🌏 Biodiversity loss is a significant issue: monitored wildlife populations have declined by 69% on average since 1970, and wild animals now make up only about 4% of mammals.
- 💡 AI is proposed as a tool to reconnect humans with nature, similar to how the telescope expanded our understanding of space.
- 🌟 The Earth Species Project is a nonprofit organization focused on using AI to decode animal communication, with a diverse team of researchers.
- 🛠️ The project involves building AI models to classify animal vocalizations and movements, and to automate tasks for researchers.
- 🎵 The first benchmarks for animal sounds and movement have been published, along with a foundation model for animal vocalizations.
- 🤖 Experiments with generative AI aim to engage in two-way communication with animals, starting with zebra finches, potentially opening new scientific frontiers.
Q & A
What is the main focus of the speaker's work at the World Government Summit?
-The speaker's main focus is on decoding animal communication using artificial intelligence, essentially figuring out what animals are saying, and exploring how technology can bring us closer to nature.
Why did the speaker choose to play the sound of a bearded seal's mating call?
-The speaker played the sound of a bearded seal's mating call to illustrate the foreignness of nature and the limitations of human perception in understanding the world around us, as many species communicate at frequencies inaudible to humans.
What does the study with the evening primrose flower demonstrate about plants?
-The study with the evening primrose flower demonstrates that plants can 'hear' and respond to sounds, as they produced more and sweeter nectar when approached by a pollinator, indicating a form of perception or communication in plants.
How has recent research changed our understanding of coral larvae?
-Recent research has shown that coral larvae intentionally navigate to suitable places for settlement, and they can distinguish between the sounds of healthy and unhealthy coral reefs, suggesting a level of awareness and communication previously unknown.
What is the significance of the Earth Species Project?
-The Earth Species Project is a nonprofit organization focused on using AI to decode animal communication, aiming to better connect humans to nature and potentially address the challenges posed by biodiversity loss and the climate crisis.
What are some of the challenges faced when gathering and analyzing animal communication data?
-Challenges include the vast amount of data, noise interference, and overlapping vocalizations from multiple animals, making it difficult to discern individual communication patterns and apply machine learning effectively.
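The talk contains no code, but the denoising step mentioned here can be illustrated with a minimal spectral-gating sketch in pure Python. The function names, frame size, and threshold below are illustrative assumptions, not the Earth Species Project's actual pipeline:

```python
import cmath
import math

def dft(frame):
    """Naive discrete Fourier transform (O(N^2), fine for a small demo)."""
    n = len(frame)
    return [sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spectrum):
    """Inverse DFT; returns the real part since the input signal is real."""
    n = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

def spectral_gate(signal, frame_size=64, threshold_ratio=3.0):
    """Zero out frequency bins whose magnitude falls below a multiple of the
    frame's median bin magnitude: strong tonal components (calls) survive,
    broadband background noise is discarded. Trailing samples that do not
    fill a whole frame are dropped for simplicity."""
    out = []
    for start in range(0, len(signal) - frame_size + 1, frame_size):
        spectrum = dft(signal[start:start + frame_size])
        mags = sorted(abs(c) for c in spectrum)
        median = mags[len(mags) // 2]
        gated = [c if abs(c) > threshold_ratio * median else 0.0
                 for c in spectrum]
        out.extend(idft(gated))
    return out
```

Real bioacoustics tooling would use FFT libraries and overlap-add windowing; this naive version only shows the idea of separating a call from background noise in the frequency domain.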
What is the role of machine learning models in the Earth Species Project?
-Machine learning models are used to analyze multimodal data, including vocalizations and behavioral data, to classify animal communications, make associations, and test hypotheses through experiments.
What are the first benchmarks published by the Earth Species Project?
-The Earth Species Project has published the first-ever benchmarks for animal sounds and animal movement, as well as the first foundation model for animal vocalizations, which can be used for detection and classification across different species.
How does the speaker envision the future of AI in animal communication research?
-The speaker envisions AI enabling two-way communication with animals, potentially leading to a deeper understanding of animal communication structures and offering new scientific frontiers and perspectives on human-nature interactions.
What are the potential conservation benefits of understanding animal communication?
-Understanding animal communication could help mitigate human-wildlife conflicts, inform conservation strategies, and provide insights into animal behaviors and needs, ultimately contributing to more effective and empathetic conservation efforts.
What ethical considerations does the speaker mention regarding AI and animal communication?
-The speaker mentions the need for careful management of AI's interaction with animals to avoid unintended consequences, such as interfering with natural animal cultures or ecosystems without fully understanding the communication's content.
Outlines
🐾 Introduction to Animal Communication and AI
The speaker expresses gratitude to the World Government Summit for the opportunity to discuss the project of decoding animal communication using artificial intelligence. The aim is to understand what animals are saying. The speaker plays a sound from nature, a bearded seal's mating call, to illustrate the foreignness of nature and the limitations of human perception. The speaker emphasizes the vast amount of communication happening around us that we cannot perceive, such as whales, bats, and even plants, highlighting the need to connect with the world we inhabit.
🌿 The Importance of Connecting with Nature
The speaker discusses the existential biodiversity loss and climate crisis, noting that monitored wildlife populations have declined by 69% on average since 1970. The speaker points out the disconnection humans have from nature and the importance of technology in bridging this gap. The Earth Species Project is introduced as a nonprofit organization focused on AI research to decode animal communication, emphasizing the interdisciplinary team and the importance of collaboration with field researchers. The speaker also highlights the growing public interest in this field, following the release of models like ChatGPT.
🔍 AI in Decoding Animal Communication
The speaker explains the methodology of the Earth Species Project, which involves collecting data on animal vocalizations and movements, and using machine learning models to classify and understand these communications. The speaker discusses the challenges of gathering and analyzing data, such as noise and overlapping vocalizations. The project's goals include building AI tools to automate tasks that are too time-consuming or difficult for humans, and creating foundational models that can be applied to various species.
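The "automate tasks" goal, for example finding where animals vocalize in thousands of hours of recordings, can be sketched as a simple energy-based detector. This is illustrative only: real detectors are learned models, and the names and thresholds below are assumptions.

```python
import math

def detect_segments(signal, frame_size=160, threshold_ratio=5.0):
    """Flag frames whose RMS exceeds a multiple of the median frame RMS,
    then merge consecutive flagged frames into (start, end) sample ranges.
    Returns a list of candidate vocalization segments."""
    n_frames = len(signal) // frame_size
    rms = []
    for i in range(n_frames):
        frame = signal[i * frame_size:(i + 1) * frame_size]
        rms.append(math.sqrt(sum(x * x for x in frame) / frame_size))
    median = sorted(rms)[n_frames // 2]
    threshold = threshold_ratio * median
    segments, start = [], None
    for i, r in enumerate(rms):
        if r > threshold and start is None:
            start = i * frame_size          # segment opens
        elif r <= threshold and start is not None:
            segments.append((start, i * frame_size))  # segment closes
            start = None
    if start is not None:                   # signal ends mid-segment
        segments.append((start, n_frames * frame_size))
    return segments
```

Run over an archive of recordings, a detector like this turns raw audio into a shortlist of time ranges a biologist (or a downstream classifier) actually needs to look at.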
🎥 Experiments and Conservation Potential
The speaker shares a video of a scientist placing a sensor on a whale to gather data, highlighting the vast amount of data being collected. The speaker discusses the potential of AI in conservation, such as reducing human-wildlife conflicts and understanding animal behaviors that could lead to better protection measures. The speaker also touches on the ethical considerations of AI communicating with animals and the potential risks involved. The speaker concludes by emphasizing the transformative potential of this work in opening new scientific frontiers and changing human perspectives on nature.
🤝 Inviting Partnerships in the Journey
The speaker invites interested parties to join and support the Earth Species Project in its mission to unlock a deeper understanding of animal communication. A dedicated email address is provided for those interested in following the work and becoming partners in this effort.
Keywords
💡Artificial Intelligence (AI)
💡Animal Communication
💡Biodiversity Loss
💡Conservation
💡Machine Learning Models
💡Multimodal Data
💡Foundation Models
💡Generative AI
💡Human-Wildlife Conflict
💡Ethical Considerations
Highlights
World Government Summit invited the speaker to discuss decoding animal communication using AI.
The speaker played a bearded seal's mating call to illustrate the foreignness of nature and the limits of human perception.
Evening primrose flowers produce sweeter nectar when approached by a pollinator, showing plants can 'hear'.
Plants emit distress signals at 70 kHz, a frequency inaudible to humans but not to cats.
Coral larvae intentionally navigate to good places to settle, guided by the sounds of healthy versus unhealthy reefs.
Amazonian River turtles have over 200 distinct vocalizations and communicate with their offspring before hatching.
Since 1970, monitored wildlife populations have declined by 69% on average, highlighting the biodiversity crisis.
Earth Species Project is a nonprofit organization using AI to decode animal communication, bridging the gap between humans and nature.
The project involves a multidisciplinary team of AI researchers, neuroscientists, mathematicians, and physicists.
A roadmap for decoding animal communication was published in Science magazine, outlining the stages of the process.
The project faces challenges with data analysis, including noise and the simultaneous vocalizations of multiple animals.
The first benchmarks for animal sounds and movement have been published, along with the first foundation model for animal vocalizations.
The project aims to build AI tools that automate tasks too time-consuming or difficult for humans in animal communication research.
The potential of AI in conservation is discussed, including reducing human-wildlife conflict and understanding animal behavior.
The project's goal is to unlock a deeper understanding of nature, potentially changing our relationship with the planet.
Experiments with generative AI and two-way communication with animals, like zebra finches, are planned for the future.
The speaker emphasizes the need for careful management of AI's interaction with animals to avoid unintended consequences.
The project is raising funds and seeking partners to further its mission of connecting humans to nature through AI.
Transcripts
um huge thanks first of all to uh World
Government Summit for inviting me here
for the opportunity to tell you a little
bit more about our work to decode animal
communication using artificial
intelligence or put more simply to
figure out what animals are saying and
uh I'm really excited actually in many
ways that we left this to the last day
because I think this is going to be one
of the most fun sessions uh that you
attend during this Summit I wanted to
start start by playing you uh a sound
from nature and uh I would like you to
have a little think about what animal
makes this
sound okay any thoughts from the
audience whale sorry sperm
whale Orca you're you're
close penguin sea lion very close this
is a bearded seal it's the mating call
of a bearded seal from the Arctic he's a
pretty cute little guy uh and I think
the the reason I wanted to play this for
you at the outset is because I think
it's a great illustration of kind of the
the
foreignness of nature of the other
species that we inhabit this planet with
um and I think what this really leads to
is a sense
of the fact that we as human beings are
so limited in our abilities to actually
perceive what is happening in the world
around us whales communicate in
frequencies too low for us to hear bats
communicate in frequencies too high for
us to hear in the ultrasonic and the
infrasonic and there is just so much
going on around us and this really
limits our ability to understand and
connect to the world that we
inhabit there's some really great
examples of this uh this is an evening
primrose flower amazing study by
University of Tel Aviv done in 2019
where they played a whole Suite of
sounds to the flowers and then tested
the sweetness of their nectar following
those sounds and only when the flowers
were approached by a
pollinator did they produce almost
instantly more and sweeter
nectar so what does this tell us it
tells us that in some way plants can
hear and there are other studies that
show that plants can actually probably
see we don't know
how the latest research shows that when
plants are distressed
they actually emit sounds but at
frequencies that we can't even begin to
hear I was attending a talk recently and
the speaker just blew me away with this
analogy which is that humans can hear at
about 20
kHz your cat can hear at about 70 kHz and
when a plant is emitting a distress
signal it's at about 70 kHz so if you
think about it you could be sitting at
home drinking a cup of tea and you could
have forgotten to water your house
plants and your cat might be able to
hear them calling out for water and you
have no idea that this is going on
around
you this next example coral larvae which
we used to believe were just these
little tiny organisms pushed out by the
coral and they just got floated around
in the current and landed wherever they
landed some successfully some not but
the latest research shows that in actual
fact coral larvae very intentionally
navigate to a place to settle that will
be good they can actually hear the
sounds of a healthy reef versus an
unhealthy coral reef they're figuring
out where they're going and we actually
have no idea how they do
that this is a great uh Amazonian River
turtle
again up until about 10 years ago we
thought these creatures were entirely
silent new research by uh Camila Ferrara
biologist has shown that not only do
they vocalize have more than 200
distinct
vocalizations but that the mother
Turtles actually talk to their babies in
the eggs before they
hatch so you know what this goes to show
is that the world is just awash
in sound which is completely beyond us
and it really allows us to connect to
the fact that nature is Rich and
distinct but you know so what why is
this
important I think we all know that we
are living through a time of existential
biodiversity loss and the climate
crisis more than
69% of wildlife on the planet has been
lost since
1970 that's according to uh The Living
Planet report of
2022 60% of all mammals alive on the
planet today are
livestock many of them existing in
factory farms and another 36% are human
beings so that only leaves about
4% uh of wild animals and you know these
animals these other
species actually form the fabric of the
ecosystems that provide the life support
systems for us here on
Earth so there's a fundamental
disconnect here somehow we have lost our
connection to Nature somehow we've
forgotten that we're part of it and that
poses huge challenges for us as human
beings so how can technology
actually bring us closer to Nature this
seems really
counterintuitive um but it's really you
know kind of the founding question for
Earth species project and very much
inspired by the development of the large
language models that you've all been
hearing so much about uh over the course
of this conference and in particular
work in
2017 that allowed us to translate
between two human languages without the
use of a
dictionary and so we think of AI in some
ways like the invention of modern
Optics just like as the telescope
allowed us to look out at space can AI
actually open the aperture for us of our
imagination and reconnect us to Nature
in some
way so who is Earth species project we
are a small nonprofit organization uh
based out of the United States but
globally distributed
um and we have an extraordinary team
larger than what is shown on this slide
here now um primarily made up of AI
research scientists so we have a
dedicated team of people who are working
on this problem coming from very very
diverse backgrounds from Neuroscience
from math from physics and they are
working really really closely with a
whole Suite of Partners there's no way
that this work could be done unless it
was in collaboration deep collaboration
with research institutes and people
who've been out in the field studying
animals for decades and I I also wanted
to just point out too that you know I
joined Earth species project about a
year and a half ago and at that time
even everyone I said I'm you know I'm
going to do this work on you know
decoding animal communication they were
like oh my God that's so crazy nobody
talks about that but in the last year
we've seen several books published on
this topic we have seen huge momentum in
the Press this is beginning to capture
the popular imagination and really
following the release of ChatGPT and
people's recognition that AI has
transformative
properties but as I said you know we
couldn't do this without dedicated
research partners and so the people that
we work with are people who have studied
other species for decades people like Dr
Ari Friedlaender who is one of the
world's leading um Marine Mammal
Specialists working out of UC Santa Cruz
people like Dr Joyce Poole who literally has
Decades of data gathered from elephants
um in
Kenya and people like Dr Valeria Vergara
uh who is an expert in beluga
whales with these partners we have put
together a road map this was published
recently uh in Science magazine and
you'll see that you know it basically
lays out a number of stages on the road
to decoding uh animal communication not
surprisingly it starts with data as all
artificial intelligence does but
thinking very specifically not just
about
vocalizations but also about behavioral
data multimodal data how do we bring all
this together and also alongside
context the machine learning models then
are the the tools that allow us to
analyze that data and that's really
where Earth species project is focused
and helps us to get into the space of
decoding by classifying vocalizations
and movements by starting to make
associations across those things and
then by testing our hypotheses through a
series of
experiments this video uh is of Ari
Friedlaender who you saw in an earlier
slide placing a sensor a remote sensor
onto a whale and you can see that you
know it's capturing (you're not hearing
the vocalizations right now) vocal
data but also video and environmental
context as well so the exciting part is
that there is actually reams of data
being gathered today the challenge is um
that it's actually probably uh a little
bit too much um to be analyzing
easily and I you know I I think this
is something I wanted to bring home that
although we have tons of data and we
have the power of AI this is still a
really really hard problem to solve so I
just wanted to play you another sound
and see if you can identify this
one
anybody stumped
bats bats anyone else dolphin again
close that that's actually a beluga
whale sounds a little bit like an alien
modem uh and the the crazy part is that
that data was gathered by Valeria Vergara
who you saw earlier and she has actually
told us that you know despite the
ability to put a hydrophone in the water
and gather those vocalizations she can't
use
96% of the data that she gathers because
it's too noisy it's either too noisy or
all the animals are speaking all at the
same time and it's really impossible to
tell what's going on so you know it just
gives you a sense that there are these
really really basic challenges when
you're trying to gather data from
animals and make it usable for machine
learning so you know we are operating
essentially in the space of trying to
build AI for the rest of nature we're
trying to help researchers with some of
these really basic problems of how do
they denoise their data how do they do
Source separation how do you then in
10,000 hours of recordings of
orangutans as an example how do you
actually go through and detect where the
animals are vocalizing and then classify
those so we're building tools that will
allow uh basically automate tasks that
human beings either it's going to take
them too long or really it's not the
best use of their time to do they don't
even do it that well but we're also
going further then and you know using AI
to kind of open up our understanding of
how this problem might be solved in
different
ways so as an organization you know this
is a new field um uh and in a new field
you need to produce benchmarks uh in
order to know whether or not you're
making progress so really really
exciting that in the last year we have
published the first first ever Benchmark
of animal sounds this is a data set of
animal
vocalizations we've also published the
first ever Benchmark for animal movement
and we've gone further than that to
publish the first ever Foundation model
uh for animal vocalizations and so you
could kind of think of this as like the
GPT-1 of animal
communication this is a model that is
generalizable can be used for a whole
Suite of different species and allows
biologists to both do detection so
figuring out where vocalizations are and
then classification of the vocalizations
what type of
vocalizations and that model uh performs
incredibly well against the benchmarks
that have been developed and then we're
moving into as we as we go forward over
this year multimodal Foundation models
as well that can pair vocalizations with
movement
data so thinking about this in a
slightly different way we're basically
building the foundations or the
fundamentals uh in terms of the data and
the benchmarks and Foundation models
that you can then build a whole Suite of
applications off uh that will deliver
against a whole Suite of tasks and we're
building the flywheel that you know
essentially brings in reams of data that
allow us to build the tools that power
the work of our partners that then
generates more data so we can iterate uh
on the
models so we are as an organization
working to unlock understanding uh
better connect us to Nature and I just
wanted to play you a super quick video
to bring that to
life the oldest
cultures are not
human they're from the
ocean 40 million years ago before we
walked upright before we sparked fire
whales evolved to build
relationships in the
[Music]
dark I'm trying to start a conversation
is the most basic way you can say
it I'm going to put a speaker in the
ocean and talk to a whale and hope it
talks back starting
playback if this work is successful it
will be the first experiment where we
have engaged in a dialogue with a
humpback
[Music]
whale pretty
incredible so that's Dr Michelle Fournet uh
from her documentary Fathom and
essentially what she's doing is
recording a humpback whale saying hello
and playing it back to the whale to see
if they will
respond this is essentially a playback
experiment been used by scientists for
decades to test their hypotheses about
what an animal might be
saying but the the question then is can
we go further with AI can we say
something more complex to an animal and
see what the response is can we in fact
potentially generate novel
vocalizations and so that's the really
exciting part of where we're at right
now getting into the space of generative
AI um and so this is an example produced
by one of our scientists genu you uh the
first part of this vocalization is uh
the animal and the second part uh is
generated by an
AI indistinguishable and and this
basically puts us in New Territory and
so this year for the first time ever we
are going to be conducting a series of
experiments with zebra finches in a
captive laboratory setting they're a
model species very well studied and we
are actually going to uh have an A
in two-way conversation with a zebra
finch which you know the potential that
this opens up the opportunities in terms
of understanding the structure of uh an
animal's communication is huge but then
of course there's all kinds of risks as
well once we get into this territory you
know we could have an AI communicating
with another animal without us actually
understanding what that communication is
all about which you know gives rise to
the potential that we could be messing
um with other cultures so this has got
to be managed very very
carefully I wanted to bring us back to
what I raised at the beginning in terms
of uh the biodiversity crisis like so
what so we can communicate with animals
how is this going to help and I think
it's really important to point out that
there are all kinds of potential
conservation benefits um from this I
mean you can imagine that if you were
able to actually in a very rudimentary
way communicate with a wolf um uh or a
bear or an elephant you could actually
avoid some of the human wildlife
conflict uh that we're facing in many
parts of the world today imagine if we
were able to understand why dolphins or
whales are stranding themselves on
beaches in large numbers like how would
that help us to do something
differently but I think there's also
something really really powerful here
too that goes Beyond those kinds of
conservation benefits to help us think
about a different way of interacting
with a planet in the same way that when
we saw Earth for the first time from
space it changed our thinking uh about
ourselves I think we're at a place where
um this work has the potential to open
totally new scientific Frontiers it has
the potential to open new human
perspectives and so if you think about
again uh the telescope and and once we
had the ability to look out into
space we actually recognize that Earth
was not at the center so we think about
AI as the tool that can help us look at
the patterns and the complexity of
nature right here on Earth and if we're
able to do that maybe just maybe we will
be able to acknowledge
that humanity is not at the
center thank you so much everyone I I
made it in
time and I just wanted to say we are
raising funds and seeking partners in this
effort so if you're interested in
following our work if you're interested
in finding out more this is a dedicated
email address so please uh please reach
out thank
you
[Music]