Nick Emmons: Allora
Summary
TL;DR: In this video, Nick, co-founder of Allora, introduces their self-improving decentralized AI network. The platform aims to address the challenge of siloed machine intelligence by enabling different AI models to learn from each other in a modular, decentralized way. Allora creates a collective intelligence network, optimizing machine learning objectives and producing aggregated outputs that outperform individual models. With applications in finance, AI-powered data feeds, and DeFi, the network is designed to continuously improve through a combination of inference and forecasting workers, all while preserving privacy and scalability.
Takeaways
- 🧠 The AI stack can be broken down into four core layers: data, compute, intelligence, and execution.
- 🛠️ Decentralized AI networks, like blockchains, are built modularly for increased coordination and composability.
- 📊 Allora aims to solve the problem of siloed machine intelligence by creating a decentralized AI network that allows models to learn from one another.
- 🤖 The network uses an inference synthesis mechanism, which stitches together model outputs for a more intelligent and performant outcome.
- 💡 Allora's core focus is on turning intelligence into a digital commodity and enabling models to work together to achieve collective intelligence.
- 🔐 The network supports both open and closed-source models without revealing proprietary data, making it suitable for financial settings where privacy is crucial.
- 📈 Financial applications, especially AI-powered price feeds for longtail assets, are a key area where decentralized AI networks like Allora can provide value.
- 💸 Allora sees decentralized finance (DeFi) and AI agents as the next frontier for executing advanced trading and risk management strategies.
- 📉 AI-driven risk modeling can help address complex risks in decentralized finance, especially for restaking and liquidity provisioning.
- 🚀 Allora is currently in the testnet phase, focusing on onboarding machine intelligence to enhance the network's collective intelligence before the mainnet launch.
Q & A
What is the core concept behind Allora?
-Allora is a self-improving, decentralized AI network that aims to turn intelligence into a digital commodity by allowing different AI models to learn from one another and become more composable and compatible in a decentralized setting.
How does Allora address the issue of siloed machine intelligence?
-Allora addresses the issue of siloed machine intelligence by enabling models to learn from each other and collectively optimize different machine learning objectives, resulting in an aggregate model output that outperforms any individual model within the network.
What are the four core layers of the decentralized AI stack as described in the script?
-The four core layers of the decentralized AI stack are Data, Compute, Intelligence, and Execution.
What is the role of blockchains in the decentralized AI stack according to the script?
-Blockchains serve as powerful coordination mechanisms that can turn potentially nebulous resources into more tangible digital commodities, facilitating the modular construction of the decentralized AI stack.
How does Allora's network work at a high level?
-Allora's network operates through an inference synthesis mechanism that stitches together different model outputs to create a more intelligent network-level output. It involves inference workers responding to requests, forecasting workers predicting the loss of the inference workers, and reputers evaluating model performance against ground truth.
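To make this flow concrete, here is a minimal structural sketch in Python. All names, types, and the inverse-predicted-loss weighting rule are illustrative assumptions for this sketch, not Allora's actual API or weighting scheme.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class InferenceWorker:
    name: str
    model: Callable[[dict], float]  # features -> inference for the topic's objective

@dataclass
class ForecastingWorker:
    # (worker name, context features) -> that worker's expected loss
    predict_loss: Callable[[str, dict], float]

def synthesize(workers: List[InferenceWorker],
               forecaster: ForecastingWorker,
               features: dict) -> float:
    """Combine worker inferences, weighting each by its inverse predicted loss."""
    inferences = {w.name: w.model(features) for w in workers}
    inv_loss = {n: 1.0 / forecaster.predict_loss(n, features) for n in inferences}
    total = sum(inv_loss.values())
    return sum(inferences[n] * inv_loss[n] / total for n in inferences)

def repute(inferences: Dict[str, float], ground_truth: float) -> Dict[str, float]:
    """A reputer scores each worker against ground truth; these realized
    losses are the signal forecasting workers learn from over time."""
    return {name: (inf - ground_truth) ** 2 for name, inf in inferences.items()}
```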
What is the significance of the forecasting workers in Allora's network?
-Forecasting workers in Allora's network are crucial for introducing context awareness, allowing the network to outperform individual workers by learning the contexts in which different models perform well, thus enhancing the network's overall performance.
Why is the ability to support open and closed-source models important in Allora's network?
-Supporting open and closed-source models is important because it allows the network to serve a wide range of use cases, especially in financial settings where preserving the proprietary information of models is crucial.
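As a toy illustration of this privacy property (the class and weights below are invented for the example, not Allora code), a closed-source participant only ever exposes the inference itself:

```python
class ClosedSourceWorker:
    """Illustrative only: the proprietary weights live inside the worker;
    the network sees just the scalar inference it returns."""

    def __init__(self) -> None:
        self._weights = [0.4, -1.2, 0.7]  # proprietary, never leaves the worker

    def infer(self, features: list) -> float:
        # Only this output crosses the network boundary -- no weights,
        # no architecture, no training data.
        return sum(w * x for w, x in zip(self._weights, features))
```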
What are the four core features of Allora as mentioned in the script?
-The four core features of Allora are collective intelligence, iterative learning at both the network and model levels, contextual awareness, and privacy protection that supports closed-source model participation.
What are some initial focus areas for Allora's application?
-Some initial focus areas for Allora's application include AI-powered data feeds, particularly for longtail assets, AI agents for decentralized finance, and risk modeling in the context of restaking and other financial primitives.
What is the current phase of Allora's development as per the script?
-As per the script, Allora is currently in testnet phase one, with testnet phase two launching in about a week and a half, followed by the mainnet soon after.
What is the primary focus of Allora Labs at the moment?
-The primary focus of Allora Labs at the moment is onboarding workers and model creators to bring a critical mass of machine intelligence onto the network, as the network's intelligence increases with more models participating.
Outlines
🤖 Decentralized AI Network Overview
Nick, co-founder of Allora, introduces the concept of building a self-improving decentralized AI network. He outlines a four-layer AI stack consisting of data, compute, intelligence, and execution, emphasizing the modular nature of such systems. The focus is on transforming machine intelligence into a digital commodity and overcoming the challenges of siloed AI models by enabling them to interact and improve collectively. Allora aims to build a network where different models collaborate, creating aggregate intelligence that outperforms individual models.
💡 How Inference and Forecasting Workers Collaborate
Nick explains how Allora's decentralized AI network functions through the inference synthesis mechanism, where different models, termed inference workers, contribute to solving tasks. Simultaneously, forecasting workers predict how well the inference workers will perform, allowing the network to assign weights to each contribution. This collaborative process yields better performance than any individual model by integrating and optimizing outputs. The introduction of forecasting workers creates a system that learns and improves over time, adapting to different domains and contexts, as the simulation sketch below illustrates.
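The outperformance claim is easy to reproduce in a toy simulation. The sketch below uses invented numbers and an assumed inverse-expected-loss weighting rule (a simplification, not Allora's actual mechanism) to mirror the volatile-versus-calm example from the talk: each predictor excels in one market regime, and a context-aware forecaster that knows the regime combines them into an output more accurate than either model alone.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
volatile = rng.random(n) < 0.5        # the market regime is the "context"
truth = rng.normal(0.0, 1.0, n)       # target values to predict

# Predictor A is accurate in volatile regimes and noisy in calm ones; B is the reverse.
sigma_a = np.where(volatile, 0.1, 1.0)
sigma_b = np.where(volatile, 1.0, 0.1)
pred_a = truth + rng.normal(0.0, 1.0, n) * sigma_a
pred_b = truth + rng.normal(0.0, 1.0, n) * sigma_b

# A context-aware forecaster predicts each worker's loss from the regime;
# the network then weights workers by inverse expected loss.
w_a = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_b**2)
combined = w_a * pred_a + (1 - w_a) * pred_b

mse = lambda p: float(np.mean((p - truth) ** 2))
print(f"A alone:   {mse(pred_a):.4f}")    # ~0.505 (averaged over both regimes)
print(f"B alone:   {mse(pred_b):.4f}")    # ~0.505
print(f"combined:  {mse(combined):.4f}")  # ~0.0099, better than either model
```

Because the weights track whichever predictor is strong in the current regime, the combined error stays near the better model's error everywhere, which is the extra dimension of performance that context awareness adds.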
🧠 Collective Intelligence and Iterative Learning
Allora's decentralized AI network promotes collective intelligence by combining heterogeneous models that iteratively learn from each other. The forecasting workers' context-aware predictions enable better performance at the network level, while individual models also improve by learning how their outputs are weighted relative to others. The privacy of models is maintained, allowing closed-source models to participate. The discussion highlights the network's core features: collective intelligence, iterative learning, contextual awareness, and privacy protection.
💹 Financial Applications of Allora's AI Network
Allora's decentralized AI network has significant potential in financial applications, especially at the intersection of crypto and AI. Nick outlines three main focus areas: AI-powered price feeds for long-tail assets, AI agents for decentralized finance (DeFi), and risk modeling. These applications aim to improve market operations by providing real-time data feeds, enabling more complex financial strategies, and managing the complexities of risk in restaked capital. Allora's network is positioned to enhance financial operations and innovation through its advanced AI capabilities.
🚀 Allora's Roadmap and Expansion
Nick concludes by discussing Allora's current progress and future plans. The network is in the testnet phase, with the next phase and mainnet launch coming soon. Allora is actively onboarding machine intelligence to expand the network, emphasizing that the network's intelligence grows with the number of models participating. The focus remains on reaching a critical mass of models and collaborating with top-tier applications within the crypto space to bring collective intelligence to various sectors.
Keywords
💡Decentralized AI Network
💡Modular Construction
💡Siloed Machine Intelligence
💡Composable Models
💡Inference Synthesis Mechanism
💡Context Awareness
💡Aggregators
💡Objective-Centric Paradigm
💡Longtail Assets
💡DeFi Agents
Highlights
Introduction to Allora, a self-improving decentralized AI network
AI stack simplified into four core layers: Data, Compute, Intelligence, and Execution
Blockchains as powerful coordination mechanisms for decentralized networks
Modularity in the blockchain stack leading to rapid innovation in DeFi
Allora's focus on the Intelligence layer of the AI stack
Addressing the problem of siloed machine intelligence with decentralized networks
Allora's network allows different AI models to learn from each other, enhancing collective intelligence
The output of Allora is an aggregate model output that outperforms individual models
Allora as an abstraction layer for intelligence, similar to how Bitcoin is for power
Background on the company's evolution from Upshot to Allora Labs
Research in mechanism design and peer prediction leveraging information theory
Shift from human inputs to AI in crypto infrastructure due to inefficiency
Allora's inference synthesis mechanism for combining different model outputs
The role of inference workers and forecasting workers in the Allora network
How Allora maintains privacy by only sharing inferences, not model weights
Simulation results showing Allora's network outperforming individual models
The importance of context awareness in Allora's forecasting workers
Transition from model-centric to objective-centric paradigm in AI interaction
Core properties of Allora: collective intelligence, iterative learning, contextual awareness, and privacy protection
Initial focus areas for Allora in financial applications and crypto
AI-powered data feeds for longtail assets in decentralized finance
AI agents for executing complex strategies in DeFi
Risk modeling in decentralized finance with Allora's AI network
Upcoming testnet phases and mainnet launch timeline
Transcripts
[Music]
hey guys I'm Nick co-founder of Allora and
we're building a self-improving
decentralized AI Network um before we
get into things I just want to level set
align on sort of an end-to-end view of
the AI stack as how we see it and
we kind of see it as these four core
layers this is somewhat of a
simplification but fundamentally the the
decentralized AI stack the AI stack in
general can be broken up into Data
compute intelligence and execution and
we think that like with other
decentralized networks this end to-end
system will be realized via a kind of
modular construction I think blockchains
are are incredibly powerful coordination
mechanisms for turning potentially
nebulous resources into more tangible
digital Commodities and when you're
coordinating disparate networks of
individuals around different actions
it's already a complex undertaking in
itself and when you try to build
monolithically that complexity compounds
and I think that's why we've seen the
blockchain stack modularize I think
that's why we've seen such a rapid rate
of innovation and progress in things
like defi it's because of this
modularity and the the ability to build
uh very tightly scoped highly composable
building blocks that get stitched
together and so when we think of Allora
we think of the kind of intelligence
layer of the stack we're building and
kind of turning intelligence itself into
a digital
commodity um and the problem we're
trying to solve is is this kind of
fundamental problem in AI of siloed
machine intelligence there's many
different models in the world and
they're not very kind of composable or
compatible with one another I can't
merge Model A and model B to create a a
mega model that represents the
intelligence of both and this is a real
problem because it means that these
large monoliths with access to the most
kind of raw resources of data and
compute will continually be making the
better models and so what we're trying
to solve with Allora is through
decentralized networks a way for models
to kind of learn off of one another and
become more composable or compatible in
a decentralized setting and so that's
what Allora is Allora is a a network where
many different models can learn off of
one another in collectively optimizing
different ml objectives and the output
of the network is a kind of aggregate
inference or an aggregate model output
that consistently outperforms any of the
individual models within that Network
and in doing so you've created a kind of
commoditized version of intelligence itself and
significantly accelerated the rate of
innovation of machine intelligence
itself um and another way to think of
Allora is kind of this abstraction layer
for intelligence the same way Bitcoin is
kind of an abstraction layer for power
or other networks are abstraction layers
for trust or compute or things like this
Allora is this abstraction layer for
intelligence allowing people to access
intelligence and the kind of best
aggregate form of intelligence through
the network itself as opposed to
interacting with individual models and
needing to go through the processes of
of deciding between different models and
different domains or
contexts um and just some quick
background on who we are so when we
first started the company it was
originally called Upshot we're now
called Allora Labs we're building the
Allora Network um we were doing a bunch
of research in this really new field of
mechanism design called peer prediction
that leverages information theory in a
pretty interesting way to incentivize
people to be honest in the face of
subjectivity so enabling people to reach
consensus on the answers to subjective
questions that's important for a bunch
of different problem spaces the Oracle
problem more generally and it's also
useful for Designing decentralized
networks around especially nebulous
resources like intelligence and then as
we were building this out we realized
that humans as inputs to these systems
were too inefficient they're they're too
inaccurate uh too incompetent I guess to
support many real world tasks and that's
when we shifted to building some of the
earlier AI x crypto infrastructure
especially in longtail Market setting so
uh AI powered price feeds for longtail
assets AI derivative infrastructure Etc
and what we're doing with Allora now is
taking that early work we did around
subjective consensus mechanism design
and all of the learnings and
infrastructure we've built in the AI
crypto space over the past three years
to build this kind of collective
intelligence Network or self-improving
decentralized AI
Network and this is kind of how the
network works uh at a high level um the
the kind of core mechanism of the
network is what we call the inference
synthesis mechanism and it's a way of
stitching together different model
outputs in a productive way to create a
more performant or more intelligent
Network level output and so the the
general flow is consumers ask a a given
topic a topic is kind of like a sub
Network in the system it's a very
tightly scoped uh ml objective that
model creators uh kind of coordinate
around and are attempting to
collectively optimize a consumer would
would ping one of these topics asking
for some AI output and two flows are
happening one these kind of inference
workers are responding to that request
with inference that is being assessed or
weighted based on uh how well it
optimizes the core objective function in
that topic and simultaneously
forecasting workers are predicting the
loss that's going to be realized on
those inference workers uh as assessed
Again by that objective function and
they're they're over time learning the
context the different domains Etc in
which different inference workers within
a given topic perform well or perform
less well um to produce the the most
accurate kind of predicted loss and so
as these things get stitched together
you uh establish weights across the
different inferences developed by
different workers based on uh kind of
approximation of a Shapley value across
different models and produce an output
that over time increasingly outperforms
the individual models performance within
that topic and then there's a a kind of
second or third type of actor called
reputer who are tasked with actually
evaluating the performance of different
models over time so they they take some
ground truth they take some kind of
basis or validation set and assess the
performance of the different uh uh
workers as as assessed by or as
calculated by that objective
function um and and I I guess one one
thing to note here I I haven't looked at
this presentation in a while um is
because workers are just sharing
inference through the network the
network can support open and closed
source models they're not revealing the
weights of their models they're not
revealing any kind of proprietary
information of those models and we think
that's important for supporting a wide
number of use cases and we'll get into
it later but especially kind of use
cases in financial settings where
maintaining that proprietary alpha or
the proprietary information of your
models is especially
important um and the key to this
mechanism working is really around
context awareness we've done a bunch of
work around this but it's that
introduction of the forecasting worker
that allows you to realize a lot of the
outperformance at the network level
relative to the individual workers in
the network and so we can see some
simulation results here uh with five
predictors they're they're inference
workers but they're working in a Time
series prediction task um the best uh
predictor is predictor one and they've
achieved a log loss of uh about negative
3.3 uh so that's pretty good they've
they've uh if we didn't have forecasting
workers we would weight them the most
heavily as this kind of base set of
predictors and then produce an inference
that achieves close to that performance
but it's over time that these
forecasting workers sort of learn the
different contexts in which different
predictors perform well in this time
series prediction case maybe some models
are especially competent in say volatile
market conditions other uh models are
are more competent less volatile market
conditions and these forecasters would
learn that over time and uh adjust their
their predicted losses to uh uh kind of
represent that and what we see in this
this bottom figure is that with the
introduction of aggregators or
forecasters as as they're called the the
best forecaster achieves a a loss of
around -4.4 and so that's around 33%
better than even the best model in the
network because they're learning the
context in which different workers
outperform other workers in the network
and additionally we can see that even
the worst aggregator the one who is
taking kind of the uh who is weighting uh
contextual awareness uh least
sensitively is still outperforming even
the best inference worker within that
Network and so it's this this kind of
element of context awareness this
additional dimension of performance
where we really see outperformance at
the network level in this kind of
setting and then another nice property
of this is when we interact with with AI
today it's a very kind of model Centric
Paradigm I'm saying I'm going to use GPT-4
I'm going to use Claude 3 for some task and
I'm asserting that that model is going
to be the best in every domain and every
context that I'm using it in and that's
not the case I don't think there's ever
going to be a single model that is
dominant across all domains
and when you when you align things
around given objectives or given topics
you start to move to this objective
Centric Paradigm of interacting with AI
in that I care about this ml objective
this evaluation criteria that matters
for my application or for my use and
then this underlying set of Ever
Changing models is being productively
coordinated and stitched together to
produce the best possible output and
you can think of this as kind of
analogous to the move from transactions
to intents of moving from specifying a
specific set of uh State transitions to
achieve some objective to just
specifying the intent that you care
about at the end of the day and as we
move towards an objective Centric
Paradigm of interacting with AI a
similar thing will unfold in that instead
of interacting with specific models
we'll interact with specific ml
objectives that we ultimately care about
and ultimately satisfy our needs either
in use and application development
whatever it
is um and so just to recap these are
sort of like the four core Properties or
features of Allora one collective
intelligence I think that's pretty
simple we're bringing together many
different heterogeneous models to
produce a more intelligent kind of
network level intelligence and because
of forecasters the model is or the
network is kind of learning over time
and so you achieve this property of kind
of iterative learning at the network
level Additionally you achieve iterative
learning at the model level as well
because each of the individual models in
the network are learning the the weights
that are being assigned across other
models in the network and using that to
to improve or inform their future
outputs uh third contextual awareness as
we talked about is kind of the key to
stitching together heterogeneous
instances of machine intelligence in a
really productive way in a way that
outperforms any of the individual models
within that network uh and then lastly
as we talked about because the workers
are only sharing inference within the
network there's a level of privacy
protection that supports this kind of
closed Source uh model participation
within the network
uh and then I just quickly want to go
over some initial Focus areas so I think
as we think about where this network is
most applicable at the application Level
we see a lot of financial applications
as the initial kind of uh like verticals
to tackle I think as we're talking about
crypto and AI crypto is largely a
financial innovation in my opinion it's
it's found most if not all of its
product Market fit across different
Financial settings and AI in financial
settings is also much more mature than
then some of the newer types of
generative AI as well and so I think a
lot of the immediate value within this
intersection of crypto and AI will
happen in various Financial uh settings
and verticals so we'll quickly go over
some of
these one AI powered data feed so this
is this is pretty broad but I'll Focus
specifically on kind of AI powered price
feeds for longtail assets this is what
we've been doing historically at upshot
for the past few years um but when we
start to move into the long tail of
assets where crypto like in defi
really shines in codifying certain
Market interactions to enable the
support of more longtail asset
categories we still lack a a kind of
robust and accurate source of price
Discovery for these assets uh they don't
change hands very often and so these
transfers of ownership that act as price
Discovery events for other asset classes
often don't happen enough uh for that to
be the sole source of price Discovery
for many different assets and AI can be
used to take into account many other
pieces of data outside of the market
itself as well as upsampling Market data
in productive ways to produce near
real-time price feeds for a much broader
set of asset categories and sort of
equipped with a a price feed you can
start to build a bunch of really
interesting Primitives in in
decentralized finance lending protocols
Perpetual systems synthetic
representations of assets your your kind
of scope of compatible assets in
financial settings expand significantly
with with AI as a a kind of supplement
to that price Discovery
mechanism uh second this is again pretty
broad uh but AI agents for defi I think
today defi has shown us uh the power of
decentralized composability the power of
kind of codified financial operations
but it's still very primitive and and uh
I guess unexpressive and with AI we can
start to leverage this highly
deflationary form of compute that can
take into account many different pieces
of information in an incredibly
efficient and effective way to start
executing more complex or Advanced uh
trading strategies liquidity
provisioning strategies manage uh
different defi parameters and lending
systems or Perpetual systems in a far
more kind of advanced or efficient way
and so I think AI intersecting
with defi via the form factor of Agents
built on top of this kind of collective
intelligence network will represent the
sort of next explosion of innovation uh
and sort of uh experimentation within
decentralized Finance uh and then lastly
this this is risk modeling uh especially
as restaking becomes a larger part
of the space represents a larger
percentage of the space's capital the
complexity of risk when you're restaking a
pool of capital across many
different heterogeneous networks becomes
incredibly complex I think we saw that
some of this play out uh earlier today
or last night um and AI in in this kind
of decentralized form factor can can
offer us a really robust tool for making
sense of all of these really complex
risks and develop kind of adequate
hedging vehicles or or other kind of uh
Financial Primitives to help protect
against that risk and as such protect
against this this mass of new networks
that are are emerging on top of uh a
kind of source of restaked
capital um and just quickly launch
timeline so we're in testnet phase one
right now we're about to launch testnet
phase two in the next week and a half
mainnet shortly after and in the meantime
we're just uh onboarding workers model
creators that's really our top Focus
right now is bringing together a kind of
critical mass of of machine intelligence
onto the network network gets smarter
the more models there are and so that's
that's really our top priority and we're
also working with a bunch of kind of top
tier applications and different networks
in the space and bringing this source of
collective intelligence to different
kind of parts of the crypto space so I
think that's it
[Music]