Nick Emmons: Allora

Archetype
10 May 2024, 15:45

Summary

TL;DR: In this video, Nick, co-founder of Allora, introduces their self-improving decentralized AI network. The platform aims to address the challenge of siloed machine intelligence by enabling different AI models to learn from each other in a modular, decentralized way. Allora creates a collective intelligence network, optimizing machine learning objectives and producing aggregated outputs that outperform individual models. With applications in finance, AI-powered data feeds, and DeFi, the network is designed to continuously improve through a combination of inference and forecasting workers, all while preserving privacy and scalability.

Takeaways

  • 🧠 The AI stack can be broken down into four core layers: data, compute, intelligence, and execution.
  • 🛠️ Decentralized AI networks, like blockchain, are built modularly for increased coordination and composability.
  • 📊 Allora aims to solve the problem of siloed machine intelligence by creating a decentralized AI network that allows models to learn from one another.
  • 🤖 The network uses an inference synthesis mechanism, which stitches together model outputs for a more intelligent and performant outcome.
  • 💡 Allora's core focus is on turning intelligence into a digital commodity and enabling models to work together to achieve collective intelligence.
  • 🔐 The network supports both open and closed-source models without revealing proprietary data, making it suitable for financial settings where privacy is crucial.
  • 📈 Financial applications, especially AI-powered price feeds for longtail assets, are a key area where decentralized AI networks like Allora can provide value.
  • 💸 Allora sees decentralized finance (DeFi) and AI agents as the next frontier for executing advanced trading and risk management strategies.
  • 📉 AI-driven risk modeling can help address complex risks in decentralized finance, especially for staking and liquidity provisioning.
  • 🚀 Allora is currently in the testnet phase, focusing on onboarding machine intelligence to enhance the network's collective intelligence before the mainnet launch.

Q & A

  • What is the core concept behind Allora?

    -Allora is a self-improving, decentralized AI network that aims to turn intelligence into a digital commodity by allowing different AI models to learn from one another and become more composable and compatible in a decentralized setting.

  • How does Allora address the issue of siloed machine intelligence?

    -Allora addresses the issue of siloed machine intelligence by enabling models to learn from each other and collectively optimize different machine learning objectives, resulting in an aggregate model output that outperforms any individual model within the network.

  • What are the four core layers of the decentralized AI stack as described in the script?

    -The four core layers of the decentralized AI stack are Data, Compute, Intelligence, and Execution.

  • What is the role of blockchains in the decentralized AI stack according to the script?

    -Blockchains serve as powerful coordination mechanisms that can turn potentially nebulous resources into more tangible digital commodities, facilitating the modular construction of the decentralized AI stack.

  • How does Allora's network work at a high level?

    -Allora's network operates through an inference synthesis mechanism that stitches together different model outputs to create a more intelligent network-level output. It involves inference workers responding to requests, forecasting workers predicting the loss of inference workers, and reputers evaluating model performance.

  • What is the significance of the forecasting workers in Allora's network?

    -Forecasting workers in Allora's network are crucial for introducing context awareness, allowing the network to outperform individual workers by learning the contexts in which different models perform well, thus enhancing the network's overall performance.

  • Why is the ability to support open and closed-source models important in Allora's network?

    -Supporting open and closed-source models is important because it allows the network to serve a wide range of use cases, especially in financial settings where preserving the proprietary information of models is crucial.

  • What are the four core features of Allora as mentioned in the script?

    -The four core features of Allora are collective intelligence, iterative learning at both the network and model levels, contextual awareness, and privacy protection that supports closed-source model participation.

  • What are some initial focus areas for Allora's application?

    -Some initial focus areas for Allora's application include AI-powered data feeds, particularly for longtail assets, AI agents for decentralized finance, and risk modeling in the context of restaking and other financial primitives.

  • What is the current phase of Allora's development as per the script?

    -As per the script, Allora is currently in testnet phase 1, with testnet phase 2 launching in about a week and a half, followed by the mainnet soon after.

  • What is the primary focus of Allora Labs at the moment?

    -The primary focus of Allora Labs at the moment is onboarding workers and model creators to bring a critical mass of machine intelligence onto the network, as the network's intelligence increases with more models participating.

Outlines

00:00

🤖 Decentralized AI Network Overview

Nick, co-founder of Allora, introduces the concept of building a decentralized AI network that self-improves. He outlines a four-layer AI stack consisting of data, compute, intelligence, and execution, emphasizing the modular nature of such systems. The focus is on transforming machine intelligence into digital commodities and overcoming the challenges of siloed AI models by enabling them to interact and improve collectively. Allora aims to build a network where different models collaborate, creating aggregate intelligence that outperforms individual models.

05:01

💡 How Inference and Forecasting Workers Collaborate

Nick explains how Allora’s decentralized AI network functions through the inference synthesis mechanism, where different models, termed inference workers, contribute to solving tasks. Simultaneously, forecasting workers predict how well the inference workers will perform, allowing the network to assign weights to each contribution. This collaborative process leads to better performance compared to individual models by integrating and optimizing outputs. The introduction of forecasting workers creates a system that learns and improves over time, adapting to different domains and contexts.

10:02

🧠 Collective Intelligence and Iterative Learning

Allora’s decentralized AI network promotes collective intelligence by combining heterogeneous models that iteratively learn from each other. The forecasting workers’ context-aware predictions enable better performance at the network level, while individual models also improve by learning how their outputs are weighted relative to others. The privacy of models is maintained, allowing closed-source models to participate. The discussion highlights the network’s core features: collective intelligence, iterative learning, contextual awareness, and privacy protection.

15:04

💹 Financial Applications of Allora’s AI Network

Allora’s decentralized AI network has significant potential in financial applications, especially within the intersection of crypto and AI. Nick outlines three main focus areas: AI-powered price feeds for long-tail assets, AI agents for decentralized finance (DeFi), and risk modeling. These applications aim to improve market operations by providing real-time data feeds, enabling more complex financial strategies, and managing the complexities of risk in capital staking. Allora’s network is positioned to enhance financial operations and innovation through its advanced AI capabilities.

🚀 Allora’s Roadmap and Expansion

Nick concludes by discussing Allora’s current progress and future plans. The network is in the testnet phase, with the next phase and mainnet launch coming soon. Allora is actively onboarding machine intelligence to expand the network, emphasizing that the network's intelligence grows with the number of models participating. The focus remains on reaching a critical mass of models and collaborating with top-tier applications within the crypto space to bring collective intelligence to various sectors.

Keywords

💡Decentralized AI Network

A decentralized AI network refers to a distributed system where AI capabilities are not centralized in one location or under one entity's control, but rather spread across multiple nodes or participants. In the context of the video, the co-founder of Allora discusses building such a network that can self-improve and is designed to be modular, allowing for greater flexibility and innovation. The network aims to address the issue of siloed machine intelligence by enabling different AI models to learn from each other and become more compatible.

💡Modular Construction

Modular construction in the video's context refers to the design approach where complex systems are built from smaller, interchangeable parts or modules. This approach is likened to how blockchain technology has evolved, with different components or layers of the technology stack being developed and integrated to create a cohesive system. The video suggests that this method fosters rapid innovation and allows for the creation of systems that are more adaptable and efficient.

💡Siloed Machine Intelligence

Siloed machine intelligence is a term used in the video to describe the current state of AI where different AI models operate independently and are not designed to be combined or work together. This leads to a lack of interoperability and hinders the development of more comprehensive AI systems. The video discusses the need for a decentralized network that can break down these silos and enable models to learn from each other.

💡Composable Models

Composable models are AI models that can be easily combined or integrated with others to create more complex or powerful systems. The video emphasizes the need for such models in the AI industry, as the current state of AI is characterized by models that are not easily merged or compatible with each other. The Allora network aims to enable models to become more composable, allowing for the creation of aggregate intelligence that surpasses the capabilities of individual models.

💡Inference Synthesis Mechanism

The inference synthesis mechanism is a core component of the Allora network described in the video. It is a process that combines the outputs of different AI models to produce a more accurate or intelligent result than any single model could achieve on its own. This mechanism is crucial for the network's ability to create a collective intelligence that outperforms individual models and is an example of how the network stitches together different model outputs.
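The idea can be sketched in a few lines of Python. This is an illustrative toy, not Allora's published weighting rule: the softmax over negative predicted losses, the worker names, and the scalar inferences are all assumptions made for the example.

```python
import math

def synthesize(inferences, predicted_losses):
    """Combine worker inferences into one network-level inference.

    inferences: {worker_id: value} from inference workers.
    predicted_losses: {worker_id: forecasted loss} from forecasting workers,
    lower being better. Weights here come from a softmax over negative
    losses -- an illustrative choice, not Allora's exact rule.
    """
    # Turn predicted losses into normalized weights: lower loss -> higher weight.
    scores = {w: math.exp(-loss) for w, loss in predicted_losses.items()}
    total = sum(scores.values())
    weights = {w: s / total for w, s in scores.items()}
    # The network output is the weighted combination of worker inferences.
    return sum(weights[w] * inferences[w] for w in inferences)

# Two workers answer the same query; the forecaster expects worker "a" to
# realize a lower loss, so its inference dominates the aggregate.
out = synthesize({"a": 100.0, "b": 110.0}, {"a": 0.5, "b": 2.0})
```

The aggregate lands close to worker "a"'s inference but is still pulled slightly toward "b", reflecting the forecasted confidence in each.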

💡Context Awareness

Context awareness in the video refers to the ability of the AI network to understand and adapt to different situations or environments in which its models are used. This is achieved through the use of forecasting workers that learn the contexts in which different models perform well, allowing the network to assign weights to different models based on their performance in specific contexts. This feature is key to the network's ability to outperform individual models and is integral to its collective intelligence.

💡Aggregators

Aggregators in the video are components of the AI network that combine the outputs of different models to produce a final output. They are particularly important in the context of time series prediction tasks, where they can learn the contexts in which different models perform best and adjust their predictions accordingly. The video highlights how even the best aggregator can achieve better performance than the best individual model, demonstrating the power of collective intelligence.
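A minimal simulation can illustrate the claim that even a simple context-aware aggregator beats the best individual model. The two-regime setup, noise levels, and selection rule below are invented for illustration; this is not the simulation shown in the video.

```python
import random

random.seed(0)

def simulate(steps=1000):
    """Toy time-series task with two regimes. Model A is accurate in the
    'calm' regime, model B in the 'volatile' regime. An aggregator that
    has learned which model to trust in each regime outperforms both."""
    err_a = err_b = err_agg = 0.0
    for _ in range(steps):
        regime = random.choice(["calm", "volatile"])
        truth = random.gauss(0, 1)
        # Each model has small noise in its strong regime, large noise otherwise.
        a = truth + random.gauss(0, 0.1 if regime == "calm" else 1.0)
        b = truth + random.gauss(0, 0.1 if regime == "volatile" else 1.0)
        agg = a if regime == "calm" else b  # context-aware selection
        err_a += (a - truth) ** 2
        err_b += (b - truth) ** 2
        err_agg += (agg - truth) ** 2
    return err_a / steps, err_b / steps, err_agg / steps

mse_a, mse_b, mse_agg = simulate()
```

Each individual model pays for its weak regime roughly half the time, while the aggregator always uses whichever model is in its strong regime, so its error is far lower than either model's.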

💡Objective-Centric Paradigm

The objective-centric paradigm mentioned in the video is a shift from using specific AI models for tasks to focusing on the objectives or goals that the AI is meant to achieve. This approach allows for a more flexible and dynamic use of AI, where the best models for a given objective are dynamically selected and combined to produce the best results. The video suggests that this paradigm will lead to more efficient and effective AI applications.
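A toy registry can show the difference between the two paradigms. The topic name, models, and averaging rule are hypothetical; in Allora the combination step would be the inference synthesis mechanism rather than a plain average.

```python
def register(topics, objective, model_fn):
    """Attach a model to the objective ("topic") it helps optimize."""
    topics.setdefault(objective, []).append(model_fn)

def query(topics, objective, x):
    """Objective-centric call: the caller never names a model, only the
    objective; the outputs of all registered models are combined (a plain
    average here stands in for inference synthesis)."""
    outputs = [model(x) for model in topics[objective]]
    return sum(outputs) / len(outputs)

topics = {}
# Two hypothetical models coordinating around one tightly scoped objective.
register(topics, "eth-price-1h", lambda x: x * 1.02)
register(topics, "eth-price-1h", lambda x: x * 0.98)
estimate = query(topics, "eth-price-1h", 3000.0)
```

New models can join or leave a topic without any change on the caller's side, which is the practical payoff of binding consumers to objectives instead of to specific models.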

💡Longtail Assets

Longtail assets in the video refer to a wide range of assets that are less commonly traded or have less liquidity. The video discusses how AI can be used to create more accurate price feeds for these assets, which is particularly important in the context of decentralized finance. By using AI to analyze various data sources and upsample market data, the Allora network can provide near real-time price feeds for a broader set of asset categories.
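One way to picture such a feed: blend the last observed trade with a model's estimate, shifting weight to the model as the trade goes stale. The half-life decay rule and the numbers below are assumptions made for illustration, not Allora's actual methodology.

```python
def price_feed(last_trade_price, hours_since_trade, model_estimate, half_life=24.0):
    """Blend a possibly stale last-trade price with an AI model's estimate.

    As the last trade ages, confidence in it decays exponentially
    (half_life in hours) and weight shifts toward the model estimate
    derived from other data sources. A hypothetical blending rule.
    """
    trust = 0.5 ** (hours_since_trade / half_life)  # 1.0 when fresh -> 0 when stale
    return trust * last_trade_price + (1 - trust) * model_estimate

fresh = price_feed(100.0, 0.0, 80.0)   # asset just traded: feed follows the trade
stale = price_feed(100.0, 96.0, 80.0)  # trade is four days old: feed leans on the model
```

For assets that rarely change hands, the model term does most of the work, which is exactly the longtail setting the video describes.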

💡DeFi Agents

DeFi agents in the video are AI-driven entities that can execute complex financial operations within decentralized finance platforms. The video suggests that these agents can leverage the collective intelligence of the Allora network to perform advanced trading strategies, manage liquidity, and adjust parameters in lending and perpetual systems. This represents a potential new wave of innovation in decentralized finance, where AI can automate and optimize various financial processes.

Highlights

Introduction to Allora, a self-improving decentralized AI network

AI stack simplified into four core layers: Data, Compute, Intelligence, and Execution

Blockchains as powerful coordination mechanisms for decentralized networks

Modularity in blockchain stack leading to rapid innovation in DeFi

Allora's focus on the Intelligence layer of the AI stack

Addressing the problem of siloed machine intelligence with decentralized networks

Allora's network allows different AI models to learn from each other, enhancing collective intelligence

The output of Allora is an aggregate model output that outperforms individual models

Allora as an abstraction layer for intelligence, similar to how Bitcoin is for power

Background on the company's evolution from Upshot to Allora Labs

Research in mechanism design and peer prediction leveraging information theory

Shift from human inputs to AI in crypto infrastructure due to inefficiency

Allora's inference synthesis mechanism for combining different model outputs

The role of inference workers and forecasting workers in the Allora network

How Allora maintains privacy by only sharing inferences, not model weights

Simulation results showing Allora's network outperforming individual models

The importance of context awareness in Allora's forecasting workers

Transition from model-centric to objective-centric paradigm in AI interaction

Core properties of Allora: Collective intelligence, iterative learning, contextual awareness, and privacy protection

Initial focus areas for Allora in financial applications and crypto

AI-powered data feeds for longtail assets in decentralized finance

AI agents for executing complex strategies in DeFi

Risk modeling in decentralized finance with Allora's AI network

Upcoming testnet phases and mainnet launch timeline

Transcripts

play00:02

[Music]

play00:07

hey guys I'm Nick co-founder of allora and

play00:09

we're building a self-improving

play00:11

decentralized AI Network um before we

play00:14

get into things I just want to level set

play00:16

a line on sort of uh endtoend view of

play00:19

the AI stack as how as how we see it and

play00:22

we kind of see it as these four core

play00:24

layers this is somewhat of a

play00:25

simplification but fundamentally the the

play00:27

decentralized AI stack the AI stack in

play00:29

general can be broken up into Data

play00:31

compute intelligence and execution and

play00:34

we think that like with other

play00:36

decentralized networks this end to-end

play00:38

system will be realized via a kind of

play00:40

modular construction I think blockchains

play00:43

are are incredibly powerful coordination

play00:46

mechanisms for turning potentially

play00:48

nebulous resources into more tangible

play00:51

digital Commodities and when you're

play00:53

coordinating disparate networks of

play00:55

individuals around different actions

play00:57

it's already a complex undertaking on

play00:59

itself and when you try to build

play01:01

monolithically that complexity compounds

play01:04

and I think that's why we've seen the

play01:05

blockchain stack modularize I think

play01:07

that's why we've seen such a rapid rate

play01:09

of innovation and progress in things

play01:11

like defi it's because of this

play01:13

modularity and the the ability to build

play01:15

uh very tightly scoped highly composable

play01:18

building blocks that get stitched

play01:19

together and so when we think of allora

play01:21

we think of the kind of intelligence

play01:23

layer of the stack we're building and

play01:25

kind of turning intelligence itself into

play01:27

a digital

play01:28

commodity um and the problem we're

play01:30

trying to solve is is this kind of

play01:31

fundamental problem in AI of siloed

play01:33

machine intelligence there's many

play01:35

different models in the world and

play01:36

they're not very kind of composable or

play01:38

compatible with one another I can't

play01:39

merge Model A and model B to create a a

play01:43

mega model that represents the

play01:44

intelligence of both and this is a real

play01:46

problem because it means that these

play01:48

large monoliths with access to the most

play01:50

kind of raw resources of data and

play01:52

compute will continually making the the

play01:54

better models and so what we're trying

play01:56

to solve with allora is through

play01:59

decentralized networks a way for models

play02:01

to kind of learn off of one another and

play02:03

become more composable or compatible in

play02:05

a decentralized setting and so that's

play02:08

what allora is allora is a a network where

play02:11

many different models can learn off of

play02:12

one another in collectively optimizing

play02:15

different ml objectives and the output

play02:17

of the network is a kind of aggregate

play02:19

inference or an aggregate model output

play02:21

that consistently outperforms any of the

play02:23

individual models within that Network

play02:25

and in doing so you've created a kind of

play02:28

commoditized version of intelligence itself and

play02:31

significantly accelerated the rate of

play02:33

innovation of machine intelligence

play02:36

itself um and another way to think of

play02:38

allora is kind of this abstraction layer

play02:39

for intelligence the same way Bitcoin is

play02:42

kind of an abstraction layer for power

play02:44

or other networks are abstraction layers

play02:45

for trust or compute or things like this

play02:48

allora is this abstraction layer for

play02:50

intelligence allowing people to access

play02:53

intelligence and the kind of best

play02:55

aggregate form of intelligence through

play02:56

the network itself as opposed to

play02:58

interacting with individual models and

play03:00

needing to go through the processes of

play03:02

of deciding between different models and

play03:03

different domains or

play03:05

contexts um and just some quick

play03:07

background on who we are so when we

play03:09

first started the company it was

play03:10

originally called upshot we're now

play03:11

called allora labs we're building the

play03:13

allora network um we were doing a bunch

play03:15

of research in this really new field of

play03:16

mechanism design called peer prediction

play03:18

that leverages information theory in a

play03:20

pretty interesting way to incentivize

play03:22

people to be honest in the face of

play03:24

subjectivity so enabling people to reach

play03:26

consensus on the answers to subjective

play03:27

questions that's important for a bunch

play03:29

of different problem spaces the Oracle

play03:31

problem more generally and it's also

play03:32

useful for Designing decentralized

play03:34

networks around especially nebulous

play03:36

resources like intelligence and then as

play03:38

we were building this out we realized

play03:40

that humans as inputs to these systems

play03:42

were too inefficient they're they're too

play03:44

inaccurate uh too incompetent I guess to

play03:47

support many real world tasks and that's

play03:50

when we shifted to building some of the

play03:52

earlier AI x crypto infrastructure

play03:54

especially in longtail Market setting so

play03:56

uh AI powered price feeds for longtail

play03:58

assets AI derivative infrastructure Etc

play04:01

and what we're doing with allora now is

play04:03

taking that early work we did around

play04:05

subjective consensus mechanism design

play04:07

and all of the learnings and

play04:08

infrastructure we've built in the AI

play04:10

crypto space over the past three years

play04:12

to build this kind of collective

play04:13

intelligence Network or self-improving

play04:15

decentralized AI

play04:17

Network and this is kind of how the

play04:19

network works uh at a high level um the

play04:22

the kind of core mechanism of the

play04:24

network is what we call the inference

play04:25

synthesis mechanism and it's a way of

play04:27

stitching together different model

play04:29

outputs in a productive way to create a

play04:31

more performant or more intelligent

play04:33

Network level output and so the the

play04:35

general flow is consumers ask a a given

play04:38

topic a topic is kind of like a sub

play04:40

Network in the system it's a very

play04:41

tightly scoped uh ml objective that

play04:44

model creators uh kind of coordinate

play04:46

around and are attempting to

play04:47

collectively optimize a consumer would

play04:50

would ping one of these topics asking

play04:52

for some AI output and two flows are

play04:55

happening one these kind of inference

play04:56

workers are responding to that request

play04:58

with inference that is being assessed or

play05:01

weighted based on uh how well it

play05:03

optimizes the core objective function in

play05:05

that topic and simultaneously

play05:07

forecasting workers are predicting the

play05:10

loss that's going to be realized on

play05:12

those inference workers uh as assessed

play05:14

Again by that objective function and

play05:17

they're they're over time learning the

play05:18

context the different domains Etc in

play05:21

which different inference workers within

play05:22

a given topic perform well or perform

play05:24

less well um to produce the the most

play05:27

accurate kind of predicted loss and so

play05:30

as these things get stitched together

play05:32

you uh establish weights across the

play05:34

different inferences developed by

play05:35

different workers based on uh kind of

play05:38

approximation of a Shapley value across

play05:40

different models and produce an output

play05:42

that over time increasingly outperforms

play05:44

the individual models performance within

play05:46

that topic and then there's a a kind of

play05:49

second or third type of actor called

play05:50

reputer who are tasked with actually

play05:53

evaluating the performance of different

play05:55

models over time so they they take some

play05:57

ground truth they take some kind of

play05:59

basis or validation set and assess the

play06:02

performance of the different uh uh

play06:04

workers as as assessed by or as

play06:07

calculated by that objective

play06:09

function um and and I I guess one one

play06:11

thing to note here I I haven't looked at

play06:13

this presentation in a while um is

play06:15

because workers are just sharing

play06:17

inference through the network the

play06:19

network can support open and closed

play06:21

Source models they're not revealing the

play06:22

weights of their models they're not

play06:23

revealing any kind of proprietary

play06:25

information of those models and we think

play06:27

that's important for supporting a wide

play06:28

number of use cases and we'll get into

play06:30

it later but especially kind of use

play06:32

cases in financial settings where

play06:34

maintaining that proprietary alpha or

play06:36

the proprietary information of your

play06:37

models is especially

play06:39

important um and the key to this

play06:41

mechanism working is really around

play06:43

context awareness we've done a bunch of

play06:44

work around this but it's that

play06:47

introduction of the forecasting worker

play06:49

that allows you to realize a lot of the

play06:51

outperformance at the network level

play06:52

relative to the individual workers in

play06:54

the network and so we can see some

play06:56

simulation results here uh in five

play06:59

predictors they're they're inference

play07:01

workers but they're working in a Time

play07:02

series prediction task um the best uh

play07:05

predictor is predictor one and they've

play07:08

achieved a log loss of uh about negative

play07:10

3.3 uh so that's pretty good they've

play07:13

they've uh if we didn't have forecasting

play07:16

workers we would weight them the most

play07:17

heavily as this kind of base set of

play07:19

predictors and then produce an inference

play07:21

that achieves close to that performance

play07:24

but it's over time that these

play07:25

forecasting workers sort of learn the

play07:27

different contexts in which different

play07:29

predictors perform well in this time

play07:30

series prediction case maybe some models

play07:33

are especially competent in say volatile

play07:36

market conditions other uh models are

play07:39

are more competent less volatile market

play07:41

conditions and these forecasters would

play07:42

learn that over time and uh adjust their

play07:45

their predicted losses to uh uh kind of

play07:48

represent that and what we see in this

play07:50

this bottom figure is that with the

play07:52

introduction of aggregators or

play07:53

forecasters as as they're called the the

play07:56

best forecaster achieves a a loss of

play07:59

around -4.4 and so that's around 33%

play08:03

better than even the best model in the

play08:04

network because they're learning the

play08:06

context in which different workers

play08:08

outperform other workers in the network

play08:10

and additionally we can see that even

play08:12

the worst aggregator the one who is

play08:14

taking kind of the uh who is weighting uh

play08:17

contextual awareness uh least

play08:19

sensitively is still outperforming even

play08:22

the best inference worker within that

play08:23

Network and so it's this this kind of

play08:25

element of context awareness this

play08:27

additional dimension of performance

play08:30

where we really see outperformance at

play08:32

the network level in this kind of

play08:35

setting and then another nice property

play08:38

of this is when we interact with with AI

play08:41

today it's a very kind of model Centric

play08:43

Paradigm I'm saying I'm going to use gpt-4

play08:47

I'm going to use claude 3 for some task and

play08:49

I'm asserting that that model is going

play08:51

to be the best in every domain and every

play08:52

context that I'm using it in and that's

play08:55

not the case I don't think there's ever

play08:56

going to be a single model that is

play08:57

dominant across all domains

play09:00

and when you when you align things

play09:02

around given objectives or given topics

play09:04

you start to move to this objective

play09:06

Centric Paradigm of interacting with AI

play09:08

in that I care about this ml objective

play09:12

this evaluation criteria that matters

play09:14

for my application or for my use and

play09:17

then this underlying set of Ever

play09:19

Changing models is being productively

play09:21

coordinated and stitched together to

play09:23

produce the best possible output is that

play09:25

you can think of this as kind of

play09:26

analogous to the move from transaction

play09:29

too intense of moving from specifying a

play09:32

specific set of uh State transitions to

play09:35

achieve some objective to just

play09:36

specifying the intent that you care

play09:38

about at the end of the day and as we

play09:40

move towards an objective Centric

play09:41

Paradigm of interacting with AI a

play09:43

similar thing will unfold is in instead

play09:45

of interacting with specific models

play09:47

we'll interact with specific ml

play09:49

objectives that we ultimately care about

play09:50

and ultimately satisfy our needs either

play09:52

in use and application development

play09:54

whatever it

play09:55

is um and so just to recap these are

play09:58

sort of like the four core Properties or

play10:00

features of allora 1 collective

play10:01

intelligence I think that's pretty

play10:03

simple we're bringing together many

play10:04

different heterogeneous models to

play10:06

produce a more intelligent kind of

play10:08

network level intelligence and because

play10:10

of forecasters the model is or the

play10:12

network is kind of learning over time

play10:14

and so you achieve this property of kind

play10:16

of iterative learning at the network

play10:17

level Additionally you achieve iterative

play10:20

learning at the model level as well

play10:22

because each of the individual models in

play10:23

the network are learning the the weights

play10:26

that are being assigned across other

play10:27

models in the network and using that to

play10:29

to improve or inform their future

play10:31

outputs uh third contextual awareness as

play10:35

we talked about is kind of the key to

play10:36

stitching together heterogeneous

play10:38

instances of machine intelligence in a

play10:41

really productive way in a way that

play10:42

outperforms any of the individual models

play10:44

within that network uh and then lastly

play10:47

as we talked about because the workers

play10:49

are only sharing inference within the

play10:50

network there's a level of privacy

play10:52

protection that supports this kind of

play10:55

closed Source uh model participation

play10:57

within the network

play11:00

uh and then I just quickly want to go

play11:01

over some initial Focus areas so I think

play11:04

as we think about where this network is

play11:06

most applicable at the application Level

play11:08

we see a lot of financial applications

play11:11

as the initial kind of uh like verticals

play11:13

to tackle I think as we're talking about

play11:15

crypto and AI crypto is largely a

play11:18

financial innovation in my opinion it's

play11:20

it's found most if not all of its

play11:21

product Market fit across different

play11:23

Financial settings and AI in financial

play11:26

settings is also much more mature than

play11:29

then some of the newer types of

play11:30

generative AI as well and so I think a

play11:32

lot of the immediate value within this

play11:35

intersection of crypto and AI will

play11:36

happen in various Financial uh settings

play11:39

and verticals so we'll quickly go over

play11:40

some of

First, AI-powered data feeds. This is pretty broad, but I'll focus specifically on AI-powered price feeds for longtail assets, which is what we've been doing historically at Upshot for the past few years. As we move into the long tail of assets, where crypto, and DeFi especially, really shines in codifying certain market interactions to enable support for more longtail asset categories, we still lack a robust and accurate source of price discovery for these assets. They don't change hands very often, so the transfers of ownership that act as price discovery events for other asset classes often don't happen enough to be the sole source of price discovery for many assets. AI can be used to take into account many pieces of data outside of the market itself, as well as to upsample market data in productive ways, to produce near-real-time price feeds for a much broader set of asset categories. Equipped with a price feed, you can start to build a bunch of really interesting primitives in decentralized finance: lending protocols, perpetual systems, synthetic representations of assets. Your scope of compatible assets in financial settings expands significantly with AI as a supplement to the price discovery mechanism.
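To make the sparse-trade problem concrete, here is a minimal sketch of one way a feed could price an illiquid asset: trust in the last observed sale decays with its age, deferring to a model estimate built from off-market signals. The half-life blend and all numbers are invented for illustration; a real feed would use trained models and much richer data.

```python
# Hypothetical sketch of a price feed for a longtail asset: blend the last
# observed sale with a model estimate, trusting the sale less as it ages.
# The exponential-decay blend is an illustrative assumption, not a real feed.

def estimate_price(last_sale_price, hours_since_sale, model_estimate,
                   half_life_hours=24.0):
    """Return a price that decays from the last sale toward the model."""
    # Weight on the last sale halves every half_life_hours.
    w_sale = 0.5 ** (hours_since_sale / half_life_hours)
    return w_sale * last_sale_price + (1.0 - w_sale) * model_estimate

# A fresh sale dominates the feed; a stale one defers to the model.
fresh = estimate_price(100.0, hours_since_sale=0.0, model_estimate=120.0)
stale = estimate_price(100.0, hours_since_sale=96.0, model_estimate=120.0)
```

The design point is the one made in the talk: when ownership transfers are rare, the feed must lean on information outside the market itself rather than on the last trade alone.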

Second, and this is again pretty broad: AI agents for DeFi. Today, DeFi has shown us the power of decentralized composability, the power of codified financial operations, but it's still very primitive and, I guess, unexpressive. With AI we can start to leverage this highly deflationary form of compute, which can take into account many different pieces of information in an incredibly efficient and effective way, to execute more complex or advanced trading strategies and liquidity provisioning strategies, or to manage different DeFi parameters in lending systems or perpetual systems in a far more advanced and efficient way. So I think AI intersecting with DeFi, via the form factor of agents built on top of this kind of collective intelligence network, will represent the next explosion of innovation and experimentation within decentralized finance.
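As one toy example of "managing DeFi parameters" with a model inference, the sketch below tightens a lending market's loan-to-value cap as predicted volatility rises. The function, thresholds, and the linear rule are all hypothetical, not a real protocol API or Allora's design.

```python
# Hypothetical sketch of an AI agent adjusting a lending parameter from a
# volatility inference. The linear rule and bounds are invented for
# illustration only.

def recommend_ltv_cap(predicted_volatility, base_ltv=0.80, sensitivity=2.0,
                      floor=0.20):
    """Lower the maximum loan-to-value as predicted volatility rises."""
    ltv = base_ltv - sensitivity * predicted_volatility
    # Clamp between a hard floor and the base cap.
    return max(floor, min(base_ltv, ltv))

calm = recommend_ltv_cap(0.05)      # low predicted volatility: near base cap
stressed = recommend_ltv_cap(0.25)  # high predicted volatility: tighter cap
```

An agent built this way consumes a network-level inference (here, volatility) and turns it into an on-chain action, which is the agent form factor described above.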

And then lastly, risk modeling. Especially as restaking becomes a larger part of the space and represents a larger percentage of the space's capital, the risk when you restake a pool of capital across many different heterogeneous networks becomes incredibly complex; I think we saw some of this play out earlier today, or last night. AI in this decentralized form factor can offer a really robust tool for making sense of all of these complex risks and for developing adequate hedging vehicles or other financial primitives to help protect against that risk, and as such to protect the mass of new networks that are emerging on top of a source of restaked capital.
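To see why restaking compounds risk, here is a minimal sketch: if the same capital secures several networks, slashing exposure accumulates across all of them. The independence assumption and the per-network probabilities are illustrative simplifications; a real risk model would account for correlated failures.

```python
# Hypothetical sketch of restaking risk aggregation: the same pool of capital
# secures several networks, so slashing exposure compounds across them.
# Independence across networks is an illustrative simplification.

def prob_any_slash(per_network_probs):
    """Probability of at least one slashing event, assuming independence."""
    p_none = 1.0
    for p in per_network_probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

# Restaking across five networks at 2% each carries far more slashing
# exposure than staking on any single one of them.
single = prob_any_slash([0.02])
restaked = prob_any_slash([0.02] * 5)
```

This is the complexity referred to above: the more heterogeneous networks a restaked pool touches, the harder the joint risk is to reason about, and correlations between networks only make it worse.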

And just quickly, the launch timeline: we're in testnet phase one right now, we're about to launch testnet phase two in the next week and a half, and mainnet shortly after. In the meantime we're onboarding workers, that is, model creators; that's really our top focus right now, bringing together a critical mass of machine intelligence onto the network, because the network gets smarter the more models there are. We're also working with a bunch of top-tier applications and different networks in the space, bringing this source of collective intelligence to different parts of the crypto space. I think that's it.
