LangGraph AI Agents: What Will the Future of Internet Search Look Like?

Mervin Praison
27 Jan 2024 · 13:46

Summary

TLDR: This video shows how to use the LangChain ecosystem to build language agents for automated internet search and in-depth analysis. Two agents are created: an Internet Search Analyst and an Insight Researcher. The Internet Search Analyst searches the internet for a given question, browses the links, and summarizes the articles; the Insight Researcher then identifies the topics in that summary and provides an in-depth answer for each one. After installing LangChain, LangGraph, OpenAI and LangSmith tooling and setting the environment variables and API keys, the video walks through creating and executing these agents. It demonstrates how to define tools and agent nodes with LangGraph and how to build a workflow graph that coordinates the interaction between the agents. Running the compiled graph then yields in-depth search results from a single query. The video also shows how to build a Gradio user interface so users can get results through a simple interaction. The whole workflow illustrates the extensibility and flexibility of the LangChain ecosystem and how custom agents and tools can be combined for complex search and analysis tasks.

Takeaways

  • 🚀 Introduces LangGraph, a major update from LangChain that turns language agents into graphs and lets you create different agents to complete a task.
  • 🛠️ Tools can be defined inside the LangChain ecosystem and assigned to the corresponding agents.
  • 🌐 Shows how LangGraph integrates with LangSmith, as a glimpse of what internet search could look like in the future.
  • 📈 Introduces two agents, an Internet Search Analyst and an Insight Researcher, that work together to deliver in-depth search results.
  • 🔍 The Internet Search Analyst agent searches the internet for the question asked, browses the different links, and summarizes the articles.
  • 🔑 The Insight Researcher identifies the topics in that summary and runs an in-depth search on each one to provide detailed answers.
  • 📚 Shows how to install LangChain, LangGraph, langchain-openai, LangSmith and related tools, and how to set the environment variables.
  • 💡 Explains how to define tools and agents, and how to create agent nodes with a system prompt and function calling.
  • 📝 The workflow is defined by creating an agent state, edges, and a graph, the key concepts in LangGraph.
  • 🔗 Describes how to build the workflow by adding nodes and edges, and how a conditional map controls the flow.
  • 📈 Shows how to compile and run the workflow, and how to monitor the interaction between agents from the terminal and LangSmith.
  • 🎉 With this system, a single query returns an in-depth research report on the latest AI technology trends for 2024.
  • 📊 Also covers how to build a user interface with Gradio so queries can be made from a graphical interface.
  • 🤖 Compares the LangChain ecosystem with AutoGen and CrewAI, highlighting LangGraph's extensibility and customization options.

Q & A

  • What is LangGraph?

    - LangGraph is a framework for building language agents: it lets you create different agents to complete specific tasks, define tools, and assign those tools to the agents, all within the LangChain ecosystem.

  • How do you create agents with LangGraph?

    - First install langchain, langgraph, langchain-openai, langsmith, langchainhub, duckduckgo-search, beautifulsoup4 and gradio. Then create a file called app.py and import the required functions and modules. Next, define the tools and agents, set the environment variables, initialize the model, and write a system prompt that guides each agent's behaviour. Finally, define agent nodes and a workflow so the agents can work together.

  • What is the difference between an agent and a tool in LangGraph?

    - In LangGraph, an agent is an entity that performs a specific task, such as the Internet Search Analyst or the Insight Researcher. A tool is a concrete capability the agent uses to do its job, such as searching the internet for URLs or fetching and parsing the content behind a URL.

  • How do you integrate LangGraph with LangSmith?

    - By installing the LangChain tooling and exporting your OpenAI API key and your LangChain API key (if you use LangSmith), you can trace the state of your LangChain runs in LangSmith. Its UI lets you monitor the interaction between agents and the execution of each task.

  • How does LangGraph change the future of internet search?

    - LangGraph builds dedicated agents for the search task: the Internet Search Analyst searches the internet for the question and summarizes the articles, while the Insight Researcher identifies the topics in that summary and searches each of them in depth to provide detailed answers. This workflow points towards internet search becoming deeper and more personalised.

  • How do you build a user interface for LangGraph?

    - Use the Gradio library. Modify the relevant part of the code to create a run_graph function that returns the response, then set up a Gradio interface around it.

  • What are LangGraph's advantages over AutoGen and CrewAI?

    - LangGraph, living inside the LangChain ecosystem, is more extensible: it has access to the many LangChain tools, makes agents easy to implement, and offers more customization of edges, agents, and flow. AutoGen offers a simpler option and CrewAI is the simplest; the right choice depends on the use case.

  • How do you do an in-depth internet search with LangGraph?

    - By creating an Internet Search Analyst agent and an Insight Researcher agent. The Internet Search Analyst searches the internet and summarizes the relevant content, then the Insight Researcher identifies the topics, searches each of them more deeply, and finally delivers detailed insights.

  • How is a workflow defined in LangGraph?

    - A workflow is created by defining an agent state, edges, and a graph. The agent state holds the conversation history shared between agents, edges are the connections between agents, and the graph (also called a pipeline or workflow) defines the order and structure in which the whole task is executed.

  • How do you monitor the agents in LangGraph?

    - Through the LangSmith UI, which shows the interaction between agents and the execution of tasks in real time. This requires integrating LangSmith in your setup and requesting access to the dashboard.

  • Does LangGraph support multiple languages?

    - The video focuses on search and analysis tasks and does not say explicitly whether multiple languages are supported. Since it is built on the LangChain ecosystem, it can reasonably be expected to handle different languages.

  • Do you need programming knowledge to use LangGraph?

    - Yes. You need to be able to install the required libraries, write Python code to define agents, tools, and the workflow, and set environment variables and API keys.

Outlines

00:00

🚀 Introducing LangGraph and its ecosystem

This section introduces LangGraph and its place in the LangChain ecosystem. It discusses how to create different agents to complete specific tasks and how to combine those agents with defined tools, and uses a use case to show what the future of internet search could look like. It ends with a preview of the user interface and an invitation to subscribe to the YouTube channel for more AI content.

05:02

🔍 Steps for creating the agents and tools

Describes installing langchain, langgraph, langchain-openai, langsmith and related tools, and exporting the required API keys. A file called app.py is created and the necessary libraries and modules are imported. The environment variables are set, the model is initialized, and the tools are created, in particular for internet search and content processing. Next, the agents are created, including the Internet Search Analyst and the Insight Researcher, along with their responsibilities. Finally, a supervisor agent is created to coordinate the other agents, with options, a system prompt, and a route function managing how tasks are assigned and completed.

10:05

📈 Running the agents and tools

This section shows how to create the workflow by defining the agent state, edges, and graph. The agent state stores the conversation history between agents, edges are the connections between them, and the graph (also called a pipeline or workflow) defines the execution order of the whole task. Nodes and edges are added to build the workflow, and a conditional map controls where tasks flow. Finally, graph.stream starts the whole process, the search task is executed from the terminal, and the agent interaction is monitored from the LangSmith dashboard. It also shows how to build a user interface with Gradio and run the code to display the final search results.


Keywords

💡LangGraph

LangGraph is the language-agent framework featured in the video. It evolved out of LangChain and lets you create different agents to complete specific tasks. In the video it is used to show how defining tools and assigning them to agents can carry out complex internet search tasks, one of the key technologies shaping how internet search may work in the future.

💡Agent

In the video, an 'Agent' is a program or algorithm that can carry out a specific task. For example, an 'Internet Search Analyst' and an 'Insight Researcher' are created to search the internet and to research specific topics in depth. These agents are the core components of the automated search and analysis pipeline.

💡Tool

A tool is a function an agent uses to perform a specific capability. For example, the 'search' tool searches the internet and returns URLs, and the 'process content' tool parses web page content. Tools are assigned to agents so they can carry out search and analysis tasks automatically.

💡LangSmith

LangSmith is mentioned as the user interface for monitoring LangChain runs. It is a visualization tool that helps you trace and understand the interaction between agents and the progress of the whole search workflow.

💡API Key

An API key is a credential used to access a particular service or API (application programming interface). The video exports an OpenAI API key and a LangChain API key so the program can talk to those services and perform search and analysis operations.

💡Beautiful Soup

Beautiful Soup is a Python library for extracting data from HTML and XML files. In the video it powers the 'process content' tool that parses web page content, a key piece of automated web data fetching and processing.

💡Gradio

Gradio is a Python library for quickly building interactive user interfaces for machine learning models. At the end of the video it is used to create a user interface, showing that LangGraph can be combined with Gradio to give users a more intuitive search experience.

💡Workflow

A workflow is the sequence of interactions between agents and tools. By defining an 'agent state', 'edges', and a 'graph', a workflow is built in which the different agents cooperate to complete the whole task, from search to analysis.

💡System Prompt

A system prompt is a predefined message or instruction that guides how an agent behaves. When creating the supervisor agent, the system prompt defines the agent's role and the rules for executing tasks, and is essential for making the agent respond correctly to user requests.

💡Conditional Map

A conditional map defines flow control based on specific conditions. For example, if the message is 'FINISH', the supervisor terminates the request; otherwise it keeps sending tasks to the agents. It is the key decision-making mechanism in the workflow.

💡Graph Stream

graph.stream is the method used in LangChain/LangGraph to run the compiled workflow. In the video, calling graph.stream with the question or task kicks off the whole search and analysis process and returns the results.

Highlights

Introduces LangGraph (language agents as graphs), a major update from LangChain.

Different agents can be created for specific tasks, and tools can be defined in the LangChain ecosystem and assigned to those agents.

Shows how LangChain integrates with LangSmith, the monitoring user interface.

Shows how to create two agents: an Internet Search Analyst and an Insight Researcher.

The Internet Search Analyst searches the internet for the question asked, browses the different links, and summarizes the articles.

The Insight Researcher identifies the topics in the summary, runs an in-depth search on each topic, and provides detailed answers.

Shows a possible direction for the future of internet search.

Covers installing langchain, langgraph, langchain-openai, langsmith, langchainhub, duckduckgo-search, beautifulsoup4 and gradio.

Shows how to export the OpenAI API key and the LangChain API key.

Creates a file called app.py and imports the required functions and libraries.

Sets the environment variables LANGCHAIN_TRACING_V2 and LANGCHAIN_PROJECT.

Initializes the model, choosing GPT-4 Turbo as a first step.

Defines two tools: searching the internet for URLs, and reading the content of a URL.

Creates a helper function for building agents, including the Internet Search Analyst and the Insight Researcher.

Creates a supervisor agent that works with the other agents to ultimately deliver the insights.

Defines agent nodes, the functions that run the agents.

Creates the agent state, edges, and graph, the key concepts in LangGraph.

Runs the graph with graph.stream, providing the question or task to search for and summarize.

Shows how to run the code in the terminal and monitor the interaction between agents in LangSmith.

Compares LangGraph with AutoGen and CrewAI, and discusses the extensibility of the LangChain ecosystem.

Shows how to create a user interface with Gradio and run the code to get the results.

Highlights LangGraph's customizability and how the LangChain tools allow both simple implementations and deeper customization.

Suggests that more teams and agents will be created in the future.

Transcripts

00:00

This is amazing: we now have LangGraph, language agents as graphs. This is a huge update from LangChain. We are able to create different agents to complete a task, define tools, and assign those tools to those agents, completely within the LangChain ecosystem. It can be integrated with LangSmith like this. As a use case, we are going to see what the future of internet search will look like, and finally we are going to create a user interface like this. That's exactly what we're going to see today. Let's get started.

Hi everyone, I'm really excited to show you LangGraph. I'm going to take you through, step by step, how to create agents and how to create tools, and explain it with a use case. But before that, I regularly create videos about artificial intelligence on my YouTube channel, so do subscribe and click the bell icon to stay tuned, and make sure you click the like button so this video can be helpful for many others like you.

As a use case, first we are going to create two agents: an Internet Search Analyst and an Insight Researcher. The Internet Search Analyst is going to search the internet based on the question we ask; it will go through different links and then summarize all those articles. Imagine going to Google, typing "latest AI trends 2024", going through every single URL, probably the first five or ten, and then summarizing everything. That would take a lot of time, but the Internet Search Analyst agent does the same thing: based on the topic, it goes through the different URLs and gives us a summary. Next, that summary is sent to the Insight Researcher. The Insight Researcher identifies the different topics in that summary, then does an in-depth search for each individual topic and gives us a detailed, in-depth answer. This, I believe, is the future of internet search. At the end you will have a report like this, where all the key areas are identified along with the topics and insights about those topics.

02:03

First, we are going to install langchain, langgraph, langchain-openai, langsmith, langchainhub, duckduckgo-search, beautifulsoup4 and gradio using pip install, then press Enter. Once that is installed, export your OpenAI API key like this and press Enter. Next, export your LangChain API key if you're using LangSmith; this is optional. Then create a file called app.py and open it. Inside the file, first import functools, operator, requests, os and json; next BeautifulSoup, the DuckDuckGo search tool, AgentExecutor and create_openai_tools_agent, BaseMessage and HumanMessage, the JSON output functions parser, ChatPromptTemplate and MessagesPlaceholder, StateGraph and END from LangGraph, the tool decorator, ChatOpenAI, various typing helpers, and finally Gradio. I've already covered the basics of LangChain, which I will link in the description below. As the names suggest, the parser is used to parse output, the agent imports are used to create and execute agents, the message classes provide the messages, LangGraph creates the graph, and the tool decorator creates the tools. First we set the environment variables LANGCHAIN_TRACING_V2 to true and LANGCHAIN_PROJECT; this is required only if you use LangSmith, the user interface for monitoring LangChain runs, which I will show you towards the end of the video. Then we initialize the model; I'm going to use GPT-4 Turbo.
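A minimal sketch of that setup, assuming the imports named above; the project name is only a placeholder, and the OpenAI API key is expected to be exported in the shell as described:

```python
import os

from langchain_openai import ChatOpenAI

# Tracing in LangSmith is optional; these two variables enable it for one project
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_PROJECT"] = "LangGraph_Search_Agents"  # placeholder project name

# Chat model shared by every agent in this walkthrough (model name assumed)
llm = ChatOpenAI(model="gpt-4-turbo-preview")
```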

03:32

As a first step, I'm going to define the tools. Overall we need agents, tools, and something to make them work together, and that is the graph. For the tools, we are going to provide two: a search tool that searches the internet and returns URLs, and a second tool that reads the content behind a URL. Here we use the @tool decorator on an internet search function that uses DuckDuckGo search and returns the top five results. Next we create a process content function that uses Beautiful Soup to parse the page and return its content. Then we list both tools in a tools variable.
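The tools are only described verbally here, so this is a sketch of what they might look like, assuming the duckduckgo-search and beautifulsoup4 packages installed earlier; the tool names follow the narration:

```python
import requests
from bs4 import BeautifulSoup
from duckduckgo_search import DDGS
from langchain_core.tools import tool


@tool("internet_search", return_direct=False)
def internet_search(query: str) -> str:
    """Search the internet with DuckDuckGo and return the top five results."""
    with DDGS() as ddgs:
        results = [r for r in ddgs.text(query, max_results=5)]
        return str(results) if results else "No results found."


@tool("process_content", return_direct=False)
def process_content(url: str) -> str:
    """Fetch a URL and return its visible text, parsed with Beautiful Soup."""
    response = requests.get(url)
    soup = BeautifulSoup(response.content, "html.parser")
    return soup.get_text()


tools = [internet_search, process_content]
```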

04:09

The next step is to create the agents. We write a helper function for creating agents called create_agent. It takes the large language model (in our case OpenAI's chat model), the tools, and a system prompt. It uses create_openai_tools_agent to build the agent, wraps it in an executor that performs the task, and returns that executor. Next we define agent nodes; these are nothing but functions that run an agent. In simple terms, create_agent creates one agent, and if you want that agent to perform multiple tasks you use the agent node function. It is like recruiting one analyst into your company and then giving that analyst multiple tasks.
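A sketch of those two helpers, modelled on the common LangGraph supervisor pattern rather than the exact code shown on screen:

```python
from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder


def create_agent(llm, tools, system_prompt: str) -> AgentExecutor:
    """Build one tool-calling agent and wrap it in an executor."""
    prompt = ChatPromptTemplate.from_messages([
        ("system", system_prompt),
        MessagesPlaceholder(variable_name="messages"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ])
    agent = create_openai_tools_agent(llm, tools, prompt)
    return AgentExecutor(agent=agent, tools=tools)


def agent_node(state, agent, name):
    """Run one agent on the shared state and report its output as a named message."""
    result = agent.invoke(state)
    return {"messages": [HumanMessage(content=result["output"], name=name)]}
```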

04:56

We have seen that we need an Internet Search Analyst agent and an Insight Researcher agent to perform the insights task, but in this case we are going to add one more person into the loop: the supervisor. The supervisor works together with the other agents to finally give us the insights. So now we create the agent supervisor, providing the list of members: one member is the web searcher, the other is the insight researcher. Next we create a system prompt. It says: as a supervisor, your role is to oversee a dialogue between workers; based on the user request, determine which worker should take the next action; each worker is responsible for executing a specific task and reporting back their findings and progress; once all tasks are complete, indicate this with FINISH. Then we define the options from the members, and we set up function calling with a function named route. Route is nothing but the link between the supervisor and the different agents: this function determines whether the supervisor sends the task to a particular agent or finishes. Next we define the prompt with ChatPromptTemplate, passing the system prompt we defined earlier as the system message, plus a final message saying "given the conversation above, who should act next?" so the model chooses who does the task. Then we build the supervisor chain, where we provide the prompt from above, bind the function definition, and parse the output. Now we have completed the supervisor agent. It looks big, but once you understand it, it becomes much easier: it's just a prompt, a function definition, and the two wired together in the supervisor chain. That's it.
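The supervisor wiring, reconstructed from the narration; the member names and prompt wording are paraphrased, so treat this as a sketch of the pattern, not the author's exact code:

```python
from langchain.output_parsers.openai_functions import JsonOutputFunctionsParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

members = ["Web_Searcher", "Insight_Researcher"]  # node names assumed from the narration
options = ["FINISH"] + members

system_prompt = (
    "As a supervisor, your role is to oversee a dialogue between these workers: {members}. "
    "Based on the user request, determine which worker should take the next action. "
    "Each worker executes a task and reports back their findings and progress. "
    "Once all tasks are complete, indicate with FINISH."
)

# Function the supervisor model must call to pick the next worker (or FINISH)
function_def = {
    "name": "route",
    "description": "Select the next role.",
    "parameters": {
        "type": "object",
        "properties": {"next": {"anyOf": [{"enum": options}]}},
        "required": ["next"],
    },
}

prompt = ChatPromptTemplate.from_messages([
    ("system", system_prompt),
    MessagesPlaceholder(variable_name="messages"),
    ("system", "Given the conversation above, who should act next? Select one of: {options}"),
]).partial(options=str(options), members=", ".join(members))

supervisor_chain = (
    prompt
    | llm.bind_functions(functions=[function_def], function_call="route")
    | JsonOutputFunctionsParser()
)
```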

06:44

Now we create the two remaining agents, the internet research analyst and the insights researcher. search_agent equals create_agent, passing the llm, the tools, and a prompt saying that you are a web searcher. Next we create a node: search_node equals functools.partial of agent_node, passing the agent and its name, the web searcher. That agent_node is the function we created earlier and is what runs the agent. This is one big difference in LangGraph: nodes are nothing but agents or tools, and each and every action is a node. Next we create the insights research agent with create_agent, the llm, the tools, and the prompt: you are an Insight Researcher; work step by step; based on the provided content, first identify the list of topics, then search the internet for each topic one by one, and finally find insights for each topic one by one; include the insights and sources in the final response. Then, same as above, we define the node for this agent and name it the Insight Researcher. Now we have completed the step of creating agents and tools; finally, we are going to make them work together by creating a graph.
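Putting those pieces together for the two worker agents; the system prompts are paraphrased from the narration and the node names reuse the members list from the supervisor sketch above:

```python
import functools

search_agent = create_agent(
    llm, tools,
    "You are a web searcher. Search the internet for the requested information.",
)
search_node = functools.partial(agent_node, agent=search_agent, name="Web_Searcher")

insights_research_agent = create_agent(
    llm, tools,
    "You are an Insight Researcher. Work step by step: based on the provided content, "
    "first identify the list of topics, then search the internet for each topic one by one, "
    "and finally find insights for each topic one by one. Include the insights and sources "
    "in the final response.",
)
insights_research_node = functools.partial(
    agent_node, agent=insights_research_agent, name="Insight_Researcher",
)
```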

08:03

Next we define the agent state, the edges, and the graph; these are the terms you need to understand. The agent state is nothing but the conversation history between the different agents, the edges are the connections between those agents, and the graph can also be called a pipeline or workflow. First we define the agent state with a class, in which we save all the messages exchanged between agents; these messages are passed to each individual agent when it performs its task, so it knows the context before running. Next we define the workflow; the workflow is nothing but the graph. Then we add the nodes: first the web searcher node, then workflow.add_node for the insight researcher, then workflow.add_node for the supervisor chain. Now we define the edges: for each member in members, workflow.add_edge from that member to the supervisor. What does that mean? Whenever a member, that is the web searcher or the insight researcher, performs a task, it reports back to the supervisor. Next we define the conditional map, with FINISH mapped to END, and call workflow.add_conditional_edges on the supervisor, providing that conditional map. This is a conditional edge, which means the supervisor won't send a task to an agent every time: if the message says FINISH, the supervisor terminates the request; if not, it keeps sending tasks to the agents until the work is complete. We have come to the final stage: we call workflow.set_entry_point with the supervisor, which just tells LangGraph where to start, and then we compile the workflow into the actual graph, which you can also call a pipeline.
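A sketch of that graph construction, continuing with the names used in the earlier sketches:

```python
import operator
from typing import Annotated, Sequence, TypedDict

from langchain_core.messages import BaseMessage
from langgraph.graph import END, StateGraph


class AgentState(TypedDict):
    # Shared conversation history; operator.add appends each agent's new messages
    messages: Annotated[Sequence[BaseMessage], operator.add]
    # Name of the worker the supervisor wants to act next
    next: str


workflow = StateGraph(AgentState)
workflow.add_node("Web_Searcher", search_node)
workflow.add_node("Insight_Researcher", insights_research_node)
workflow.add_node("supervisor", supervisor_chain)

# Every worker reports back to the supervisor when it finishes
for member in members:
    workflow.add_edge(member, "supervisor")

# The supervisor either routes to a worker or ends the run
conditional_map = {name: name for name in members}
conditional_map["FINISH"] = END
workflow.add_conditional_edges("supervisor", lambda state: state["next"], conditional_map)

workflow.set_entry_point("supervisor")
graph = workflow.compile()
```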

09:53

Finally, we run the graph using graph.stream, providing the question, or rather the task: search for the latest AI technology trends in 2024 and summarize the content; after summarizing, pass it on to the Insight Researcher to provide insights for each topic. Then we print out the results.
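The streaming call might look like this; the task wording is taken from the narration:

```python
from langchain_core.messages import HumanMessage

task = (
    "Search for the latest AI technology trends in 2024 and summarize the content. "
    "After summarizing, pass it on to the Insight Researcher to provide insights for each topic."
)

# Stream intermediate steps so each agent hand-off is printed as it happens
for step in graph.stream({"messages": [HumanMessage(content=task)]}):
    if "__end__" not in step:
        print(step)
        print("----")
```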

10:13

As a quick summary: we created two tools, the internet search tool and the process content tool. Then we created three agents: a supervisor agent, where we provide the prompt and the function calling; the search agent; and the insight research agent. Then we created the graph, that is the workflow, added all the nodes (nodes are where the agents live), and finally ran it with graph.stream. Now I'm going to run this code: in your terminal, python app.py and press Enter. First I can see the supervisor agent handing off to the web searcher agent. Let's monitor this in LangSmith. This is the LangSmith layout, where you can watch how the agents are interacting; you might need to request access to this dashboard. Here we can see that first the supervisor agent sends the request to the web searcher, and the web searcher performs the search task by getting all the URLs, then processes the content by going into each individual page and returning the output. Once the web searcher completes its task it goes back to the supervisor, and the supervisor sends the task on to the Insight Researcher. The Insight Researcher identifies the list of topics, such as multimodal AI, agentic AI and open-source AI, from the content, and then does an individual search for each topic, which you can see here. At the end it summarizes everything and returns the output. So, as you can see, these are the agents working together to complete the task. In our terminal we can see the output displayed: first the web searcher agent, then the insight researcher, and finally the result. If you don't want to stream, you can modify this to use the graph.invoke function with the same request, and that returns the actual content. You get output like this in Markdown format, so if I copy it and paste it into a Word document, this is what we get. The question we asked was "latest AI technology trends in 2024", and here is the answer after the in-depth research: customized chatbots and generative AI, multitasking robots, quantum AI, AI legislation, ethical AI, and much more.
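The non-streaming variant mentioned above would be a single graph.invoke call, continuing from the streaming sketch; the exact shape of the returned state may vary by LangGraph version:

```python
# Same request, but return only the final state instead of streaming each step
final_state = graph.invoke({"messages": [HumanMessage(content=task)]})
print(final_state["messages"][-1].content)  # the Markdown research report
```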

12:31

To create a user interface with Gradio, just change this last part: create a run_graph function that returns the response, and set up the interface with gr.Interface. Now I run the code again with python app.py, press Enter, and navigate to the URL. After entering the question, this is what I got. As simple as that.
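A sketch of that Gradio wrapper, reusing the compiled graph from above; the labels and title are placeholders:

```python
import gradio as gr
from langchain_core.messages import HumanMessage


def run_graph(input_message: str) -> str:
    """Run the compiled graph once and return the final answer for the UI."""
    response = graph.invoke({"messages": [HumanMessage(content=input_message)]})
    return response["messages"][-1].content


demo = gr.Interface(
    fn=run_graph,
    inputs=gr.Textbox(label="Question"),
    outputs=gr.Markdown(label="Research report"),
    title="LangGraph Internet Search Agents",
)
demo.launch()
```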

12:52

So we covered a use case where we get a much more in-depth search from just one query, which I think will be the future of internet search. How does this compare with AutoGen and CrewAI? LangGraph, together with the LangChain ecosystem, is more extensible because it includes many LangChain tools, easy implementation of RAG, and more customization of edges, agents, flow and so on. If you want a slightly simpler option you can use AutoGen, and the simplest would be CrewAI. So did LangGraph beat AutoGen? It depends on the use case. We have covered only two worker agents with this setup, but we can extend it further with multiple teams like this, and multiple agents under each team, which I'll be creating in the near future, so stay tuned. I hope you liked this video; do like, share and subscribe, and thanks for watching.


Related Tags
LangGraph, AI agents, internet search, 2024 AI trends, automated analysis, custom chatbots, multitasking robots, quantum AI, AI legislation, ethical AI, technology trends, user interface, Gradio, automation tools, LangChain ecosystem