LangGraph AI Agents: What Will the Future of Internet Search Look Like?
Summary
TLDR: This video shows how to use the LangChain ecosystem to build language agents for automated internet search and in-depth analysis. Two agents are created: an internet search analyst and an insight researcher. The internet search analyst searches the internet based on a question, browses the links, and summarizes the articles; the insight researcher then identifies the topics in that summary and provides an in-depth answer for each one. After installing LangChain, OpenAI, LangSmith, and related tools and setting the environment variables and API keys, the video demonstrates how to create and run these agents. It also shows how to use LangGraph to create tool and agent nodes and to build a workflow graph that coordinates the interaction between the agents. Running the resulting graph shows how a single query can yield in-depth search results, and Gradio is used to build a simple user interface on top of it. The walkthrough illustrates the extensibility and flexibility of the LangChain ecosystem, and how custom agents and tools can be combined to carry out complex search and analysis tasks.
Takeaways
- 🚀 Introduced LangGraph — language agents as graphs — a major update from LangChain that allows creating different agents to complete a task.
- 🛠️ Users can define tools within the LangChain ecosystem and assign those tools to the corresponding agents.
- 🌐 Showed how LangChain integrates with LangSmith, as part of a use case for what internet search may look like in the future.
- 📈 Introduced two agents — an internet search analyst and an insight researcher — that work together to deliver in-depth search results.
- 🔍 The internet search analyst agent searches the internet based on the question asked, browses the different links, and summarizes the articles.
- 🔑 The insight researcher identifies the topics in the summary and runs an in-depth search for each one, providing detailed answers.
- 📚 Showed how to install LangChain, LangGraph, OpenAI, LangSmith, and related tools, and how to set the environment variables.
- 💡 Explained how to define tools and agents, and how to create agent nodes via a system prompt and function calling.
- 📝 Defined the workflow by creating the agent state, edges, and graph — the key concepts in LangGraph.
- 🔗 Described how to build the workflow by adding nodes and edges, and how to control its flow with a conditional map.
- 📈 Showed how to compile and run the workflow, and how to monitor the interaction between agents in the terminal and in LangSmith.
- 🎉 In the end, the system produces an in-depth research report on the latest AI technology trends of 2024.
- 📊 Also showed how to build a user interface with Gradio so users can query through a graphical interface.
- 🤖 Compared the LangChain ecosystem with AutoGen and Crew AI, highlighting LangChain's extensibility and customization advantages.
Q & A
What is LangGraph?
-LangGraph is a language-agent framework that allows you to create different agents to complete specific tasks and to define tools and assign them to those agents, fully integrated within the LangChain ecosystem.
How do you create agents with LangGraph?
-First, install langchain, langgraph, langchain-openai, langsmith, langchainhub, duckduckgo-search, beautifulsoup4, and gradio. Then create a file called app.py and import the necessary functions and modules. Next, define the tools and agents, set the environment variables, initialize the model, and write a system prompt to guide each agent's behavior. Finally, define the agent nodes and the workflow so the agents can work together.
What is the difference between agents and tools in LangGraph?
-In LangGraph, an agent is an entity that performs a specific task, such as the internet search analyst or the insight researcher. A tool is a concrete capability an agent uses to complete its task, such as searching the internet for URLs or reading and processing the content behind a URL.
How does LangGraph integrate with LangSmith?
-By installing langchainhub and exporting your OpenAI API key and LangChain API key (if you use LangSmith), you can monitor your LangChain runs in LangSmith. In the LangSmith UI you can watch the interactions between agents and the execution of their tasks.
How could LangGraph change the future of internet search?
-LangGraph lets you create specialized agents for search: the internet search analyst searches the internet based on a question and summarizes the articles, while the insight researcher identifies the topics in that summary and searches each of them in depth to provide detailed answers. This kind of workflow suggests that internet search will become deeper and more personalized.
How do you build a user interface with LangGraph?
-You can build the user interface with the Gradio library: modify the relevant part of the code, create a run_graph function that returns the response, and then set up the Gradio interface.
What advantages does LangGraph have over AutoGen and Crew AI?
-LangGraph, within the LangChain ecosystem, is more extensible: it includes many LangChain tools, makes agents easier to implement, and offers more customization options for edges, agents, flow, and so on. AutoGen offers a simpler version, and Crew AI is the simplest; which to choose depends on the use case.
How do you perform an in-depth internet search with LangGraph?
-By creating the internet search analyst and insight researcher agents. The internet search analyst first searches the internet and summarizes the relevant content; the insight researcher then identifies the topics and searches each in greater depth, finally delivering detailed insights.
How is a workflow defined in LangGraph?
-In LangGraph, a workflow is created by defining the agent state, the edges, and the graph. The agent state holds the conversation history between agents, the edges are the connections between agents, and the graph (also called a pipeline or workflow) defines the order and structure of the whole task.
How do you monitor the agents' work in LangGraph?
-Through the LangSmith UI, where you can watch the interactions between agents and task execution in real time. This requires integrating LangSmith into your LangGraph setup; you may need to request access to the dashboard.
Does LangGraph support multiple languages?
-The transcript focuses on search and analysis tasks and does not explicitly say whether multiple languages are supported. However, since LangGraph builds on the LangChain ecosystem, it is likely able to handle different languages.
Does LangGraph require programming knowledge?
-Yes. Using LangGraph requires some programming knowledge: you need to install the necessary libraries, write Python code to define agents, tools, and workflows, and set up environment variables and API keys.
Outlines
🚀 Introducing LangGraph and its ecosystem
This section introduces LangGraph and its place in the LangChain ecosystem. It discusses how to create different agents to complete specific tasks and combine them with defined tools, how those tools integrate into LangChain, and previews a use case showing what the future of internet search could look like. It closes by previewing the user interface and inviting viewers to subscribe to the YouTube channel for more AI content.
🔍 Steps for creating the agents and tools
Describes how to install LangChain, LangGraph, langchain-openai, LangSmith, and related tools and export the necessary API keys. A file called app.py is created and the required libraries and modules are imported. Next, the environment variables are defined, the model is initialized, and the tools are created, in particular the tools for internet search and content processing. The agents are then created — the internet search analyst and the insight researcher — along with their responsibilities. Finally, a supervisor agent is created to coordinate the other agents, with options, a system prompt, and a route function to manage how tasks are assigned and completed.
📈 Running the agents and tools
This section shows how to create the workflow by defining the agent state, edges, and graph. The agent state holds the conversation history between agents, edges are the connections between agents, and the graph (also called a pipeline or workflow) defines the order in which the task executes. Nodes and edges are added to build the workflow, and a conditional map is defined to control the flow of tasks. Finally, graph.stream starts the whole process, a search task is run from the terminal, and the agents' interactions are monitored in the LangSmith dashboard. A user interface is also built with Gradio, and running the code shows the final search results.
Mindmap
Keywords
💡LangGraph
💡Agent
💡Tool
💡LangSmith
💡API Key
💡Beautiful Soup
💡Gradio
💡Workflow
💡System Prompt
💡Conditional Map
💡Graph Stream
Highlights
Introduced the concept of LangGraph (language agents as graphs), a major update from LangChain.
Different agents can be created to complete specific tasks, and tools can be defined in the LangChain ecosystem and assigned to those agents.
Showed how LangChain integrates with LangSmith (a monitoring UI).
Showed how to create two agents: an internet search analyst and an insight researcher.
The internet search analyst searches the internet based on the question asked, browses the different links, and summarizes the articles.
The insight researcher identifies the topics in the summary and does an in-depth search for each, providing detailed answers.
Demonstrated a possible direction for the future of internet search.
Showed how to install langchain, langgraph, langchain-openai, langsmith, langchainhub, duckduckgo-search, beautifulsoup4, and gradio.
Showed how to export the OpenAI API key and the LangChain API key.
Created a file called app.py and imported the necessary functions and libraries.
Set the environment variables LANGCHAIN_TRACING_V2 and LANGCHAIN_PROJECT.
Initialized the model, choosing GPT-4 Turbo as the first step.
Defined two tools: one to search the internet for URLs and one to read the content of a URL.
Created helper functions for creating the agents, including the internet search analyst and the insight researcher.
Created a supervisor agent that works with the other agents to finally deliver the insights.
Defined the agent nodes, the functions that run the agents.
Created the agent state, edges, and graph — the key concepts in LangGraph.
Ran the graph via graph.stream, providing the question or task to search for and summarize.
Showed how to run the code in the terminal and monitor the interaction between agents in LangSmith.
Compared LangGraph with AutoGen and Crew AI, and discussed the extensibility of the LangChain ecosystem.
Showed how to build a user interface with Gradio and run the code to get results.
Emphasized LangGraph's customizability and how LangChain tools enable both simple implementations and deeper customization.
Suggested that more teams and agents will be created in the future.
Transcripts
This is amazing: now we have LangGraph, language agents as graphs. This is a huge update from LangChain. We are able to create different agents to complete a task, define tools, and assign those tools to those agents, completely within the LangChain ecosystem. It can be integrated with LangSmith like this. As a use case, we are going to see what the future of internet search will look like, and finally we are going to create a user interface like this. That's exactly what we're going to see today. Let's get started.

Hi everyone, I'm really excited to show you LangGraph. I'm going to take you step by step through how to create agents and tools and explain it with a use case. But before that: I regularly create videos about artificial intelligence on my YouTube channel, so do subscribe, click the bell icon to stay tuned, and make sure you click the like button so this video can be helpful for many others like you.

As a use case, first we are going to create two agents: an internet search analyst and an insight researcher. The internet search analyst is going to search the internet based on the question we ask, go through different links, and summarize all those articles. Imagine going to Google, typing "latest AI trends 2024", going through every single URL (probably the first five or ten), and summarizing everything — it would take a lot of time. The internet search analyst agent does the same: based on the topic, it goes through different URLs and gives us a summary. Next, that summary is sent to the insight researcher. The insight researcher identifies different topics in that summary, does an in-depth search for each individual topic, and gives us a detailed, in-depth answer. This, I believe, is what the future of internet search could look like. At the end you will have a report like this, where all the key areas, topics, and insights about those topics are identified.
First, install langchain, langgraph, langchain-openai, langsmith, langchainhub, duckduckgo-search, beautifulsoup4, and gradio using pip install, then press Enter. Once that is installed, export your OpenAI API key like this and press Enter. Next, export your LangChain API key if you're using LangSmith (this is optional) and press Enter.
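The setup just described can be sketched as shell commands. This is an assumption-laden sketch: the package names are inferred from the narration, and the key values are placeholders you replace with your own.

```shell
# Install the libraries used in the walkthrough (names assumed from the narration)
pip install langchain langgraph langchain-openai langsmith langchainhub \
    duckduckgo-search beautifulsoup4 gradio

# Required: your OpenAI API key (placeholder value)
export OPENAI_API_KEY="sk-..."

# Optional: only needed if you want LangSmith tracing (placeholder value)
export LANGCHAIN_API_KEY="ls-..."
```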
Then create a file called app.py and open it. Inside the file, first import functools, operator, requests, os, and json; next BeautifulSoup, the DuckDuckGo search tool, AgentExecutor and create_openai_tools_agent, BaseMessage and HumanMessage, JsonOutputFunctionsParser, ChatPromptTemplate and MessagesPlaceholder, StateGraph and END from langgraph, the tool decorator, ChatOpenAI, various typing helpers, and finally gradio. I've already covered LangChain basics, which I will link in the description below. As the names suggest, the parsers are used to parse output, the agent helpers are used to create and execute agents, the message classes provide messages, langgraph is used to create the graph, and the tool decorator is used to create tools.

First we set the environment variables LANGCHAIN_TRACING_V2 to true and LANGCHAIN_PROJECT. This is required only if you use LangSmith, the user interface for monitoring LangChain runs, which I will show you towards the end of the video. Next we initialize the model; I'm going to use GPT-4 Turbo.

As a first step, I'm going to define the tools. Overall we need agents, tools, and a way to make them work together — that is the graph. For tools, we are going to provide two: one to search the internet and get URLs, and a second to read those URLs. Here we use the @tool decorator on an internet_search function that uses DuckDuckGo search and returns the top five results. Next we create a process_content function that uses Beautiful Soup to parse a page and return its content. Then we list the tools in a tools variable.

The next step is to create the agents. We are going to write a helper function for creating agents. The function is called create_agent: it takes the large language model (in our case OpenAI's chat model), the tools, and a system prompt; it uses create_openai_tools_agent to create the agent, wraps it in an executor, and returns the executor. Next we define agent nodes: these are nothing but functions that run an agent. In simple terms, create_agent creates one agent, and if you want to make that agent perform multiple tasks, you use the agent-node function. It is like recruiting one analyst in your company and then assigning multiple tasks to that analyst.
search analyst agent and inside
researcher agent to perform the insights
task but in this case we are going to
add one more person into the loop who is
the supervisor the supervisor is going
to work together with other agents to
finally give us the insights so now
creating agent supervisor providing the
list of members one member is web
searcher next one is inside researcher
Now we create a system prompt. It says: as a supervisor, your role is to oversee a dialogue between workers; based on the user request, determine which worker should take the next action; each worker is responsible for executing a specific task and reporting back their findings and progress; once all tasks are complete, indicate FINISH. Next we define the options from the members list, then set up the function calling with the name route. The route function is nothing but the link between the supervisor and the different agents: it determines which agent the supervisor sends the task to, or whether to finish. Now we define the prompt with ChatPromptTemplate, passing the system prompt we defined earlier as the system message, followed by: given the conversation above, who should act next? This is the prompt for choosing who will do the task. Next we build the supervisor chain, where we provide the prompt from above, attach the function definition, and parse the output. Now we have completed creating the supervisor agent. It looks big, but once you understand it, it is much easier: it is just the prompt, the function definition, and the combination of both in the supervisor chain. That's it.
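The supervisor's routing decision can be illustrated without the function-calling machinery. A dependency-free sketch of the logic follows; in the real code the LLM produces the choice via OpenAI function calling, and END is langgraph's sentinel rather than a string:

```python
members = ["Web_Searcher", "Insight_Researcher"]
options = ["FINISH"] + members

def route(llm_choice: str) -> str:
    """Map the supervisor's choice to the next node.
    'FINISH' ends the run; any member name hands the task to that agent."""
    if llm_choice not in options:
        raise ValueError(f"unknown worker: {llm_choice}")
    return llm_choice

# The conditional map used later when wiring the graph:
conditional_map = {member: member for member in members}
conditional_map["FINISH"] = "END"  # stands in for langgraph's END sentinel

print(route("Web_Searcher"))       # Web_Searcher
print(conditional_map["FINISH"])   # END
```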
Now we create the two remaining agents: the internet research analyst and the insight researcher. search_agent equals create_agent, providing the LLM, the tools, and a prompt saying "You are a web searcher." Next we create a node: search_node equals functools.partial, providing the agent_node function we created earlier (which is used to run the agent), the agent itself (the search agent), and the name Web_Searcher. This is one big difference in LangGraph: nodes are nothing but agents or tools — each and every action is a node. Next we create the insight research agent with create_agent, the LLM, the tools, and the prompt: you are an insight researcher; work step by step; based on the provided content, first identify the list of topics, then search the internet for each topic one by one, finally find insights for each topic one by one, and include the insights and sources in the final response. Then, same as above, we define the node for this agent and name it Insight_Researcher. Now we have completed the step of creating the agents and tools; finally we make them work together by creating the graph.

Next we define the agent state, edges, and graph. These are the terms you need to understand: the agent state is nothing but the conversation history between the different agents, edges are the connections between those agents, and the graph can also be called a pipeline or workflow.
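The agent state described here — a conversation history that every agent appends to — can be sketched in plain Python. In real LangGraph code this is a TypedDict whose messages field is annotated with operator.add so each node's returned messages are appended rather than replaced; the apply_update helper below is a hypothetical hand-rolled version of that reducer, written only to show the behavior:

```python
import operator
from typing import Annotated, TypedDict

class AgentState(TypedDict):
    # In LangGraph, this annotation tells the graph to concatenate
    # each node's returned messages onto the running history.
    messages: Annotated[list, operator.add]
    next: str

def apply_update(state: dict, update: dict) -> dict:
    """Hand-rolled reducer: append messages, overwrite everything else."""
    merged = dict(state)
    for key, value in update.items():
        if key == "messages":
            merged[key] = state.get("messages", []) + value
        else:
            merged[key] = value
    return merged

state = {"messages": ["user: latest AI trends 2024"], "next": ""}
state = apply_update(state, {"messages": ["Web_Searcher: summary..."]})
state = apply_update(state, {"next": "Insight_Researcher"})
print(state["messages"])  # both messages, in order
```

This is why each agent "knows the context before running the task": every node sees the accumulated history, not just its own input.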
First we define the agent state with a class, saving all the messages between agents. These messages are passed to each individual agent when it performs its task, so it knows the context before running. Next we define the workflow — workflows are nothing but the graph — and add the nodes: first the web searcher node, then workflow.add_node for the insight researcher, then workflow.add_node for the supervisor chain. Now we define the edges: for each member in members, workflow.add_edge from that member to the supervisor. What does that mean? Whenever a member (the web searcher or the insight researcher) performs a task, it reports back to the supervisor. Next we define the conditional map, with FINISH mapping to END, then call workflow.add_conditional_edges with the supervisor and the conditional map. This is a conditional edge, which means the supervisor won't send the task to an agent every time: if the message says FINISH, the supervisor terminates the request; if not, it keeps sending tasks to the agents until they are complete.
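The node/edge/conditional-edge wiring can be illustrated with a tiny hand-rolled run loop — no langgraph involved, just the control flow the narration describes: every member edge reports back to the supervisor, and the supervisor's conditional edge either dispatches to a member or finishes. The scripted supervisor below is a stand-in for the LLM-driven one:

```python
def run_workflow(supervisor, nodes, state):
    """supervisor(state) -> name of the next node, or 'FINISH'.
    nodes: mapping from node name to a function state -> partial update."""
    while True:
        choice = supervisor(state)
        if choice == "FINISH":          # conditional edge: FINISH -> END
            return state
        update = nodes[choice](state)   # run the chosen member node
        state = {**state, "messages": state["messages"] + update["messages"]}
        # plain edge: member -> supervisor (loop back to the top)

# Scripted stand-ins for the LLM-driven pieces:
script = iter(["Web_Searcher", "Insight_Researcher", "FINISH"])
supervisor = lambda state: next(script)
nodes = {
    "Web_Searcher": lambda s: {"messages": ["Web_Searcher: summary"]},
    "Insight_Researcher": lambda s: {"messages": ["Insight_Researcher: insights"]},
}
final = run_workflow(supervisor, nodes, {"messages": ["user: AI trends 2024"]})
print(final["messages"])
# ['user: AI trends 2024', 'Web_Searcher: summary', 'Insight_Researcher: insights']
```

In real LangGraph, workflow.compile() produces this loop for you; the sketch only shows why the supervisor keeps receiving control between member runs.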
We have come to the final stage. We call workflow.set_entry_point with the supervisor, telling LangGraph that this is the starting point. Now we compile the workflow; that is the actual graph, which you can also call a pipeline. Finally we run the graph using graph.stream, providing the question or task: search for the latest AI technology trends in 2024, summarize the content, and after summarizing, pass it on to the insight researcher to provide insights for each topic. We print out the results.

As a quick summary: we created two tools (internet search and process content); then three agents — a supervisor agent, where we provide the prompt and the function calling, the search agent, and the insight research agent; then we created the graph, that is the workflow, and added all the nodes (nodes are where the agents are located); and finally we run it with graph.stream.

Now I'm going to run this code. In your terminal, type python app.py and press Enter. First I can see the request going from the supervisor agent to the web searcher agent. Let's monitor this in LangSmith. This is the LangSmith layout, where you can monitor how the agents are interacting (you might need to request access to this dashboard). Here we can see that first the supervisor agent sends the request to the web searcher; the web searcher performs the search task by getting the URLs, then processes the content by going into each individual page and returning the output. Once the web searcher completes its task, control goes back to the supervisor, and the supervisor sends the task on to the insight researcher. The insight researcher identifies the list of topics — such as multimodal AI, agentic AI, and open-source AI — from the content and runs an individual search for each topic, which you can see here. At the end it summarizes everything and returns the output. So as you can see, these are the agents working together to complete the task. In our terminal we can see the output displayed: first the web searcher agent, then the insight researcher, and finally the result. If you don't want to stream, you can modify this to use the graph.invoke function with the same request, and that will return the final content. You will get output like this in Markdown format, so if I copy this and paste it into a Word document, this is what we get. The question we asked was "latest AI technology trends in 2024", and here is the answer after in-depth research: customized chatbots and generative AI, multitasking robots, quantum AI, AI legislation, ethical AI, and much more.

To create a user interface with Gradio, just change this part: create a run_graph function, return the response, and set up the interface with gr.Interface. Now I run python app.py again and navigate to the URL. After entering the question, this is what I got — as simple as that.

We covered a use case where we are able to get a more in-depth search with just one query, which I think could be the future of internet search. How does this compare with AutoGen and Crew AI? LangGraph with the LangChain ecosystem is more extendable, because it contains many LangChain tools, easy implementation of RAG, and more customization in regards to edges, agents, flow, and so on. If you want a somewhat simpler option, you can use AutoGen, and the simplest would be Crew AI. So did LangGraph beat AutoGen? It depends on the use case. We have covered only two agents with this setup, but we can extend it further with multiple teams and multiple agents under each team, which I'll be creating in the near future, so stay tuned. I hope you like this video; do like, share, and subscribe, and thanks for watching.