Announcing LlamaIndex Gen AI Playlist- Llamaindex Vs Langchain Framework

Krish Naik
29 Jan 2024 · 15:22

Summary

TLDR: In this informative video, Krish Naik introduces a new playlist focused on LlamaIndex, a framework for integrating custom data with large language models (LLMs). He explains LlamaIndex's core functionality, its role in data indexing and retrieval, and distinguishes it from LangChain, another generative AI framework. He promises to demonstrate how to harness these tools for end-to-end projects, emphasizing LlamaIndex's efficiency in handling structured, unstructured, and semi-structured data for quick lookup and retrieval.

Takeaways

  • πŸš€ The video introduces a new playlist focused on 'Lama Index', a framework for working with generative AI and large language models (LLMs).
  • πŸ“š Krishak, the YouTuber, has been creating content around various LLM models and frameworks, including OpenAI, LangChain, Lama 2 from Meta, Hugging Face libraries, and Google's Ginipro and Pro Vision.
  • πŸ” Lama Index is described as a flexible data framework specifically designed to connect custom data sources to large language models, facilitating efficient data retrieval and search.
  • πŸ“ˆ The script explains the process of 'indexing' with Lama Index, which involves creating metadata to enable quick querying and retrieval of information from custom data sources.
  • πŸ› οΈ Lama Index is positioned as beneficial for creating end-to-end projects, particularly where the focus is on connecting custom data to LLMs, as opposed to LangChain, which is more about building a wide range of generative applications.
  • πŸ“ Differences between Lama Index and LangChain are highlighted, with Lama Index focusing on intelligent search, data indexing, and retrieval, while LangChain offers a broader range of functionalities for various use cases.
  • πŸ”„ Lama Index is efficient for ingesting, structuring, and accessing private or domain-specific data, whereas LangChain is more about loading, processing, and indexing data with high customization.
  • πŸ”‘ Lama Index provides tools for integrating private data into LLMs, making it specialized for efficient and fast search, whereas LangChain allows chaining multiple tools and components for more flexible application behavior.
  • 🌐 Both frameworks support a variety of LLM providers, but LangChain is noted to support more models, offering greater flexibility in terms of model integration.
  • 🏒 Lama Index is best suited for applications requiring quick data lookup and retrieval, making it ideal for search-centric applications, while LangChain is suitable for applications needing complex interactions and memory.
  • πŸ”§ The video promises upcoming content that will demonstrate the use of Lama Index and LangChain in projects, including handling multiple PDFs and showcasing various functionalities and use cases.

Q & A

  • What is the main focus of the video by Krish Naik?

    -The video focuses on introducing and explaining the LlamaIndex framework, its benefits, and how it differs from LangChain, as well as discussing their use in generative AI projects.

  • What is LlamaIndex according to the video?

    -LlamaIndex is described as a simple and flexible data framework for connecting custom data sources to large language models, enabling efficient indexing and retrieval of data.

  • What are some of the functionalities that can be created using LlamaIndex?

    -With LlamaIndex, functionalities such as document Q&A, data-augmented chatbots, knowledge agents, and structured analysis can be created.

  • How does LlamaIndex handle different types of data?

    -LlamaIndex supports connecting unstructured, structured, and semi-structured data to large language models.

  • What are the three main steps in using LlamaIndex as described in the video?

    -The three main steps are data ingestion, which connects to various data sources; data indexing, which stores and indexes the data for different use cases; and the query interface, which accepts input prompts and returns knowledge-augmented responses. (A minimal code sketch of these three steps appears right after this Q&A section.)

  • What is the primary focus of LangChain compared to LlamaIndex?

    -LangChain's primary focus is on building a wide range of generative applications, offering high customization and the ability to chain multiple tools and components.

  • How does LlamaIndex differ from LangChain in terms of data handling?

    -LlamaIndex is specialized for efficient and fast search, ingesting, structuring, and accessing private or domain-specific data, while LangChain is more about loading, processing, and indexing data for various use cases.

  • Which types of applications are best suited for LlamaIndex according to the video?

    -Applications that require quick data lookup and retrieval, such as search-centric applications, are best suited for LlamaIndex.

  • What is the significance of using both LlamaIndex and LangChain in a project architecture?

    -Using both allows efficient data indexing and retrieval with LlamaIndex while leveraging the flexibility and customization of LangChain to build complex generative AI applications.

  • What is the role of indexing in LlamaIndex as explained in the video?

    -Indexing in LlamaIndex involves creating metadata that enables quick querying of the data, leading to faster and more efficient retrieval compared to other methods.

  • What can viewers expect to learn in the upcoming videos by Krish Naik?

    -Viewers can expect to learn how to create projects using LlamaIndex and LangChain, explore use cases, understand LlamaIndex's functionalities, and learn about vector embeddings in LlamaIndex.
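
To make those three steps concrete, here is a minimal sketch (not code from the video) using the llama-index Python package, assuming a recent 0.10+ release where imports come from llama_index.core, a local folder named data, and the default OpenAI-backed embedding and chat models, which require an OPENAI_API_KEY.

```python
# Minimal LlamaIndex sketch: data ingestion -> data indexing -> query interface.
# Assumptions (not from the video): `pip install llama-index`, an OPENAI_API_KEY
# in the environment, and a local "data" folder with PDFs/text files.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# 1. Data ingestion: load documents from a local folder.
documents = SimpleDirectoryReader("data").load_data()

# 2. Data indexing: chunk and embed the documents, then build a vector index.
index = VectorStoreIndex.from_documents(documents)

# 3. Query interface: ask a question and get a knowledge-augmented response.
query_engine = index.as_query_engine()
print(query_engine.query("What does this document say about pricing?"))
```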

Outlines

00:00

πŸš€ Introduction to Lama Index and Generative AI Projects

Krishak introduces a new YouTube playlist focusing on Lama Index, a framework for connecting custom data sources to large language models (LLMs). He discusses his previous work with generative AI, mentioning projects using open AI models, LangChain, and Hugging Face libraries. The video aims to clarify the role of Lama Index in creating end-to-end projects and to distinguish it from LangChain, emphasizing its strengths in specific applications.

05:02

πŸ” Understanding Lama Index and Its Differences with LangChain

This paragraph delves into what Lama Index is, highlighting its role in intelligent search, data indexing, and retrieval. Krishak explains the functionalities of Lama Index, such as creating document Q&A, data-augmented chat, and knowledge agents. He contrasts Lama Index with LangChain, discussing their respective focuses, data handling capabilities, customization options, and flexibility. The paragraph also outlines the types of data Lama Index can handle and the differences in their integration with LLMs and use cases.

10:04

πŸ› οΈ Lama Index and LangChain in Project Architecture

Krishak illustrates the use of Lama Index and LangChain in a project's architecture, detailing the steps of data ingestion, indexing, and querying facilitated by Lama Index. He describes how the response from Lama Index can be used with prompts to interact with LLM models, which is where LangChain's capabilities come into play. The paragraph emphasizes the efficiency of Lama Index in search-centric applications and how it complements LangChain for building complex applications.
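
A rough sketch of that combined pipeline is shown below, assuming the llama-index, langchain-core, and langchain-openai packages and an OpenAI API key; the folder name, model name, and question are illustrative placeholders rather than details from the video.

```python
# Hypothetical pipeline: LlamaIndex handles indexing/retrieval, LangChain wraps
# the retrieved context in a prompt and calls the LLM.
# Assumptions: pip install llama-index langchain-core langchain-openai, OPENAI_API_KEY set.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Step 1 (LlamaIndex): ingest and index the custom data, then retrieve an answer.
index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
retrieved = index.as_query_engine().query("Summarize the refund policy.")

# Step 2 (LangChain): feed the retrieved context into a prompt template and an LLM.
prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)
chain = prompt | ChatOpenAI(model="gpt-3.5-turbo")
answer = chain.invoke(
    {"context": str(retrieved), "question": "Summarize the refund policy."}
)
print(answer.content)
```

The same idea extends to the multi-chain, multi-agent setups mentioned in the video: LlamaIndex stays the retrieval layer, and each LangChain chain decides what to do with the retrieved context.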

15:05

🌟 Wrapping Up and Future Video Preview

In the conclusion, Krish Naik summarizes the video and teases upcoming videos in which he will demonstrate projects using LlamaIndex and LangChain. He mentions querying multiple PDFs, exploring LlamaIndex's functionalities, and the applications that can be built with it, before signing off with an invitation to the next video in the series.

Keywords

πŸ’‘Lama Index

LlamaIndex is a framework designed to connect custom data sources to large language models. It is pivotal in the video, where the creator discusses using it for end-to-end generative AI projects. The script describes LlamaIndex as 'simple and flexible' for data indexing and retrieval, highlighting its efficiency in handling custom data and providing quick responses to queries.

πŸ’‘Generative AI

Generative AI refers to artificial intelligence systems that can generate new content, such as text, images, or videos. In the context of the video, the creator has been focusing on generative AI, using various models and frameworks like LlamaIndex to create projects that harness the power of these models, emphasizing the innovative applications of AI in content creation.

πŸ’‘LLM Models

LLM models, or Large Language Models, are AI models trained on vast amounts of text data to generate human-like responses. The script discusses integrating these models with frameworks like LlamaIndex and LangChain to implement projects. The video aims to clarify how these models work in conjunction with the mentioned frameworks to create advanced AI applications.

πŸ’‘Lang Chain

LangChain is a framework that the video contrasts with LlamaIndex. It is used for building a wide range of generative applications and is known for its flexibility and customization options. The script explains the differences between LangChain and LlamaIndex, emphasizing their distinct roles in project architecture, particularly in how they handle data and interact with LLM models.

πŸ’‘Data Indexing

Data indexing is the process of organizing and storing data so it can be searched and retrieved efficiently. In the script, LlamaIndex is highlighted for its data-indexing capability, which is crucial for connecting custom data sources to LLM models. The video explains how indexing enables quick data lookup and retrieval, which is essential for the performance of AI applications.

πŸ’‘Custom Data

Custom data refers to data specific to an individual or organization, such as data from YouTube APIs, PDFs, or SQL databases. The video discusses how LlamaIndex can connect these custom data sources to LLM models, allowing tailored AI applications that interact with unique datasets.

πŸ’‘Project Architecture

Project architecture in the video refers to the structure and design of an AI project, including how its components interact and function together. The script explains how LlamaIndex and LangChain fit into this architecture, particularly for data handling and AI model integration, to create efficient and effective AI applications.

πŸ’‘Query Interface

The query interface mentioned in the script is the part of LlamaIndex that lets users submit prompts and receive knowledge-augmented responses generated from the indexed data. The video emphasizes the speed and efficiency of this interface, which is crucial for quick information retrieval in AI applications.
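
As an illustration (not taken from the video), the query interface can be tuned and its knowledge-augmented response inspected together with the source chunks that backed it; the similarity_top_k value and the question are arbitrary example choices.

```python
# Illustrative query-interface usage; assumes the llama-index 0.10+ import paths
# and the default OpenAI-backed models (OPENAI_API_KEY required).
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
query_engine = index.as_query_engine(similarity_top_k=3)  # retrieve the 3 best-matching chunks

response = query_engine.query("Which customers are mentioned in the contract?")
print(response)  # knowledge-augmented answer generated from the retrieved chunks

# Inspect which indexed chunks backed the answer and how well they matched.
for node_with_score in response.source_nodes:
    print(node_with_score.score, node_with_score.node.get_content()[:80])
```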

πŸ’‘Vector Store

A vector store is a database designed to store and manage vector representations of data for efficient search and retrieval. The script mentions integrating with downstream vector stores as part of LlamaIndex's functionality, indicating their role in improving data-retrieval performance in AI applications.
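
A sketch of wiring a downstream vector store into LlamaIndex is shown below, using Chroma purely as an example backend; it assumes the chromadb and llama-index-vector-stores-chroma packages, and the collection name and paths are placeholders.

```python
# Example of plugging a downstream vector store (Chroma, as one possible backend)
# into LlamaIndex. Assumes: pip install llama-index chromadb llama-index-vector-stores-chroma.
import chromadb
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.chroma import ChromaVectorStore

# Persistent Chroma collection that will hold the vector embeddings.
chroma_client = chromadb.PersistentClient(path="./chroma_db")
collection = chroma_client.get_or_create_collection("my_docs")

# Tell LlamaIndex to store its vectors in that collection instead of in memory.
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
print(index.as_query_engine().query("What topics do these documents cover?"))
```

Swapping the backend only changes the vector_store line; the ingestion, indexing, and query calls stay the same.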

πŸ’‘Prompt Engineering

Prompt engineering is the practice of designing input prompts for AI models to elicit specific responses or behaviors. In the video, it is discussed in relation to how responses from LlamaIndex can be embedded into prompts for LLM models, guiding the AI to perform the desired functions.

πŸ’‘Knowledge Agents

Knowledge agents, as mentioned in the script, are components that can be built with LlamaIndex, capable of tasks such as document Q&A or data-augmented chat. The video positions these agents as part of LlamaIndex's broader capability to create AI applications that interact intelligently with data.

Highlights

Introduction to a new playlist on the LlamaIndex framework.

Focus on generative AI and implementation of end-to-end projects with LLM models.

Discussion of OpenAI LLM models, LangChain, and open-source LLM models like Llama 2.

Use of Hugging Face libraries and deployment strategies with Google Gemini Pro and Gemini Pro Vision.

Explanation of LlamaIndex as a data framework for connecting custom data sources to large language models.

The importance of indexing in LlamaIndex for efficient querying and response.

LlamaIndex functionalities for creating document Q&A, data-augmented chatbots, and knowledge agents.

Comparison between LlamaIndex and LangChain regarding their roles in project architecture.

Clarification of the specific steps where LlamaIndex is used in a project pipeline.

Differences in data handling between LlamaIndex and LangChain, with emphasis on LlamaIndex's efficiency.

Customization capabilities of LlamaIndex for integrating private data into LLMs.

LangChain's flexibility in allowing users to chain multiple tools and components.

LlamaIndex's specialized focus on efficient, fast search through metadata indexing.

LangChain's broader application range and support for a wide variety of LLM models.

Use cases for LlamaIndex in applications requiring quick data lookup and retrieval.

Integration of LlamaIndex and LangChain in a project for enhanced functionality.

Upcoming video series on creating projects with LlamaIndex and LangChain, focusing on PDF indexing and querying.

Discussion of vector embeddings in LlamaIndex and their applications in AI projects.

Transcripts

00:00

Hello all, my name is Krish Naik, and welcome to my YouTube channel. So guys, I'm starting a new playlist on an amazing framework called LlamaIndex. For the past couple of months I've been focusing more on generative AI, uploading videos and implementing multiple end-to-end projects with the help of LLM models. We have already discussed OpenAI LLM models, we have discussed frameworks like LangChain and created a lot of end-to-end projects, and we have also seen open-source LLM models like Llama 2 from Meta. We have used Hugging Face libraries, we have seen how to do deployment, and finally we have implemented multiple end-to-end projects with the help of Google Gemini Pro and Google Gemini Pro Vision. Now one more important framework that I'm going to discuss is LlamaIndex. In this playlist there will be a series of videos, and we will see how we can use LlamaIndex, and even the LangChain framework, to create some amazing end-to-end projects and harness the power of LLM models.

01:00

In this video I'm going to discuss two important things: what LlamaIndex is, and, the second thing where many people still have a lot of confusion, what the exact difference is between the LlamaIndex framework and LangChain. Not only that, we'll also discuss one simple project architecture: if you're combining LlamaIndex and LangChain, what your project architecture will look like. Many people think that LlamaIndex is used to create end-to-end generative AI projects and that LangChain is also used to create end-to-end generative AI projects. Yes, both are used, but in those architectures, where exactly is LlamaIndex used? LlamaIndex is very good at something specific, which I will discuss, and LangChain is super important and beneficial for some different parts of a specific project. Both of these things will get covered. Now let me go ahead and share my screen. Just to keep a target for this particular video, guys, please hit like and keep the target at 1,000 likes at least, because all these videos are completely free and, as I said, I want to democratize AI education, so please do help me with that.

02:12

So here is the LlamaIndex page itself. As I said, there are two important things I'm going to discuss: what exactly LlamaIndex is and why it is so beneficial, and second, a comparison of the differences between LlamaIndex and LangChain, including where exactly LlamaIndex is used and where exactly LangChain is used in a project architecture. First of all, this diagram will actually help you understand where LlamaIndex works. In your applications, many people have different custom data; it can be company data, it can be different data sources like YouTube APIs, PDFs, Notion, or SQL, and on the other side you can see all the LLM models. Now, if you have specific custom data, let's see the definition: LlamaIndex is a simple, flexible data framework for connecting custom data sources to large language models. To connect this entire data source to these large language models, that is the pipeline where LlamaIndex can be used beneficially. With the help of LlamaIndex we can create an end-to-end project, but its core feature is connecting the custom data to the LLM models.

03:33

Now how do we connect this? We take this entire data and perform something called indexing using LlamaIndex, and once we create that index we will be able to query from it, because the LlamaIndex library creates the necessary metadata itself. Then, when we query anything, we will be able to get the response quickly. We'll still discuss multiple points and various features, and the difference between LlamaIndex and LangChain, but here I hope you got the idea: LlamaIndex is a simple, flexible data framework for connecting custom data sources to large language models. Some of the functionalities you'll be able to see: we can create document Q&A, data-augmented chatbots, knowledge agents, structured analysis, and many more things. The same things can also be created with the help of LangChain, but what exactly is the difference, and in the entire project architecture, in which pipeline do we specifically use LlamaIndex?

04:34

That is where you'll see the three important steps where we specifically use it. One is data ingestion, which basically means it provides a lot of libraries to connect to your external data from different data sources like APIs, PDF documents, SQL, etc. Then, after ingesting the data, it helps you perform data indexing: store and index your data for different use cases and integrate with downstream vector store and database providers. The third step is the query interface: LlamaIndex provides a query interface that accepts any input prompt over your data and returns a knowledge-augmented response. So from that index, whenever you put in any query, you'll be able to get a good response, and not only good but quicker; it is quite fast. So if someone tells you tomorrow to use LlamaIndex in a project, these three steps are specifically where you're going to use it. Once we get this response, we can further connect it to any LLM-powered app or LLM model itself, and based on the prompt engineering we do, a prompt template, we get a specific response, and there we can specifically use LangChain; again, I will discuss this as I go ahead. So I hope you got the idea. Whenever we talk about LlamaIndex, three kinds of data can easily be connected to it: unstructured data, structured data, and semi-structured data. It supports all three of these types of data.

06:10

Now let me quickly go ahead and talk about some of the important differences between LlamaIndex and LangChain. Guys, these differences are super beneficial. Why am I telling you this? Because in interviews, and in projects when you implement, you should know which one you should really use, and finally, after discussing these differences, whether we can use LlamaIndex and LangChain together in a specific LLM-powered app; I will also discuss that kind of architecture. Remember, guys, all the differences that you'll be seeing here I have referred from the di dw.a website, another blogging website where they put a lot of information regarding generative AI. The reference and the screenshot are taken from there, but I'll explain it in my own way.

07:00

So here, based on the features, you'll be able to see LlamaIndex on one side and LangChain on the other. As you know, both of them are frameworks, and they are specifically used with multiple LLM models. If I talk about the primary focus of LlamaIndex, you'll see intelligent search and data indexing along with retrieval, the three points I specifically discussed. In the case of LangChain, it helps you build a wide range of generative AI applications, with all the different functionalities such applications need. With respect to data handling, LlamaIndex helps you in ingesting, structuring, and accessing private or domain-specific data; LangChain handles loading, processing, and indexing data for various use cases. See, you can do indexing there also, but the indexing we do with the help of LlamaIndex is very efficient. Why? I will show you when I develop a project in my upcoming series of videos.

08:13

Now with respect to customization, LlamaIndex offers tools for integrating private data into LLMs, whereas LangChain is highly customizable and allows users to chain multiple tools and components; what those multiple tools and components are, I'll discuss when I talk about the architecture. In the case of flexibility, again we are talking about taking some data and integrating that data with our LLM, mostly with respect to LlamaIndex: it is specialized for efficient and fast search. This is what is very amazing. Efficient and fast search means the kind of indexing and the kind of metadata created using LlamaIndex actually help us query the data efficiently and with less response time. LangChain, in contrast, is a general-purpose framework with more flexibility in application behavior.

09:08

Let's see LLM model support, that is, which LLM models each framework can connect to. LlamaIndex connects to most of the LLM providers like OpenAI, Anthropic, Hugging Face, and AI21 Labs; in the case of LangChain, it supports 60 LLM models, so it is pretty good, more than LlamaIndex. Now you may be saying, "Krish, you're talking about all the good points of LangChain; what about LlamaIndex?" Guys, I'm trying to talk about the exact differences, and once I compare all of these differences you'll be able to understand as soon as you see the architecture. For use cases, you can see LlamaIndex is best for applications that require quick data lookup and retrieval, while LangChain is suitable for applications that require complex interaction, like chatbots that need to remember memory, Q&A, summarization, and many more.

10:04

Then, for integration, LlamaIndex functions as a smart storage mechanism. Why am I saying smart storage? Because we will create this in the projects as well. LangChain is designed to bring multiple tools together and chain operations. Both are Python-based; LlamaIndex also has LlamaIndex.TS, and on the front-end side LangChain has LangChain.js. LlamaIndex is focused on search-centric applications, and this is the most important thing, while LangChain covers a broad range of applications. Whenever we talk about search-centric applications, this is the thing many people talk about: RAG systems. I hope everybody knows about RAG systems. If you have external custom data, how can you explore that data? If you have multiple PDF documents, how can you ask any question against them and retrieve the data? With the help of LangChain we can do it, but if you use LlamaIndex you'll still get more efficient results, and I'm not just saying that: it provides the functionality to load all the PDFs, take out all the text data, and index it in such a way that whenever you ask any query you'll be able to get the retrieval much quicker. Then for deployment, both of them are ideal. At the end of the day, our main aim is always to create an application using LlamaIndex plus LangChain.

11:29

As I told you guys, now let me show you an architecture: what if we are using LlamaIndex plus LangChain? This is the architecture I'm talking about. Let's go from here: this is your data, which can be structured, unstructured, and programmatic. All this data is indexed, and this indexing happens with the help of LlamaIndex. Once this indexing has happened, this entire pipeline is implemented using LlamaIndex, and it is very efficient compared to using LangChain. Then you will also see the other part: the entire application is built with the LangChain framework, and the LLM models can be anything, all different LLM models.

12:32

Now let's understand this. I have my structured, unstructured, and programmatic data, and the indexing happens with the help of LlamaIndex. Whenever a user asks any query, since we are indexing with the help of LlamaIndex, we will usually be able to get the response quickly. Now let's say we are getting some response; we will use this response along with one specific prompt, and then we'll send it to our LLM model, and from this LLM model we will get another response. This entire part can be implemented with the help of LangChain, because here the dependency will be on multiple agents, multi-chain agents, where one chain may feed another chain; there may be that kind of requirement, so all these things will specifically be there.

13:22

Let's say I want to create an application called text-to-SQL. Or rather, not text-to-SQL; let's say I have multiple PDF documents. I will index this PDF data, and whenever I ask any query I will be able to get a response. Along with this response I can create another prompt, give it to my LLM model, and make the LLM model perform any functionality based on the response I'm getting, and then finally I'll get the response. So if I'm specifically using LangChain and LlamaIndex together, I will be able to create some amazing applications. But at the end of the day, if someone asks you in an interview where exactly LlamaIndex is used, you should specifically say: in this pipeline, up to this point. And this querying usually happens much faster because the indexing technique used in LlamaIndex is completely different from LangChain: the way it creates vectors, vector embeddings, and metadata is completely different compared to LangChain, and LlamaIndex is a kind of expert in this matter, whereas LangChain overall can do multiple things.
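
A minimal sketch of the multi-PDF flow just described (index a folder of PDFs once, persist the index, reload it, and query across all documents) might look like the following; the folder names, questions, and persist directory are illustrative assumptions, and the imports assume the llama-index 0.10+ package layout.

```python
# Sketch of the multi-PDF flow described above. Folder names, questions, and the
# persist directory are placeholders; defaults use OpenAI models (OPENAI_API_KEY).
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

# Build the index from every PDF in the folder and save it so we don't re-embed each run.
documents = SimpleDirectoryReader("pdfs", required_exts=[".pdf"]).load_data()
index = VectorStoreIndex.from_documents(documents)
index.storage_context.persist(persist_dir="./storage")

# Later (or in another process): reload the persisted index and query across all PDFs.
index = load_index_from_storage(StorageContext.from_defaults(persist_dir="./storage"))
print(index.as_query_engine().query("Compare the conclusions of these reports."))
```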

14:46

So I hope you liked this particular video and were able to understand it. In the upcoming videos we will try to create projects using LlamaIndex and LangChain. I'll show you and talk about multiple PDFs and how you can query them, we'll talk about amazing use cases, we'll talk about all the functionalities you have in LlamaIndex, and we'll also be talking about vector embeddings in LlamaIndex. By using LlamaIndex, many things will be coming, and you'll see what applications you'll be able to develop. So yes, that was it from my side. I hope you liked this particular video. I'll see you all in the next video. Have a great day, thank you all, take care, bye-bye.


Related Tags
Generative AI, LlamaIndex, LangChain, AI Education, LLM Models, Data Indexing, Custom Data, AI Projects, Search Optimization, Knowledge Retrieval