DTOs are Essential in the Frontend | TypeScript

Lucas Barake
3 Jul 2024 · 20:35

Summary

TL;DR: This video script discusses the importance of Data Transfer Objects (DTOs) in both backend and frontend development. It explains how DTOs transform raw data into a usable format, emphasizing their role in API design and data handling. The script also addresses common mistakes in state management within components and advocates for the use of DTOs to maintain code readability and maintainability. It provides practical examples of creating DTOs, including handling nullable values and using discriminated unions, to demonstrate the benefits of DTOs in streamlining data processing and reducing bugs.

Takeaways

  • πŸ”„ Data Transfer Objects (DTOs) are essential for both backend and frontend development, transforming raw or sensitive data into a structure that can be safely consumed.
  • πŸ”‘ DTOs are crucial in frontend development for transforming raw API data into a usable format, reducing the need to derive state within components.
  • 🚫 Avoid deriving state within components as it leads to the creation of numerous functions, making the codebase harder to maintain and understand.
  • πŸ—ƒοΈ Use a data access directory to abstract API interaction, holding all logic for queries, mutations, and policies.
  • βœ… Always validate external data at runtime, even if it comes from a reliable source, to facilitate easier debugging and logging.
  • πŸ”„ Normalize data structures to handle inconsistencies and avoid tedious handling of nullable fields and arrays represented as tuples.
  • πŸ†” Utilize nominal types to differentiate between similar underlying types (like IDs), minimizing bugs during data manipulation.
  • πŸŽ›οΈ Remap API data to user-centric structures in the DTO to make the frontend code more intuitive and user-friendly.
  • πŸ“Š Use discriminated unions to handle complex data scenarios, allowing conditional fields and behavior based on the data context.
  • πŸš€ Employ functional programming utilities like Option and Match from the Effect library to streamline nullable and conditional data handling.
  • πŸ› οΈ Ensure all components consuming API data use the DTO structure, making the data consistent across the application.
  • πŸ—ƒοΈ Use the same DTO structure for both incoming data and cache manipulation to maintain consistency and reduce errors.

Q & A

  • What is the primary purpose of Data Transfer Objects (DTOs) in back-end development?

    -DTOs serve as a means to transform potentially raw or sensitive data into a structured format that can be safely consumed, acting as a cornerstone of a well-designed API.

  • How are DTOs utilized in front-end development, and why are they important?

    -In front-end development, DTOs are essential for transforming the raw data received from APIs into a sensible and usable format, enhancing the readability and maintainability of the project.

  • What is the common mistake made by many projects in terms of state management within components?

    -A common mistake is deriving all possible state inside components, which leads to hundreds of helper functions and custom hooks being created and spread across the codebase, reducing maintainability.

  • Why is it necessary to validate data from external sources at runtime, even if the API is autogenerated or introspected?

    -Validating data at runtime is crucial for quick debugging and ensuring that the data conforms to expected schemas, which helps in catching bugs regardless of the source of the data.

  • What is the significance of using branded types or nominal types in DTOs?

    -Branded types or nominal types extend base types with additional information, allowing for type safety and preventing the comparison of different types that share the same underlying type.

  • How does the script suggest organizing data access logic in a front-end application?

    -The script suggests creating a data access directory to hold all the logic for the API, including query policies and other business logic, abstracting this layer for better organization.

  • What is the role of discriminated unions in DTOs as discussed in the script?

    -Discriminated unions are used to normalize and remap data types based on certain conditions, making it easier to handle complex data structures and ensuring type safety.

  • Why should DTOs be used for both incoming and outgoing data manipulation in an application?

    -Using DTOs for both incoming and outgoing data ensures consistency and type safety throughout the application, making it easier to communicate and manipulate data across different components.

  • How does the script handle nullable values in DTOs to improve code readability?

    -The script uses the Option type from functional programming, which represents the presence or absence of a value, to handle nullable values elegantly and improve code readability.

  • What is the benefit of using the 'match' pattern with discriminated unions in DTOs?

    -The 'match' pattern with discriminated unions allows for a switch-like statement that is more powerful and type-safe, ensuring all possible values are handled and reducing the chance of runtime errors.
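
As a concrete illustration of that last answer, here is a minimal sketch using the Match module from the Effect library (the union and its fields are hypothetical, not the ones from the video):

```ts
import { Match, pipe } from "effect"

// A hypothetical discriminated union: the `type` field is the discriminant.
type PostDTO =
  | { type: "own"; editHistory: string[] }
  | { type: "other-user"; isFollowingAuthor: boolean }

// Exhaustively handle every variant; adding a third variant to PostDTO
// without handling it here becomes a compile-time error.
const describePost = (post: PostDTO): string =>
  pipe(
    Match.value(post),
    Match.when({ type: "own" }, (p) => `your post, edited ${p.editHistory.length} times`),
    Match.when({ type: "other-user" }, (p) =>
      p.isFollowingAuthor ? "a post by someone you follow" : "a post by someone else"
    ),
    Match.exhaustive
  )
```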

Outlines

00:00

πŸ“š Data Transformation with DTOs in API and Frontend Development

This paragraph discusses the importance of Data Transfer Objects (DTOs) in both backend and frontend development. DTOs are instrumental in transforming raw or sensitive data into a structured format that can be safely consumed. The speaker argues that DTOs are a cornerstone of well-designed APIs and are equally vital for frontend development, particularly for transforming API data into a usable format for components. The paragraph also touches on common mistakes made with state management in components and the benefits of using DTOs to avoid redundancy and improve code readability and maintainability.

05:03

πŸ”§ Utilizing DTOs for Data Normalization and Type Safety

The speaker elaborates on the use of DTOs for data normalization and ensuring type safety in applications. They describe how DTOs can help in transforming data from external sources, such as APIs, into a format that is more suitable for frontend consumption. The paragraph emphasizes the importance of data validation at runtime to quickly catch bugs and the use of nominal types or branded types to prevent errors related to data type mismatches. It also explains how DTOs can simplify the process of handling complex data structures and conditional logic in client-side applications.

10:05

πŸ›  Implementing Discriminated Unions and Type Guards with DTOs

This section delves into the technical implementation of DTOs, particularly the use of discriminated unions for handling complex data structures. The speaker introduces the concept of branded types for distinguishing between different types of IDs and the use of type guards to ensure that the correct data types are being manipulated. They provide an example of transforming raw post data into a DTO format, including handling optional fields like edit history and author information, and demonstrate how to use pattern matching to simplify conditional logic based on data types.

15:05

πŸ“ Advanced Data Transformation Techniques with Options and Arrays

The paragraph explores advanced data transformation techniques using the Option and Array modules from the Effect library, which handle nullable values elegantly. The speaker demonstrates how to streamline data transformation by using functions like 'fromNullable', 'getOrElse', and 'Option.map' to create clean and readable code. They show practical examples of transforming data structures, such as comments and edit histories, into a normalized format that is easy to consume in the application's components.

20:08

πŸ”„ Applying DTOs for Caching and Mutations in Frontend Applications

The final paragraph wraps up the discussion by emphasizing the importance of using DTOs not only for incoming data transformation but also for manipulating the cache in frontend applications. The speaker explains how to ensure that the server sends data in a format that can be used to update the cache accurately after mutations, such as adding a post. They conclude by highlighting the need for consistency in using DTOs throughout the application for both data input and cache manipulation to maintain data integrity and simplify development.

πŸ“’ Conclusion and Call to Action

In conclusion, the speaker summarizes the video's content and encourages viewers to subscribe and like the video for more similar content. They look forward to connecting with the audience in the next video, indicating the end of the current discussion on DTOs in API and frontend development.

Keywords

πŸ’‘Data Transfer Objects (DTOs)

DTOs are a design pattern used to transfer data between processes, such as when data is sent from a server to a client in a web application. In the video, DTOs are described as essential in both backend and frontend development for transforming raw or sensitive data into a structured format that can be safely consumed. The script emphasizes DTOs as a cornerstone of a well-designed API, illustrating their use in transforming data for frontend consumption and improving maintainability.
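
To make the idea concrete, here is a minimal, framework-free sketch of a frontend DTO: a plain function that maps a hypothetical raw API shape onto the structure the components actually want (all names here are illustrative, not the ones from the video):

```ts
// Hypothetical raw shape as the API returns it.
interface RawUser {
  id: string
  user_name?: string | null   // nullish and snake_case, awkward to consume
  created_at: string          // ISO timestamp as a string
}

// The shape the UI actually wants to work with.
interface UserDTO {
  id: string
  username: string | null     // normalized: undefined collapsed to null
  createdAt: Date
}

// The DTO is just a transformation applied at the data-access boundary.
const toUserDTO = (raw: RawUser): UserDTO => ({
  id: raw.id,
  username: raw.user_name ?? null,
  createdAt: new Date(raw.created_at),
})
```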

πŸ’‘API

An API, or Application Programming Interface, is a set of rules and protocols for building and interacting with software applications. The video discusses the importance of DTOs in API design, highlighting how they serve as a means to structure and safely transfer data. The script also mentions the role of DTOs in abstracting the data layer from the frontend, ensuring that the data received from APIs is in a usable format.

πŸ’‘Frontend Development

Frontend development refers to the process of creating the user interface and client-side functionality of a web application. The video script discusses the importance of DTOs in frontend development, particularly in transforming the raw data received from APIs into a format that is more sensible and usable for the client-side application. An example is given where DTOs are used to avoid redundancy and improve the maintainability of the frontend code.

πŸ’‘State Derivation

State derivation is the process of extracting and transforming data to be used within a component's state in a frontend application. The script warns against deriving state within components excessively, as it can lead to code duplication and maintenance issues. Instead, the use of DTOs is advocated to minimize such state derivation and to streamline data handling across components.

πŸ’‘Data Validation

Data validation is the process of checking data for correctness and completeness before it is used in an application. The video emphasizes the importance of validating all external data at runtime, including data from APIs, to ensure the robustness of the application. The script provides an example of how data should be parsed and validated when using DTOs, illustrating the role of data validation in debugging and maintaining data integrity.
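
The summary doesn't name a specific validator; as a sketch with zod, validating external data (here, a value read back from localStorage, with an assumed key and schema) could look like this:

```ts
import { z } from "zod"

// Assumed schema for a value we previously stored; localStorage contents are
// external data too and can drift or be edited by anything.
const ThemePreference = z.object({
  theme: z.enum(["light", "dark"]),
  fontScale: z.number().min(0.5).max(3),
})

export const readThemePreference = () => {
  const raw = localStorage.getItem("theme-preference") // hypothetical key
  if (raw === null) return null
  try {
    // safeParse reports schema problems instead of throwing, so we can log and fall back.
    const result = ThemePreference.safeParse(JSON.parse(raw))
    if (result.success) return result.data
    console.warn("Stored theme preference failed validation", result.error)
  } catch {
    console.warn("Stored theme preference is not valid JSON")
  }
  return null
}
```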

πŸ’‘TypeScript

TypeScript is a statically typed superset of JavaScript that adds optional types, classes, and interfaces to the language. The video script mentions TypeScript in the context of using DTOs to create nominal types, which help prevent bugs by ensuring that data types are used correctly throughout the application. The script also discusses the use of TypeScript features like discriminated unions and options to handle nullable values elegantly.

πŸ’‘GraphQL

GraphQL is a query language for APIs and a runtime for executing those queries against data sources. The video script mentions GraphQL in the context of API design, discussing how DTOs can be used to normalize and transform data received from GraphQL queries. The script points out common issues with GraphQL responses, such as single elements represented as arrays, and how DTOs can address these issues.

πŸ’‘Normalization

Normalization in data handling refers to the process of organizing data to reduce redundancy and improve efficiency. The video script discusses the importance of normalizing data, especially when dealing with complex or nested data structures from APIs. The script provides examples of normalizing data within DTOs to simplify the data structure and make it more manageable for frontend applications.

πŸ’‘Discriminated Unions

Discriminated unions are a type in TypeScript that allow for a value to be one of several types, but only one at a time, identified by a discriminant property. The video script explains how discriminated unions can be used in DTOs to handle different data scenarios, such as when certain data fields should only be present under specific conditions. The script demonstrates the use of discriminated unions to create a more robust and flexible data handling strategy.

πŸ’‘Options

In the context of the video, 'options' refers to a way of handling nullable values in TypeScript, inspired by languages like Rust. The script discusses the use of options to represent the presence or absence of a value, making the code more readable and reducing the mental overhead associated with handling nullable values. The script provides examples of using options in DTOs to elegantly handle nullable data and improve code maintainability.
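
For example, using Option from the Effect library, normalizing a nullish field might look like this sketch:

```ts
import { Option, pipe } from "effect"

// `username` may be string | null | undefined coming from the API.
// Normalize it to a trimmed string, or plain null when absent.
const normalizeUsername = (username: string | null | undefined): string | null =>
  pipe(
    Option.fromNullable(username),      // Some(username) when present, None otherwise
    Option.map((name) => name.trim()),  // only runs when a value is present
    Option.getOrNull                    // collapse the absent case back to null
  )
```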

Highlights

Data Transfer Objects (DTOs) are essential in both back-end and front-end development for transforming data into a safe and usable format.

DTOs are a cornerstone of well-designed APIs, ensuring data is structured for safe consumption.

In front-end development, DTOs transform raw data from APIs into a sensible format, despite potential redundancy with back-end DTOs.

Minimizing state derivation within components is crucial for maintainability and readability.

Creating reusable functions or custom hooks for state derivation can lead to hundreds of functions scattered throughout a project.

A data access directory is recommended for organizing API logic and query policies.

Validating data from external sources at runtime is mandatory for quick debugging and ensuring data integrity.

Branded types or nominal types in DTOs help prevent bugs by distinguishing between similar underlying types.

DTOs should remap API data to better suit the client's perspective and user interaction.

Discriminated unions in DTOs help manage complex data structures and conditional logic.

Normalization of data, such as converting 'nullish' values to 'null', simplifies data handling in DTOs.

Using options in TypeScript can make code more elegant and readable, especially for nullable values.

The power of options in TypeScript is demonstrated through more complex examples, showing how Option avoids deeply nested ternary operations.

DTOs should be used consistently throughout an application for both incoming data transformation and cache manipulation.

The importance of using DTOs for mutations and cache updates is emphasized for data consistency.

A comprehensive example demonstrates the transformation of raw posts into a structured DTO format for easy consumption.

The video concludes with the recommendation to subscribe for more content on DTOs and front-end development.

Transcripts

00:00

DTOs, or Data Transfer Objects, are essential in backend development. They serve as a means to transform potentially raw or sensitive data into a structure that can be safely consumed; I would say they're a cornerstone of a well-designed API. But I would go as far as to argue that they are equally essential to frontend development. Not in terms of security or anything like that, of course, since everything in the frontend is ultimately accessible to anyone, but in transforming the raw data that you receive from these APIs into a sensible and usable format. Now, this is a little bit ironic, since the backend might already employ a DTO. So are we slapping a DTO on top of another DTO? No, that's not how it works. I've seen many, many projects make the mistake of deriving all possible state within their components. Now, is deriving state within a component wrong? I'm not saying that. I'm saying that you can minimize it if you start using DTOs.

01:11

Take this for example: we have this query right here, we use it in a component, and we need to derive some state in the header. So we pass the query data over to the header, and the header transforms the data and derives state from it. But now you realize that you need that same data within another deeply nested component. What do you do in that case? Well, simple: you create either a reusable function or a custom hook. But then you start doing that for every single piece of state you need to derive, so now you're creating hundreds of functions and spreading them all over the place, and you need to be constantly recalling which function does what and which function you need where. That just hurts the legibility and maintainability of your project, so please don't do that. Of course there are some cases where it's unavoidable, but generally speaking you should always be using DTOs.

02:14

So here I have a simple example. I have a wrapper for our useQuery hook, which you should always be doing if you're using something like TanStack Query. What I like to do is create a data-access directory, and here I hold everything that is the layer for the API: all of the retry policies, the logic for the queries, the enabled flags, whatever. You should always be abstracting this anyway. Here I can pass in some options, the query variables, then I pass them over to the query key; if you've used TanStack Query you would know why this is very important. Then here in the query function I retrieve the query variables and pass them over to the fetcher. Then all we do is try to parse the body, expecting JSON, and then we decode the data. Now, this step is mandatory: all data from any type of external source should always be validated at runtime, and there are no excuses for this. Even if this is an autogenerated OpenAPI spec or introspected with GraphQL codegen, you should always validate the data. Why? Because it lets you debug really fast. If you have logging in place with your runtime validators it is very easy to catch any kind of bug: is your schema wrong, are your types wrong, or is the server failing in certain scenarios? There's a plethora of different cases where, if you validate from the very beginning and add logging on top of that, you will make your life way easier. So please, at all times, validate any external data, whether that's from local storage, session storage, your API, or a third-party library; in short, anything that is external. Anyway, here we return the decoded posts and call it a day. Now we can start consuming this query, we can make all kinds of derivations within the components, and if we want to manipulate the cache, well, we're going to be mutating this structure.

04:29

Now let's come over to the schema. Here in the post response schema we get back an object that contains an ID, the content, and then the author. The author is an object with an ID and a username, but notice how the username is nullish, meaning it can be undefined or null, which makes it incredibly tedious to work with; this is something you should normalize to just null. Then the top comment: this is an object that contains an ID, the content, and the author type, which is an enum of user, moderator, or admin; then the author ID, which again is nullish, and the top comment itself is nullish, so we need to normalize these two. Then we have the latest comment, which is an array of objects, and each object has the same structure. However, notice how we're constraining this to just one element if it is present, because this can be nullish again. Now, the reason I used one as an example is that if you have used GraphQL with something autogenerated, say Hasura or something of the like, or an API that is badly designed, which is very, very common, a single element is often represented as a tuple. So again, this is something we need to normalize: we need to extract the object if it is present. It is very tedious to work with an array that is in fact a tuple, and a tuple of one element, which makes no sense whatsoever. And finally we have these two fields, which are very interesting. Why? Because the edit history should only be present if this post was created by the current session; if the currently logged-in user didn't create this post, then we shouldn't be able to access the edit history at all. This is a field that should be completely absent; in other words, it should be represented by never in TypeScript. And the same goes for isFollowingAuthor: if you created the post, it doesn't make sense that you can follow yourself, so again this should be hidden. For that we can have a discriminator. Notice how this can become more and more complex, and that's not something you can easily represent in an API; this is something that ultimately the client needs to do. So for this, again, we need a discriminator, and these two fields will exist based on that discriminator, and that's it. Now, obviously this is a very contrived example; an actual API would be way, way bigger and would have way more business logic. But with this we can see how messy it can become: if you want to derive all of these nuances in the components, it is going to be extremely tedious.
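
For orientation, here is an approximate reconstruction of the raw response shape being described; the field names are inferred from the narration and should be treated as assumptions:

```ts
// Approximate raw post response as described: nullish fields, a single-element
// array standing in for "at most one comment", and fields the client must gate.
type RawAuthorType = "user" | "moderator" | "admin"

interface RawComment {
  id: string
  content: string
  authorType: RawAuthorType
  authorId?: string | null                           // nullish: needs normalizing to plain null
}

interface RawPost {
  id: string
  content: string
  author: { id: string; username?: string | null }   // username is nullish
  topComment?: RawComment | null                      // nullish object
  latestComments?: RawComment[] | null                // tuple-like array of at most one element
  editHistory?: { editedAt: string; content: string }[] | null // only meaningful for own posts
  isFollowingAuthor?: boolean | null                   // meaningless for your own posts
}
```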

07:17

So now let's take a look at the solution: the DTO. It starts off with branded types, or in other words nominal types, meaning types that extend from a base type, in this case from string, but we add something to these types to be able to set them apart. What I mean by this: if you have two strings and you compare two strings, that is completely valid. But if we have nominal types such as these, and we were to compare, let me show you, a comment ID with a post ID, even though they both share the same underlying type, they are considered to be different, and as we can see we get a type error. Now you might be wondering why we are doing this; it seems overkill for the frontend. Well, in the frontend you need to do a lot of mutations, you need to pass data around a lot, you need to manipulate the cache a lot, and it is very easy to make this mistake: if you want to update a record in the cache, or do an optimistic update or a settled update, and you end up passing in the wrong ID, you have now introduced a bug. So if you can minimize all types of bugs at the type level, you should do it. This is why I'm an advocate for nominal types. Anyway, I have a whole video dedicated to this, so make sure to check it out.
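
A branded (nominal) type can be as small as this plain-TypeScript sketch; Effect also ships a Brand module, but the idea is the same:

```ts
// Two IDs that share the same underlying type (string) but must never be mixed up.
type PostId = string & { readonly __brand: "PostId" }
type CommentId = string & { readonly __brand: "CommentId" }

// Constructors are the only place a plain string becomes a branded ID.
const PostId = (id: string): PostId => id as PostId
const CommentId = (id: string): CommentId => id as CommentId

const updatePostInCache = (id: PostId): void => {
  // ...update the cached record for this post
}

const postId = PostId("p_1")
const commentId = CommentId("c_1")

updatePostInCache(postId)       // fine
// updatePostInCache(commentId) // compile-time error: a CommentId is not a PostId
// postId === commentId         // also a type error: the types have no overlap
```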

08:43

Now we have the comment author type: we have a regular user, a moderator, and an administrator, so this is essentially a remapping of the API's enum. Now you might be wondering why we are remapping this in the DTO; shouldn't it be as close to one-to-one with the API as possible? I would say no. The API is designed for data; everything in an API is centered around the processing of data. Meanwhile, a client is designed to consume the data, so what the user sees is perceived completely differently from the underlying architecture of your application. What I always like to do is name things as if the user were the one interacting with the code; if you can center it as you being the user, it makes your life easier. So this is just an example, but you get the gist of it. Obviously these are just placeholders, so don't focus too much on the ideas behind these names; this is just so you get my point, a demonstration.

09:49

Now, I told you we're going to be using discriminated unions, so I created the base type. Here, nothing too fancy: I just normalized things, so it is now `| null` instead of `| undefined | null`, I added in the nominal types, I remapped the author type, and that's pretty much it. As for the discriminators, I added these two types, OwnPostDTO and OtherUserPostDTO. For the own post we have the type "own" and the edit history, and then we have the OtherUserPostDTO with isFollowingAuthor as a boolean, plus an extra property like canReport, assuming this would come from the configuration of the user, for example when you query a user's permissions; this is just an example placeholder. And then the PostDTO is the union of these two, intersected with the base PostDTO.
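
Put together, the DTO types being described might look roughly like this; the names and exact fields are reconstructions, not the video's code:

```ts
// Branded IDs and the remapped author type (see the earlier sketches); all
// names here are reconstructions.
type PostId = string & { readonly __brand: "PostId" }
type UserId = string & { readonly __brand: "UserId" }
type CommentId = string & { readonly __brand: "CommentId" }
type CommentAuthorType = "regular-user" | "moderator" | "administrator"

// Base shape shared by every post, already normalized: nullish collapsed to
// null, IDs branded, the author type remapped for the client.
interface BasePostDTO {
  id: PostId
  content: string
  author: { id: UserId; username: string | null }
  topComment: {
    id: CommentId
    content: string
    authorType: CommentAuthorType
    authorId: UserId | null
    commentedBySelf: boolean
  } | null
}

// Discriminated by `type`: each field exists only where it makes sense.
interface OwnPostDTO {
  type: "own"
  editHistory: { editedAt: Date; content: string }[]
}

interface OtherUserPostDTO {
  type: "other-user"
  isFollowingAuthor: boolean
  canReport: boolean
}

type PostDTO = BasePostDTO & (OwnPostDTO | OtherUserPostDTO)
```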

10:50

Now I created getAuthorType: we take in an author type, which again is this union, and all we do is use Match. If you have used something like ts-pattern this will look very familiar, and if you have no idea what this is, it's a switch statement on steroids, and it comes from Effect; Effect brings Match, it brings Array, Option, and a lot of other things. For this we say we're going to match on the author type, then we restrict the return type here, so we must return this, and then we have a when for each case: user is remapped to regular user, then the moderator and administrator cases. And then we say exhaustive, so it ensures that we're handling all values here; if we were to have a fourth one and we did not handle that case, we would get a type error. So again, this is great. Obviously this is very basic, we're just remapping them like this, but usually you would have some kind of logic here, so maybe for a user there's more to do than just remapping it to regular user.
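
As a sketch of that remapping with Effect's Match (the video uses throwaway placeholder names for the remapped values; simplified ones are used here):

```ts
import { Match, pipe } from "effect"

// Raw enum as the API sends it, and the client-facing remapping.
type RawAuthorType = "user" | "moderator" | "admin"
type CommentAuthorType = "regular-user" | "moderator" | "administrator"

// Remap the API vocabulary to the client's. Match.exhaustive turns a new,
// unhandled raw value into a compile-time error instead of a silent bug.
const getAuthorType = (raw: RawAuthorType): CommentAuthorType =>
  pipe(
    Match.value(raw),
    Match.when("user", () => "regular-user" as const),
    Match.when("moderator", () => "moderator" as const),
    Match.when("admin", () => "administrator" as const),
    Match.exhaustive
  )
```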

11:59

Then we have the raw-post-to-PostDTO function, the actual function that transforms everything. We take in a raw post, then we get the current user ID, which comes from a useSessionStore hook; again, nominal types make sure we're not comparing the wrong data types here. We retrieve that, and then we can determine whether this post was created by the current user, so now we're deriving this state here. Then we return: we apply the nominal type, we have the content here, and this is great because you can trim the content, you can run a regex, you can do whatever you want here, so everything is normalized via the DTO. Then you have the author, again with nominal types, and then notice this: I'm using a pipe, and then I'm using Option. Now, Options don't really make much sense in TypeScript; they make perfect sense in languages like Rust, but if you're going to be using Options to pass data around in TypeScript, I would say you're doing it wrong. If you're not familiar with Options, an Option is simply a way to indicate either the absence or the presence of a value. Now you might be saying, well, that's null, right? Null is the absence of a value, and if it's not null, it is present. And you're completely right; that's why I say it doesn't make much sense in TypeScript, because you can narrow down the type by simply checking against null. But the reason I'm using Option in this case is that it can make your code very elegant: you're simply converting things to Options. So in this case we can say fromNullable; that way the Some, the presence of the value, is assigned whenever the username is not null or undefined, as simple as that. And then I say getOrNull, so get the Some value if there is one, and if there isn't, simply fall back to null. Now, obviously this is very basic; you could say, just use a ternary for this, why are you using Option? In my opinion this is way more readable and requires less mental overhead. But let me show you more complex examples so that you can see the power of Option.

14:19

Then we have the top comment. Again we pipe this: we say fromNullable, so we assign the top comment to Some if it is present, and then we can say Option.map. This is great because map lets you access the Some value when it exists. If I come here, as we can see, the comment is no longer nullish, it is present; again, map only executes if this results in a Some value. So here we can map over it like you would with an array: you take the input and you output something new derived from that value. Then here we can say the ID is the comment ID, we pass in the nominal type, then the content, then the author type here, which actually should use the function I showed you, getAuthorType. Then I have the author ID, so again I say fromNullable, I assign it to Some if it is present, and if it is present I map over it, I take the Some value, the author ID, and I implicitly convert it to the nominal type, and then I say getOrNull. See how clean this is? If you were to use a ternary for this, you can see right away how we'd have to nest ternaries: we'd have one for the top comment, which would look terrible, and then another nested ternary for the author ID. So this is streamlined with Option, where I rely solely on Option for nullable values and Match for conditional logic; this is what I would recommend. And then commentedBySelf, which is a property that I added: it is true if the current user ID is equal to the comment's author ID. As we can see, we're deriving this here, and now any component can simply check this boolean property and, say, render a badge or an edit button, whatever; there's no need for each component to derive it. And then, after all of this, I say getOrNull: if it is None we get null, and if it is Some we get the value.

16:28

Then we have the latest comment. Here again I pipe this through, and I say fromIterable, so I use Array; and this Array is not the native one, it comes from Effect, which works really nicely with pipe and with Option. So here I say fromIterable, which is pretty much the same thing as Array.from, and then we say Array.head, and head gets the first element of an array, or None if the array is empty, so this in itself returns an Option for us. That means all we need to do is say Option.map, and now we're getting the comment if it exists. As we can see, we get the comment here and we do everything again: we say fromNullable, we map over it, apply the nominal type, getOrNull. This is very, very readable in my opinion. And then again commentedBySelf, so we apply the same logic, although we could extract this into its own helper function like we did with getAuthorType; for the sake of demonstration purposes we can leave it as is.
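
In isolation, that pattern might look like this sketch, using Effect's Array and Option modules with assumed field names:

```ts
import { Array as Arr, Option, pipe } from "effect"

interface RawComment {
  id: string
  content: string
}

// The API models "at most one latest comment" as an array, which is tedious to
// consume. Extract the single element (if any) and normalize to plain null.
const latestComment = (comments: Iterable<RawComment> | null | undefined) =>
  pipe(
    Arr.fromIterable(comments ?? []),   // tolerate null/undefined input
    Arr.head,                           // Option<RawComment>: Some(first) or None
    Option.map((comment) => ({
      id: comment.id,
      content: comment.content.trim(),
    })),
    Option.getOrNull                    // back to a plain nullable value for the DTO
  )
```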

17:34

Then we have the discriminators. If isOwnPost, then we spread over this object where we pass in the type as "own", as const, since we need to narrow it down. Then we have the edit history, and again with fromNullable we check the edit history; if it exists, then we can map over each element, so we say Array.map, and this implicitly takes in the value from Option.map, and then we get each edit. As we can see, an edit is just this object, and we parse it into a new Date, assuming this is an ISO string, and then the content is just edit.content. In case there's nothing, we default back to an empty array with getOrElse, and this must satisfy OwnPostDTO. And then we have the other one, OtherUserPostDTO, and we do the same: isFollowingAuthor with getOrElse false, so again a fallback, and then canReport, which is what I was talking about earlier; this could depend on your business logic, the permissions of the user, whatever, so it's just a placeholder for demonstration purposes. And this is pretty much the whole DTO.
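
The discriminator branch being described might look roughly like this sketch, again with Effect's Option and Array and assumed names:

```ts
import { Array as Arr, Option, pipe } from "effect"

interface RawPost {
  editHistory?: { editedAt: string; content: string }[] | null
  isFollowingAuthor?: boolean | null
}

type PostVariant =
  | { type: "own"; editHistory: { editedAt: Date; content: string }[] }
  | { type: "other-user"; isFollowingAuthor: boolean; canReport: boolean }

const toVariant = (raw: RawPost, isOwnPost: boolean): PostVariant =>
  isOwnPost
    ? {
        type: "own" as const,
        // Only own posts carry an edit history; a missing history becomes [].
        editHistory: pipe(
          Option.fromNullable(raw.editHistory),
          Option.map(Arr.map((edit: { editedAt: string; content: string }) => ({
            editedAt: new Date(edit.editedAt), // assume an ISO string
            content: edit.content,
          }))),
          Option.getOrElse(() => [])
        ),
      }
    : {
        type: "other-user" as const,
        isFollowingAuthor: pipe(
          Option.fromNullable(raw.isFollowingAuthor),
          Option.getOrElse(() => false)
        ),
        canReport: true, // placeholder: would come from the user's permissions
      }
```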

18:49

So now, to use it, all you need to do is come here and say return, and then you could call this rawPosts, since this is actually the raw post, or you could say postDTO; you can come up with whatever name, I can't come up with a good one at this instant, so I'm going to keep a very basic name, which isn't great whatsoever. And then this would take the decoded posts. I import it, and then I can come here to this type and replace it with the PostDTO, and this should be PostDTO. So now, when you consume this query, you're going to get this data already derived. Now, what about mutations? What if you need to add a post and then get the data back; how do you update the cache accordingly? Well, all you need to do is obviously make sure that the server sends back the same data, so that you can replace the optimistic post, or at least partial data, and then you can update the data in the cache. All you need to do is call this function, postDTO, and store the cache data with this structure. That means that all points in your application that need to communicate with or manipulate the cache must use these DTOs. So not only does everything incoming need to be transformed using a DTO; any time you manipulate the cache, you need to parse the data and apply the DTO there as well.
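
With TanStack Query, a cache update after a successful mutation might look like this sketch; toPostDTO, createPost, and the query key are stand-ins, not the video's code:

```ts
import { useMutation, useQueryClient } from "@tanstack/react-query"

// Stand-ins for the real types and functions described in the video.
type PostDTO = { id: string } & Record<string, unknown>
declare function toPostDTO(raw: unknown): PostDTO               // the DTO transform
declare function createPost(content: string): Promise<unknown> // hypothetical API call

export const useAddPost = () => {
  const queryClient = useQueryClient()
  return useMutation({
    mutationFn: createPost,
    onSuccess: (rawPostFromServer) => {
      // Run the server response through the same DTO before it touches the cache,
      // so cached data always has the shape every consumer expects.
      const dto = toPostDTO(rawPostFromServer)
      queryClient.setQueryData<PostDTO[]>(["posts"], (old) =>
        old ? [dto, ...old] : [dto]
      )
    },
  })
}
```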

20:24

Anyway, this wraps up the video. If you want to see more content like this, make sure to subscribe and like the video. I'll see you in the next one.


Related Tags
Data Transformation, API Design, Frontend Development, DTOs, Data Validation, TypeScript, Software Architecture, Data Modeling, Coding Best Practices, Web Development