New AI Video That Does Everything!
Summary
TLDR: The video unveils the latest advancements in AI-powered creative tools and platforms. It introduces LTX Studio, a holistic platform that addresses the challenges of creating AI-generated films and videos, offering text-to-video generation, shot editing, dialogue and sound integration, and auto-editing. Updates to Pika and Runway, enabling lip-syncing and Motion Brush enhancements, are also discussed. Notably, Google's Genie, a text-to-2D game generator trained on game footage, showcases the potential of unsupervised learning for AI-powered content creation.
Takeaways
- 👍 LTX Studio is a newly announced AI video platform that offers features like text-to-video, shot editing, camera controls, music/dialogue/sound-effect integration, video inpainting, storyboarding, casting, and auto-editing.
- 🎥 Pika AI has added a lip-sync feature that can generate 3-4 second audio clips or sync to provided MP3 files for characters in the video.
- ✏️ Runway AI has introduced a new quality-of-life feature in Motion Brush that allows for auto-selecting and controlling specific areas of an image.
- 🎮 Google has released Genie, a text-to-2D game generator that can create simple platformer games from text or images, trained on 200,000 hours of video game footage.
- 🔮 Runway's CEO hinted that their AI outputs may soon be better than the highly impressive Sora AI.
- 🌟 The video emphasizes the importance of continuously creating and using available tools instead of waiting for the perfect one.
- 🤖 The video showcases various examples and use cases of the new features from LTX Studio, Pika AI, and Runway AI.
- 🔄 LTX Studio allows for iterative polishing and rearranging of shots while creating AI films.
- 🎬 LTX Studio can generate entire storylines and scripts, though users can still write their own.
- ⏱️ The video mentions that LTX Studio is set to release by the end of March, with an opportunity for early access through a provided link.
Q & A
What is LTX Studio and who developed it?
-LTX Studio is a new AI video platform developed by Lightricks. It's designed as a holistic platform that solves various challenges in creating longer-form AI films or videos by integrating external requirements under one roof.
What makes LTX Studio unique compared to other platforms?
-LTX Studio distinguishes itself by offering comprehensive tools for AI film and video creation, including camera controls, editable shots, and on-platform music, dialogue, and sound effects, making it a one-stop solution.
What are some of the key features of LTX Studio?
-Key features include text-to-video conversion, camera control, storyboard view, consistent character casting, lighting adjustments, title card generation, and auto editing for dialogue, sound effects, and music.
How can users access LTX Studio early?
-Users can gain early access to LTX Studio by using a special link provided by the video's presenter, who received early access as a Lightricks partner.
What does the updated Pika platform offer?
-The updated Pika platform introduced a new feature that allows adding lip sync to video outputs, with audio generations limited to 3 to 4 seconds but with a workaround for extending this.
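The splicing workaround described above (run two generations, then join the audio in editing software) can be sketched in plain Python with the standard-library `wave` module. This is a hypothetical illustration: the tone-generation helper and file names are invented stand-ins for exported Pika clips, not part of any Pika API.

```python
import wave
import struct
import math

def write_tone(path, freq_hz, seconds, rate=16000):
    """Write a mono 16-bit sine tone as a stand-in for one generated clip."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        frames = b"".join(
            struct.pack("<h", int(20000 * math.sin(2 * math.pi * freq_hz * i / rate)))
            for i in range(int(rate * seconds))
        )
        w.writeframes(frames)

def splice_wavs(parts, out_path):
    """Concatenate WAV files that share the same format into one longer file."""
    data = []
    params = None
    for p in parts:
        with wave.open(p, "rb") as w:
            if params is None:
                params = w.getparams()
            data.append(w.readframes(w.getnframes()))
    with wave.open(out_path, "wb") as w:
        w.setparams(params)  # nframes is corrected automatically on close
        for chunk in data:
            w.writeframes(chunk)

# Two ~3-second "generations" spliced into one ~6-second track.
write_tone("gen1.wav", 440, 3.0)
write_tone("gen2.wav", 330, 3.0)
splice_wavs(["gen1.wav", "gen2.wav"], "combined.wav")
```

In practice you would do the same join in any editor or with a tool like ffmpeg; the point is only that two capped generations can be laid end to end.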
How does the new motion brush feature from Runway improve the user experience?
-Runway's new motion brush feature automates the selection of certain image areas for control, significantly speeding up the editing process and making it easier to apply effects and adjustments.
What is Genie by Google, and what can it do?
-Genie is a text-to-2D game generator released by Google, trained on 200,000 hours of video game footage. It can create simple games from text prompts or images, focusing on old-school NES-style platformer games.
How does the LTX Studio manage camera movements in videos?
-LTX Studio allows users to control camera movements including horizontal, vertical, pan, roll, and zoom, providing greater flexibility and creativity in video production.
Can LTX Studio generate an entire script for a film?
-Yes, LTX Studio can write and generate an entire storyline and script for you, but it also allows users the flexibility to write their own storylines if they prefer.
What was the response of Runway's CEO to the comparison with Sora's output quality?
-The CEO of Runway, Cristobal Valenzuela, responded with optimism, stating that Runway's outputs would look 'better' than Sora's, suggesting confidence in future improvements to Runway's technology.
Outlines
🎬 LTX Studio: A Groundbreaking AI Video Platform
The video introduces LTX Studio, a new AI video platform that provides a comprehensive solution for creating long-form AI films or videos. It is a holistic platform that brings various external requirements under one roof. The video showcases LTX Studio's capabilities, including text-to-video generation, shot-to-shot panels, editable prompts, timelines for music, dialogue, and sound effects, video inpainting, storyboarding, casting, title card generation, camera movement controls, auto-editing, and script generation. It highlights the platform's ability to create full stories efficiently. The video also notes that, while impressive, LTX Studio serves a different purpose than Sora, which generates individual shots, and emphasizes the importance of continually creating with the tools that are available.
👄 Pika's Lip Sync and Runway's Motion Brush Update
Pika has updated their platform with a new feature that adds lip sync to output videos. Users can select from various AI voices or upload their own audio files for lip-syncing. The video showcases examples of Pika's lip sync feature with different characters and discusses a potential workaround for extending audio durations. Additionally, Runway has introduced a quality-of-life update to their Motion Brush module, allowing auto-selection of image areas for precise control. The video illustrates the update by recreating the iconic dolly zoom shot from the film Jaws and provides a practical use-case example.
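The in-camera trick behind the dolly zoom has simple optics: under a thin-lens approximation, keeping the subject the same size in frame while the camera moves means the focal length must scale linearly with subject distance. A minimal sketch (the 50 mm lens and the 4 m to 2 m dolly are illustrative numbers, not taken from the video):

```python
def matched_focal_length(f0_mm, d0_m, d_m):
    """Focal length that keeps the subject the same size on the sensor
    when the camera is d_m metres from the subject instead of d0_m."""
    return f0_mm * (d_m / d0_m)

# Start 4 m from the subject on a 50 mm lens, then dolly in to 2 m:
fs = [matched_focal_length(50.0, 4.0, d) for d in (4.0, 3.0, 2.0)]
# dollying in (distance shrinking) forces a zoom out (focal length shrinking),
# which is why the background appears to stretch while the subject stays put
```

This is exactly the dolly-in/zoom-out pairing described for the Jaws shot.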
🎮 Google's Genie: Text-to-2D Game Generator
Google has released Genie, a text-to-2D game generator. Genie can generate simple, old-school NES-style platformer games based on text prompts or images. While the games are limited in scope and run at a low frame rate, the underlying technology is remarkable. Genie was trained unsupervised on 200,000 hours of publicly available video game footage, enabling it to create entire game worlds. Although the video suggests that Genie itself may not be accessible to the public, the underlying research and technology are expected to be incorporated into future products and advancements.
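The "unsupervised on raw footage" idea can be made concrete with a toy: given only consecutive frames and no controller logs, infer which discrete latent action best explains each transition. This is a heavily simplified, hypothetical sketch, not Genie's architecture — Genie tokenizes video with learned networks, whereas here a "frame" is just a 1-D player position and the action codebook is fixed by hand.

```python
# Latent codes standing in for learned actions: left, idle, right.
ACTION_CODEBOOK = {0: -1.0, 1: 0.0, 2: 1.0}

def infer_latent_action(frame_t, frame_t1):
    """Pick the code whose predicted next frame is closest to the observed one."""
    return min(
        ACTION_CODEBOOK,
        key=lambda a: abs((frame_t + ACTION_CODEBOOK[a]) - frame_t1),
    )

# Unlabeled "gameplay footage": player positions over time.
trajectory = [0.0, 1.0, 2.0, 2.0, 1.0]
actions = [infer_latent_action(a, b) for a, b in zip(trajectory, trajectory[1:])]
# recovered sequence: right, right, idle, left
```

Once such latent actions are recovered from footage alone, a dynamics model conditioned on them can roll a playable world forward, which is the world-creation aspect the video highlights.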
Mindmap
Keywords
💡LTX Studio
💡Text to video
💡Camera controls
💡Editable shots
💡Storyboarding
💡Casting and inpainting
💡Auto editing
💡Script generation
💡Pika platform
💡Genie
Highlights
LTX Studio by Lightricks is a new holistic AI video platform that solves challenges in making longer-form AI films by bringing all external requirements under one roof.
LTX Studio offers text-to-video generation, shot-to-shot panel editing, music, dialogue, and sound effects integration, video inpainting, storyboarding, casting, title card generation, camera movement controls, and auto-editing.
LTX Studio can generate an entire storyline and script, but users can also write their own.
LTX Studio is different from Sora, which generates individual shots, while LTX Studio is a platform for creating full stories.
Pika has updated their platform with a new feature that allows lip-syncing to generated audio or user-provided MP3 files.
Runway has introduced a new quality-of-life feature in Motion Brush that allows for auto-selection of certain areas of an image for better control.
The new Runway feature enables creating cinematic effects like the dolly zoom shot from Jaws.
Runway's CEO hinted that their outputs may soon look better than Sora's.
Google has released Genie, a text-to-2D game generator that can create simple platformer games from text or images.
Genie was trained unsupervised on 200,000 hours of video game footage and can generate games, showcasing the potential for world-creation through AI.
The speaker encourages always creating and using available tools, rather than waiting for specific products.
LTX Studio is set to release by the end of March 2024, and users can sign up for early access using the provided link.
The speaker demonstrated the new Pika lip-syncing feature using different voices and an MP3 file.
The speaker showcased the new Runway Motion Brush feature by recreating the iconic dolly zoom shot from Jaws.
The speaker highlighted the unsupervised training approach used for Genie and its potential implications for AI world-creation.
Transcripts
so it looks like we have a new AI video
platform this one was just announced
today and it does a lot honestly like
all the things uh trust me you're going
to want to check this one out because we
haven't seen anything like it yet we've
also got some big updates to Pika and Gen
2 plus Google has a text to 2D game
generator which you know on its own is I
guess kind of cool but it's really when
you dig in that you see the
ramifications of what this really does
okay lots to cover let's dive in kicking
off we have LTX Studio by Lightricks
this is something that I've been
predicting for a while a holistic
platform that solves a lot of the
challenges that we face when we're
making longer form AI film or video uh
kind of bringing all of the external
requirements in Under One Roof as a
quick FYI I did get early access to LTX
Studio as a partner through Lightricks
uh stay tuned because I've got a way for
you guys to get Early Access as well see
I'm always looking out for you uh but I
think as we go through this video you'll
see exactly why I'm excited about it and
listen I know Sora we'll talk about that
in a minute but for now let's step
through this video the video kicks off
with some text to video examples
everything looks really good here uh I
did notice that in this shot the prompt
calls out track in toward villain uh yes
there is camera controls in LTX Studio
we'll talk about that more in just a
minute from there it moves into this
shot to shot panel where you know we get
three shots when we prompt futuristic
space drama but here is where things get
pretty interesting continues generating
and we end up with 12 shots now what's
interesting is that these are all
editable as well uh as seen in the next
example where uh they change the prompt
out to New York City courtroom drama and
yeah everything repopulates now what's
interesting is that if you actually
scrub through this you'll see that our
vaguely cyberpunk city here actually
becomes an east coast city and you know
our first shot uh kind of that back
angle of the character turns into uh is
that uh I guess like John Doe from Seven
I don't know I kind of want to see this
movie now now here is the really cool
part because as you can see here we now
have uh timelines for music dialogue and
sound effects and these all happen on
platform so there's no need to generate
externally and bring them in this is all
One-Stop shopping it's a quick FYI uh
this is actually not the UI you know
this is a Sizzle reel that tends to
happen I do have a look at the actual UI
coming up in just a minute we next get a
shot of video inpainting with this
shot of like bootleg Josh Hartnett and the
green car in the background where
obviously via text prompt we have now
changed the green car to a red car
storyboarding is another feature that I
think is super cool you know you can
basically get a bird's eye view of your
entire film and you can actually even
swap shots around and if you need to
insert shots uh you can do so as well
which is obviously hugely valuable when
you're iterating and polishing your film
if you're wondering about timing of your
shots yeah you can do that as well we'll
take a look at that in one second here
we've got a bit on casting it's you know
basically video inpainting but the
important thing to stress here is that
you're getting consistent characters we
can swap around lighting we can generate
some solid title cards for our film
everything here looks spelled correctly
so really this just comes down to user
input meaning all of my titles are going
to be misspelled we have camera movement
controls as well as you can see we have
controls for horizontal vertical our pan
our roll and our zoom and LTX studio
also has Auto editing this is likely to
time all of the dialogue uh sound
effects and music
just as a quick FYI the UI that we saw
in the sizzle reel is not what LTX
Studio actually looks like uh it's
fairly common for a launch video to
prioritize obviously these are the
features that are available not
necessarily like this is what the
platform looks like but they did send
this over to me so that I could show you
what the platform actually looks like
right now to be honest I actually prefer
this over sort of a flashy iPad type
look as the video closes out it does
mention script which wasn't highlighted
in the the sizzle reel but yes LTX
Studio can you know write and generate
an entire storyline in script for you
but importantly it doesn't have to you
can still write your own storyline
that's something that's kind of
important to me okay now for the
elephant in the room Sora uh yes Sora is
amazing but a we don't have it yet and B
this is doing something different Sora
generates individual shots whereas this
is a platform where you can create full
stories in my last video I talked to
Nico from C who had some really great
advice for anyone that wanted to make an
AI film or really do anything creative
always be creating it's not about one
idea it's not about one product AI or
whatever it doesn't matter you need to
always be making stuff it art is not
about I made one movie I'm good it's
about I like to make movies so I make
movies every day so don't wait for the
tools use the ones that are available to
you and make awesome stuff to which you
are probably wondering when does LTX
Studio actually release well I've got
good news not too long away end of March
but I've also got some good news you can
sign up for Early Access if you use the
link down below moving on Pika have
updated their platform with a new
feature that allows you to add lip sync
to your output videos audio Generations
are limited to about 3 to 4 seconds
although there is kind of a work around
that I'll show you in just one second so
uh let's give lip syncing a shot we're
going to take our old friend Daniela van
Denon dressed as a pirate we haven't
seen her in a while and drop her into
the prompt box uh from here you can just
hit this lip sync button and that will
provide you with a number of different
drop-down voices that you can try out so
uh we're going to try out Demi we type
in some text and before you know it we
got Daniela speaking please stop using
me in your videos it's getting silly in
general I kind of find the AI voices to
be well very AI voice sounding uh but
luckily you can actually add in your own
MP3 files as well and it will lip sync
to that so let's give that a shot so
taking some audio of the real Daniela
van Denon and popping it in uh take a
listen to that real
quick she's Dutch I have no idea what
she was saying there I will say that it
can also get a bit finicky depending on
the face detection for example uh with
this astronaut here um well I'll I'll
just play it I wasn't originally going
to get a brain transplant but then I
changed my mind I cracked me up so while
I did get a whopping six seconds out of
it uh obviously the face was kind of
paralyzed by the way two points to the
Spaceman for the dad joke but I did get
a pretty good Indiana Jones X never ever
the spot so I will say it's all
definitely a work in progress but it is
progress in the right direction uh
here's a couple of quick examples Dave
alova posted this one up show me the
money and Pika themselves posted this up uh
which kind of looks like a young John
Turturro in a live-action Ratatouille
remake that is a deep cut Dexter's
Laboratory reference one idea in terms
of extending out your lip sync audio is
that you can potentially run one
generation and then just slide this
whole thing over uh and then run a
second generation and then kind of
splice them together in some editing
software afterwards not going to lie
might be a bit of a pain but it is a
potential workaround at least for now
all in all keep up the great work Pika
moving on not to be outdone Runway have
introduced a new quality of life feature
to their motion brush uh mostly what
this allows for is to Auto Select
certain areas of your image uh so that
you can have control over them uh it is
definitely a quality of life thing it
just makes things a lot faster you can
always erase sections as well uh this is
all demonstrated by Rich Klein AI now as
soon as I saw that I figured that this
would make a pretty good excuse to try
out the dolly zoom shot from Jaws the Zolly
shot if you will it was also used in
Vertigo the way this effect was achieved
in camera is that the camera itself
would dolly in on Roy Scheider while at
the same time they would be zooming out
on the lens so it creates that sort of
weird parallaxing Vertigo-esque effect so
taking a screenshot and bringing it into
the Motion Brush module as you can
see we can just sort of you know grab
selections and just click and uh those
sections are now part of our brush
from there I just took brush one and
cranked the proximity up meanwhile I
just selected the entire background and
cranked the proximity in the opposite
direction running that got us this which
actually is not that bad I mean it yes
it it is weird and warpy and all of that
but I mean it's also one of the most
iconic film shots of all time so you are
judging at a very high bar still that is
a cinematic technique that you know does
work in Runway so you can feel free to
bring that into your own AI films in
terms of a more practical use case uh
for this new tool Nicolas Neubert
posted up a speedrun of him working with
it you can see him here quickly
selecting various tools using uh
proximity Ambience uh the horizontal and
vertical as well as camera movement on
it uh and then when he generates yeah
this looks very very good rounding
out the Runway news I don't know if you
guys caught this but uh someone asked
Cristóbal Valenzuela the CEO of Runway
if uh Runway's outputs would look
as good as Sora's anytime soon and his
response was better so whether that was
bravado or if we actually will be seeing
like Sora level outputs coming out of
runway in the next couple of months I
don't know it's all just very exciting a
lot of stuff is moving very quickly and
yeah it this couldn't be a better time
to be sharpening your skills with AI
video rounding out Google have released
well Google released Genie which is a
text to 2D game generator so yeah
game-wise it is very much in the old
school NES platformer style um you are
not going to be text prompting for
Half-Life 3 here but here is what's kind
of cool about it is that Genie was
actually trained unsupervised on 200,000
hours of publicly available video game
footage and it can generate games either
off of text or via an image the games
are obviously very simple and apparently
only running at one frame a second so
it's not like it would be a very
enjoyable experience to play anyways but
the idea behind it is pretty remarkable
in that what Genie's doing is actually
world creating again Google being Google
I don't think that we'll ever get a
chance to play with Genie but I think
the important part here is the
underlying research which I think will
appear in some way in the future oh and
if you did happen to miss the last video
I did where I gave a talk about AI to
Hollywood I do invite you to check that
out that video was a lot of fun it's
coming up next I thank you for watching
my name is
Tim