
We Tested Google Veo and Runway to Create This AI Film. It Was Wild. | WSJ

Read on Jun 3, 2025 | Created on May 30, 2025
Video by The Wall Street Journal | View Original | Source: YouTube
Tags: ai video Website

Note: These are automated summaries imported from my Readwise Reader account.
View Article

Summary

Summarized with ChatGPT

The Wall Street Journal created a film using AI tools like Google Veo and Runway, blending visuals and audio with human input. The project showcased the potential of AI in filmmaking, highlighting both its capabilities and limitations. They emphasized that while AI can assist, creativity and originality still come from humans.

Key Takeaways:

  1. Explore AI tools for creative projects, but rely on human creativity for ideas.
  2. Understand that AI can improve efficiency but is not a complete substitute for human input.
  3. Experiment with different AI platforms to find the best fit for your needs.

Highlights from Article

We used Midjourney to design our star, Optimax 5000. It took a few iterations. To create me, we gathered some photos, and the neighborhood was made in Runway.

Then we took these images of the characters and the background, put them into Runway’s References tool, and wrote a prompt describing the scene we wanted. Out came the first frame of the scene. Then we fed that image into Google Veo and wrote a new prompt to create the motion. And then we picked the best version that AI generated.

For some shots that didn’t have characters, like in our suspense scene at the end, we used text-to-video prompts in Veo 3.

And if you’re wondering how we made this handsome fella, we used an image generator to create our ideal mad scientist. Then I did my best impression of how I thought he’d gesture and sound on camera.

“Design them to optimize workplace efficiency.”

And we uploaded the image and the video of me into Runway ReStyle.

Once we had the clips, we brought them into Adobe Premiere for editing and sound. Max’s voice and Chip’s voice were generated in ElevenLabs. The AI audio tool lets you describe the voices you want or even clone your own. But I recorded my own character because ElevenLabs couldn’t quite get my inflection right.

“Go away.”

And that song you hear at the end of the film was made with Suno, an AI music generator. And yes, the script was written by us humans, not AI. Unlike this guy says:

“I love robots. I love robots so much.”

All material belongs to the authors, of course. If I’m highlighting or writing notes on this, I most likely recommend reading the original article.

See other recent things I’ve read here.