I Made an Anime with AI in 24 Hours - No Experience Needed!

Recent advancements in AI have brought us some incredible tools, and with the release of P 1.0, the quality of AI-generated animations has reached a new level. Intrigued by the stunning animations the team behind P 1.0 has achieved, I decided to take on a challenge: create a short anime within 24 hours using AI.

Brainstorming the Concept

A good concept is the foundation of any creative project. I began by endlessly scrolling through anime trailers and concept videos. While I found some interesting ideas, none felt just right. I wanted something relatable yet magical that would effectively showcase the capabilities of AI. Just as I was about to give up, I came across the GTA 6 trailer, and it sparked an idea: I love Grand Theft Auto and I love anime, so why not transform the trailer into an anime? I wasn't sure it was possible, but it seemed like a fun challenge.

Dissecting the Trailer

To begin, I downloaded the GTA trailer and broke it down into a shot-by-shot list of scenes. Understanding each shot let me determine exactly which ones I needed to regenerate with AI.
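
If you would rather script this breakdown than scrub through the trailer by hand, a scene-detection library can produce the shot list automatically. Below is a minimal sketch using PySceneDetect and ffmpeg; the trailer file name and detection threshold are placeholder assumptions, not part of my original workflow.

```python
# Minimal sketch: auto-generate a shot list from the trailer with PySceneDetect.
# "gta6_trailer.mp4" and the threshold are placeholders; adjust for your footage.
from scenedetect import detect, ContentDetector, split_video_ffmpeg

# Detect hard cuts by measuring frame-to-frame content change.
scene_list = detect("gta6_trailer.mp4", ContentDetector(threshold=27.0))

# Print a simple shot list with start/end timecodes for planning prompts per shot.
for i, (start, end) in enumerate(scene_list, start=1):
    print(f"Shot {i:02d}: {start.get_timecode()} -> {end.get_timecode()}")

# Optionally split the trailer into one clip per shot (requires ffmpeg on PATH).
split_video_ffmpeg("gta6_trailer.mp4", scene_list)
```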

Scene Generation with P 1.0

In the PEER web app, I used the "modify region" tool to transform specific areas of each video according to my prompts. For example, I started with a sunset shot and added a prompt for an anime-style beach with fans and gates in Studio Ghibli style. The results were impressive, though they sometimes required tweaking the prompt to get the perfect shot.

When focusing on particular elements, like transforming a character into anime style, the selective modification ensured the background stayed consistent. For instance, in a scene featuring a girl, I only highlighted her and added the anime transformation prompt, keeping the rest of the frame unchanged.

However, not every scene worked perfectly with video-to-video transformation. In those cases, I turned to text-to-video generation, replacing problematic shots like a girl on a Lamborghini with a more straightforward animation of a moving car.

Through a combination of video transformation and text-to-video, I managed to generate all the necessary scenes. After producing over 200 videos and finalizing 40+ clips, it was time to piece them together.
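
If you want a quick preview of the full sequence before the real edit, a scripted rough assembly with ffmpeg's concat demuxer can stitch the finalized clips together in order. This is only a sketch with placeholder folder and file names; the actual cut, described in the next section, was done in CapCut.

```python
# Minimal sketch: rough-assemble the finalized clips with ffmpeg's concat demuxer.
# "final_clips" and the output name are placeholders; ffmpeg must be on PATH.
import subprocess
from pathlib import Path

clips = sorted(Path("final_clips").glob("*.mp4"))  # e.g. shot_01.mp4, shot_02.mp4, ...

# Write the list file that the concat demuxer reads.
with open("clips.txt", "w") as f:
    for clip in clips:
        f.write(f"file '{clip.as_posix()}'\n")

# Re-encode to one common format while joining, so the clips concatenate cleanly.
subprocess.run(
    ["ffmpeg", "-y", "-f", "concat", "-safe", "0", "-i", "clips.txt",
     "-c:v", "libx264", "-pix_fmt", "yuv420p", "-c:a", "aac", "rough_cut.mp4"],
    check=True,
)
```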

Editing and Voiceovers

Using CapCut, I assembled the scenes into a cohesive anime short. To give it authentic anime vibes, I needed Japanese voiceovers. I'm not a voiceover artist, so I used ElevenLabs to generate the voices: I browsed their voice library, found suitable voices for the characters, and translated my script into Japanese. The AI-generated voiceovers were then layered onto the animation.
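
If you would rather batch the voiceover step than click through the web UI, ElevenLabs also exposes a text-to-speech REST API. The sketch below generates a single Japanese line; the API key, voice ID, and sample line are placeholder assumptions, and the multilingual model is used because it can read Japanese text.

```python
# Minimal sketch: generate one Japanese voiceover line via the ElevenLabs TTS REST API.
# The API key, voice ID, and line of dialogue below are placeholders.
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"       # placeholder
VOICE_ID = "YOUR_CHOSEN_VOICE_ID"         # placeholder, picked from the voice library
LINE_JA = "ここから、私たちの物語が始まる。"  # placeholder line of translated script

response = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    json={
        "text": LINE_JA,
        "model_id": "eleven_multilingual_v2",  # multilingual model handles Japanese
    },
)
response.raise_for_status()

# The API returns MP3 audio, ready to drop onto the CapCut timeline.
with open("voiceover_line_01.mp3", "wb") as f:
    f.write(response.content)
```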

Final Product

With everything in place, from vibrant scenes to immersive Japanese voiceovers, here is a glimpse of the magic. Enjoy!


Keywords

  • AI animations
  • P 1.0
  • anime transformation
  • Grand Theft Auto
  • video-to-video transformation
  • text-to-video generation
  • CapCut
  • ElevenLabs
  • Japanese voiceovers

FAQ

How did you come up with the concept for your anime?

I was initially searching for relatable yet magical concepts. Eventually, the GTA 6 trailer inspired me to blend my love for Grand Theft Auto with anime.

What tools did you use for scene generation?

I used the PEER web app's "modify region" tool and text-to-video generation features to create the animation scenes.

How did you ensure consistent transformation in the scenes?

By applying the transformation only to selected regions of the frame, and refining my prompts with negative prompts and motion-control settings, I achieved consistent results.

How did you handle problematic scenes that didn’t transform well?

When a scene didn't transform as intended, I replaced it with a simpler shot generated directly from a text-to-video prompt.

How did you create the Japanese voiceovers?

Using ElevenLabs, I generated suitable voiceovers by translating my script into Japanese and selecting voices that fit the characters.