Meta AI’s “Vibes” Feed: A New Frontier for Generative Video

Runbo Li · Co-founder & CEO of Magic Hour · 7 min read

Meta has officially launched Vibes - a feed entirely composed of short-form AI-generated videos, designed to let users view, remix, and share AI content across its ecosystem. This marks a major shift from simple AI prompt exploration to a dynamic, participatory platform for AI-generated visual storytelling.

This move also signals Meta’s ambition to position itself at the center of generative video culture, competing not only with OpenAI’s Sora and Google’s Veo but also with creative tools like Runway and Pika.


What Vibes Is and How It Works


From Discover to Vibes: Meta’s Evolution in Generative Media

Previously, the Meta AI app included a Discover feed where users could browse AI text or image prompts shared by others. The new Vibes feed replaces that system and focuses entirely on video. According to Meta, Vibes is “a feed of AI videos at the center of the Meta AI app and on meta.ai,” where users can discover, remix, and share short-form AI-generated content.

Each video appears alongside the prompt that created it, offering transparency into how generative media is made. Users can cross-post their creations to Instagram or Facebook as Reels or Stories, and even remix videos they encounter on those platforms directly in the Meta AI app.

This integration transforms Vibes from a standalone product into a creative bridge across Meta’s wider ecosystem.


Three Paths to Creation


Meta gives users three main ways to participate:

  1. Generate from scratch - Write a text prompt and generate a unique video using Meta’s internal and partner AI models.
  2. Animate existing visuals - Upload your own photo or video and use Vibes tools to stylize or animate it.
  3. Remix existing AI videos - Tap into the feed, adjust visuals, music, or animation, and repost your own version.

Once created, videos can be shared within Vibes or across Meta’s major platforms, making the feed both an AI playground and a content pipeline.


The Tech Behind the Feed

To accelerate rollout, Meta has partnered with external model providers like Midjourney and Black Forest Labs while continuing to develop its own proprietary models. This hybrid strategy lets Meta balance speed-to-market with eventual in-house control of generative video quality and style.
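Meta hasn't published how Vibes dispatches requests between its own and partner models, but the hybrid strategy described above can be illustrated with a toy router: prefer the in-house model, and fall back to a partner provider when it declines a prompt. Everything below (the provider stubs, URL schemes, and `make_router` helper) is hypothetical.

```python
# Hypothetical sketch of a hybrid model-routing strategy (illustrative only;
# Meta has not documented Vibes' internals): try the in-house model first,
# then fall back to partner providers in order.
from typing import Callable, Optional

Provider = Callable[[str], Optional[str]]

def make_router(in_house: Provider, partners: list[Provider]) -> Callable[[str], str]:
    """Return a generate(prompt) function that prefers the in-house model."""
    def generate(prompt: str) -> str:
        for model in [in_house, *partners]:
            result = model(prompt)  # each provider returns a video URL or None
            if result is not None:
                return result
        raise RuntimeError("no provider could handle the prompt")
    return generate

# Stub providers standing in for real services: the in-house stub declines
# "photoreal" prompts to force a fallback.
in_house = lambda p: None if "photoreal" in p else f"meta://video?prompt={p}"
partner = lambda p: f"partner://video?prompt={p}"

generate = make_router(in_house, [partner])
print(generate("photoreal mountain goats"))  # handled by the partner stub
```

The appeal of this shape is that partner models can be swapped out as in-house quality catches up, without changing the calling code.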


Strengths and Limitations

Strengths

1. Lower barrier to entry
Vibes allows anyone to create short videos without needing cameras, actors, or production tools. With just a prompt, users can generate animated clips ready to share.

2. Built-in remixability
Remixing is core to Vibes. Users can fork others’ content, tweak it, and repost. This encourages collaborative creation and viral remix chains.

3. Deep integration with Meta platforms
Because Vibes connects directly to Instagram and Facebook, creators can easily reach large audiences without exporting content manually.

4. Transparent creation process
Displaying the generating prompt alongside the video helps demystify how AI art is made and promotes community learning.

5. Personalized content curation
Over time, Vibes adapts to your interests and preferred visual styles, creating a feedback loop between viewer behavior and creative inspiration.


Limitations and Risks

1. Quality inconsistency
Early users report “AI slop” - awkward transitions, uncanny animations, or physics-defying moments. These glitches limit professional usability for now.

2. Short-term novelty
Like many AI experiments, the initial excitement may fade if creators don’t find sustainable, audience-building use cases.

3. Remix fatigue
If everyone modifies the same base clips, the feed risks becoming repetitive. Maintaining diversity will require algorithmic curation.

4. Intellectual property questions
Who owns a remixed AI video? Meta hasn’t yet clarified licensing or attribution rules for derivative works.

5. Generative bias and realism limits
AI models can hallucinate or produce distorted visuals. These flaws may deter brand use or serious creative adoption.

6. Discoverability and fairness
As with any algorithmic platform, questions remain around what gets visibility and how remixers are credited.
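On the remix-fatigue point (item 3 above), one common remedy is a diversity-aware re-ranking pass. The sketch below is a toy illustration, not Meta's system: each candidate clip's score is discounted by how many clips from the same remix lineage have already been selected, so one viral base clip can't fill the whole feed.

```python
# Toy diversity re-ranker (hypothetical; not Meta's curation code).
# Greedy selection: a clip's score is multiplied by penalty^k, where k is
# the number of already-chosen clips sharing its remix lineage (base_id).
from collections import Counter

def diversify_feed(candidates, limit=5, penalty=0.5):
    selected, seen_bases = [], Counter()
    pool = list(candidates)
    while pool and len(selected) < limit:
        best = max(pool, key=lambda c: c["score"] * penalty ** seen_bases[c["base_id"]])
        selected.append(best)
        seen_bases[best["base_id"]] += 1
        pool.remove(best)
    return [c["id"] for c in selected]

feed = [
    {"id": "a", "base_id": "goats", "score": 0.9},
    {"id": "b", "base_id": "goats", "score": 0.85},
    {"id": "c", "base_id": "tokyo", "score": 0.6},
    {"id": "d", "base_id": "goats", "score": 0.8},
]
print(diversify_feed(feed, limit=3))
```

Without the penalty the top three would all be "goats" remixes; with it, the lower-scored "tokyo" clip surfaces in second place.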


Testing Scenarios and Early Impressions

While full access remains limited, early testers and journalists describe several promising workflows:

  • Remixing a prompt-based video
    A user sees a clip of “mountain goats leaping through deep snow.” They remix it by changing music and animation speed, then repost. The interface supports fluid iteration.
  • Prompt-based generation
    Typing “cyberpunk Tokyo in rain, neon reflections, slow motion” yields a 10-15 second clip. Quality varies by prompt precision, but the results are visually striking.
  • Cross-post ripple test
    When a user shares their remix to Instagram, viewers can click through to the Meta AI app to create their own. This feedback loop drives engagement.
  • Feed personalization
    Frequent interaction with sci-fi clips leads to more of that aesthetic surfacing over time, signaling algorithmic adaptation.
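The personalization behavior in the last bullet can be mimicked with a toy preference profile: an exponentially weighted per-style score that rises with repeated engagement and boosts matching clips at ranking time. This is an illustration of the general feedback-loop idea, not Meta's ranking code.

```python
# Toy personalization sketch (hypothetical). An exponential moving average
# per visual style stands in for a learned viewer profile.
def update_profile(profile, style, signal, alpha=0.2):
    """Blend a new engagement signal (0..1) into the viewer's style weight."""
    profile[style] = (1 - alpha) * profile.get(style, 0.0) + alpha * signal
    return profile

def rank_clips(clips, profile):
    """Order candidates by base quality plus the viewer's style affinity."""
    return sorted(clips, key=lambda c: c["quality"] + profile.get(c["style"], 0.0),
                  reverse=True)

profile = {}
for _ in range(5):  # viewer repeatedly engages with sci-fi clips
    update_profile(profile, "sci-fi", 1.0)

clips = [{"id": "x", "style": "sci-fi", "quality": 0.5},
         {"id": "y", "style": "pastoral", "quality": 0.6}]
print([c["id"] for c in rank_clips(clips, profile)])
```

After a few interactions the lower-quality sci-fi clip outranks the higher-quality pastoral one, which is exactly the "more of that aesthetic surfacing over time" effect testers describe.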

Overall, Vibes feels experimental yet promising: ideal for creative exploration, not yet for cinematic production.


Comparing Vibes to Other AI Video Tools

| Tool / Platform | Strengths / Differentiators | Challenges / Weaknesses | Ideal Use Case |
| --- | --- | --- | --- |
| OpenAI Sora | High-fidelity, cinematic visuals; Cameo feature for user inclusion | Closed ecosystem, limited remix options | Storytelling and marketing assets |
| Google Veo 3 | Fast text-to-video generation, Google integration | Less social-native, weaker remix flow | Automated ad/video generation for developers |
| Runway / Pika / Synthesia | Full creative control, editing tools | Higher cost, standalone | Professional agencies or branded content |
| TikTok / Reels / Shorts | Human authenticity, reach | Manual production, limited generative tools | Traditional video creators |

Trade-offs:
Vibes prioritizes accessibility and virality over technical polish. Competitors like Runway focus on precision but lack built-in audiences. Meta’s ecosystem advantage lies in instant distribution across billions of users.


Strategic Use Cases


For Creators and Visual Artists

  • Experiment with styles and motion using quick AI prompts.
  • Launch co-creation challenges where followers remix your clip.
  • Prototype ideas for longer content or animations.
  • Test visual memes that might trend across Meta’s platforms.

For Marketers and Brands

  • Run remix-based campaigns inviting users to generate brand-inspired videos.
  • Prototype ad visuals before investing in full production.
  • Leverage engagement metrics from Vibes to test creative resonance.
  • Use AI outputs as quick content fillers for story or reel rotations.

For Platforms and Developers

  • Integrate generative feeds as experimental spaces separate from human-only content.
  • Explore monetization models like paid remix tools or watermark removal.
  • Study user behavior to balance AI and organic content exposure.

Market Trends and What’s Next

1. Higher fidelity and realism

Expect fewer visual artifacts and smoother animations as Meta refines its models. Generative video is closing the gap with traditional production.

2. Hybrid human-AI creation

Creators will increasingly blend AI footage with real video or 3D assets for richer storytelling.

3. Attribution and rights ecosystems

New frameworks will emerge for crediting remixers and tracking derivative content across feeds.

4. AR and spatial integration

Given Meta’s AR ambitions, expect Vibes to expand into AR displays and smart glasses, merging generative video with spatial storytelling.

5. Monetization and premium tiers

Future paid features might include longer videos, advanced models, or branded prompt packs. This could open new creator revenue paths.

6. Human-AI balance

Meta must ensure Vibes complements rather than overwhelms real content. A balanced feed will be key for long-term trust and engagement.


Final Takeaway

If you’re a creator, Vibes is a creative playground - not perfect, but full of discovery.
If you’re a marketer, it’s a low-cost experiment hub for testing ideas.
If you’re a platform builder, it’s a glimpse into the social future of generative media.

Decision Matrix

| Objective | Use Vibes? | Notes |
| --- | --- | --- |
| Rapid ideation | ✅ | Great for short visual drafts |
| Audience engagement | ✅ | Best for remix campaigns |
| High-quality branding | ⚠️ | May require post-editing |
| Ad rotation | ✅ | Works for testing creative hooks |
| Long-form storytelling | ❌ | Not yet capable |

At least one of your Vibes experiments will surprise you - whether it becomes a viral remix or simply a visual spark that inspires the next big thing.


FAQ

Q: Is Vibes available globally?
Meta has begun rollout in more than 40 countries, though not all users have access yet.

Q: Can creators monetize their videos?
Not currently. Meta has yet to outline monetization or rights for derivative AI videos.

Q: Will AI content replace real creators?
Unlikely soon. Authentic storytelling still relies on human emotion and intent.

Q: How does Meta moderate AI content?
Meta is implementing watermarking, reporting tools, and prompt filters to reduce misuse.

Q: How often will Vibes evolve?
Meta plans ongoing updates to models, quality, and remix tools throughout 2026.


About Runbo Li
Co-founder & CEO of Magic Hour
Runbo Li is the Co-founder & CEO of Magic Hour. He is a Y Combinator W24 alum and was previously a Data Scientist at Meta where he worked on 0-1 consumer social products in New Product Experimentation. He is the creator behind @magichourai and loves building creation tools and making art.