Meta launches AI voice translation for Facebook & Instagram creators globally

TechCrunch

Meta has unveiled an AI-powered voice translation feature for creators on Facebook and Instagram, rolling out globally to markets where Meta AI is currently available. The new capability lets creators translate their content into other languages, significantly broadening their potential audience reach.

First previewed at Meta’s Connect developer conference last year, the feature underwent pilot testing for automatic voice translation in Reels across both social platforms. A key innovation highlighted by Meta is the system’s ability to retain the creator’s original voice, including its unique sound and tone, ensuring that the dubbed audio sounds authentic in the new language. Additionally, creators have the option to activate a lip-sync feature, which further enhances the natural appearance of the translated content by synchronizing the audio with their lip movements.

Initially, the system translates between English and Spanish, with the company promising support for additional languages in subsequent updates. The AI-driven translations are available to Facebook creators with at least 1,000 followers, as well as all public Instagram accounts, contingent on Meta AI availability in their region. To activate the feature, creators select the “Translate your voice with Meta AI” option before publishing a Reel. They can then toggle on translations and decide whether to include the lip-syncing function. Once the Reel is shared, the translated version becomes instantly available to viewers.

Before public posting, creators retain the ability to preview translations and lip-syncs, with the flexibility to disable either option at any point. Importantly, rejecting a translation does not affect the original Reel. For viewers, a clear notice at the bottom of a translated Reel will indicate its AI-generated origin. Users who prefer not to view translated content in specific languages also have the option to adjust their settings accordingly.

Complementing this rollout, creators will gain a new metric within their analytics dashboard, displaying views segmented by language. This insight aims to provide a clearer understanding of how translated content resonates with and expands their audience, a utility that will grow as more languages are added. For optimal performance, Meta advises creators to record facing forward, articulate clearly, and avoid covering their mouths. Minimizing background noise or music is also recommended. The system currently supports up to two distinct speakers, provided they do not speak over each other, since overlapping speech can impede accurate translation.

In a related but distinct offering, Facebook creators can also manually upload up to 20 pre-recorded dubbed audio tracks to a Reel. This feature, accessible via the “Closed captions and translations” section within Meta Business Suite, allows for greater linguistic reach beyond the AI’s initial English-Spanish scope and supports adding translations both before and after content publication, unlike the automated AI system. While Meta has confirmed plans for broader language support in its AI translation feature, specific details regarding the next additions or their release timelines remain undisclosed.

Adam Mosseri, head of Instagram, underscored the strategic intent behind the feature, stating in a post, “We believe there are lots of amazing creators out there who have potential audiences who don’t necessarily speak the same language. And if we can help you reach those audiences who speak other languages, reach across cultural and linguistic barriers, we can help you grow your following and get more value out of Instagram and the platform.” This product launch coincides with reports indicating a significant reorganization within Meta’s AI division, which is reportedly refocusing its efforts across four core pillars: research, superintelligence, product development, and infrastructure.