"we're delivering more than connectivity"
T-Mobile is preparing to test a new artificial intelligence feature that translates live phone calls into more than 50 languages.
The service, called Live Translation, will launch in beta “this Spring”, according to a company press release.
The feature operates entirely within the operator’s network rather than on individual devices. No app is required, and it is not limited to specific handsets. Only one caller needs to be a T-Mobile customer for the service to function.
During a call, users dial 87 to activate Live Translation. They can then select from more than 50 supported languages. The service also works for calls to over 215 international destinations.
The network-based design signals a shift in how telecom providers deploy AI.
Instead of relying on device-level processing, T-Mobile appears to have embedded the capability into its core network infrastructure.
The company has positioned the feature as a solution to a widespread communication barrier. According to Pew Research Center data, around 60 million Americans live in multilingual households.
T-Mobile CEO Srini Gopalan said: “Some of the biggest barriers wireless customers face are the simplest ones, like being able to understand each other.
“When language gets in the way, the network gets reduced to just a signal and that’s not who we are.
“By bringing real-time AI directly into our network, we’re delivering more than connectivity, turning conversations into community, starting with Live Translation.”
Registration for the beta version is free. However, it is currently limited to postpaid customers.
The company has not confirmed when the feature will become widely available. It has also not clarified whether it will carry a subscription fee or additional conditions.
As with any service that processes live voice data, privacy concerns are likely to surface. Because the system must listen to calls in order to translate them, questions about surveillance and potential censorship may follow.
T-Mobile has addressed these concerns directly. The company insists that “what you say is what they hear, every word and emotion, with no censorship or editing”.
The launch of Live Translation marks another step in T-Mobile’s broader artificial intelligence strategy.
At its 2024 capital markets day, then-CEO Mike Sievert outlined plans to make the network more intelligent, which included developing AI-RAN technology alongside Nvidia, Ericsson and Nokia.
T-Mobile also partnered with OpenAI to improve customer experience by using interaction data to train AI systems.
Live Translation does not fall neatly into AI-RAN development or customer service automation. However, it reflects the same strategic intent to embed AI deeper into network operations.
John Saw, president of technology and CTO of T-Mobile, said:
“Live Translation shows what’s possible when you rethink the role of a wireless provider.
“We started this journey years ago by betting big on 5G and creating a network that wasn’t just about speed, but also one that could adapt and evolve.
“With nationwide 5G Advanced as the foundation, we can now run real-time AI services directly in the network with Live Translation being the first in a new era of AI-driven experiences for customers.”
The reference to 5G Advanced underlines the infrastructure requirements behind the service. Running real-time translation within the network demands low latency and high reliability.
For now, Live Translation remains in beta. Its long-term impact will depend on accuracy, privacy safeguards and pricing.