Samsung subsidiary STAR Labs has officially unveiled its mysterious “artificial human” project, Neon. As far as we can tell, though, there’s no mystery here at all. Neon is just digital avatars — computer-animated human likenesses about as deserving of the “artificial human” moniker as Siri or the Tupac hologram.
In fairness to STAR Labs, the company does seem to be trying something new with its avatars. But exactly what it's doing is hard to tell, as its first official press release today fails to explain the underlying technology and relies instead on jargon and hype.
“Neon is like a new kind of life,” says STAR Labs CEO Pranav Mistry in the release. “There are millions of species on our planet and we hope to add one more.” (Because nothing says “grounded and reasonable” like a tech executive comparing his work to the creation of life.)
Even more annoyingly, it seems that the teaser images and leaked videos of the Neon avatars we’ve seen so far are fake. As the company explains (emphasis ours): “Scenarios shown at our CES Booth and in our promotional content are fictionalized and simulated for illustrative purposes only.” So really we have no idea what Neon’s avatars actually look like.
Sorting through the chaff in STAR Labs’ press release today, here’s what we know for sure.
Each Neon avatar is "computationally generated" and will hold conversations with users while displaying "emotions and intelligence," says the company. Their likenesses are modeled after real humans, but have newly generated "expressions, dialogs, and emotion." Each avatar (individually known as a "NEON") can be customized for different tasks, and is able to respond to queries "with latency of less than a few milliseconds." They're not intended to be just visual skins for AI assistants, but are meant for more varied uses:
“In the near future, one will be able to license or subscribe to a NEON as a service representative, a financial advisor, a healthcare provider, or a concierge. Over time, NEONs will work as TV anchors, spokespeople, or movie actors; or they can simply be companions and friends.”
So far, so good. It’s no secret that CGI humans have become more lifelike in recent years, and are already being used in some of the scenarios outlined above. If STAR Labs can make these avatars more realistic, then they might be adopted more widely. Fine.
But if you’ve ever interacted with, say, a virtual greeter at an airport or museum, you’ll know how paper-thin the “humanity” of these avatars is. At best, they’re Siri or Alexa with a CGI face, and it’s not clear if STAR Labs has created anything more convincing.
Where the company's PR veers into strange territory is in its description of the avatars' underlying technology. It says it's using proprietary software called "Core R3" to create the avatars, and that its approach is "fundamentally different from deepfake or other facial reanimation techniques." But it doesn't say how the software does work, instead relying on wishy-washy assurances that Core R3 "creates new realities." We'd much rather know what the company is actually doing: using high-resolution video captures pinned onto 3D models, say, or AI to generate facial movements, whatever the case may be.
We’ve reached out to STAR Labs with our questions, but it seems we’ll have to wait to see the technology in person to get a better understanding. The firm is offering private demos of its avatars at CES this week, and The Verge is scheduled to check out the technology.
We look forward to giving you our hands-on impressions later this week, but until then, don’t worry about any “AI android uprising” — these aren’t the artificial humans you’re looking for.