Kinetix, the AI startup bringing emotes to video games and virtual worlds, announces ‘Text2Emotes’: generative AI technology that heralds a new era of user-generated content (UGC) in gaming by enabling anyone to create 3D animations and emotes for games from nothing more than a simple text prompt.
Text2Emotes offers one of the first-ever examples of AI that can create high-quality, playable 3D animations and emotes – animations which express emotion – from a basic text input. For example, users could enter a well-known dance such as ‘griddy’ or a literal prompt such as ‘I’m angry’, and see their avatars come to life through AI-generated emotes. The AI has been trained on Kinetix’s large proprietary dataset, built over the last three years and containing millions of 3D animations and emotes, so it produces far more advanced and industry-specific content than typical models, which rely on limited academic and standard datasets.
Uniquely, Kinetix has created a format that enables animations to be saved and published across multiple games and virtual worlds, meaning emotes generated with Kinetix can be used on any avatar, within any video game or metaverse world that integrates the Kinetix SDK. Game developers and publishers can now open their games up to a whole new era of user-generated content, allowing players to import viral trends into their favourite games – or even fuel the next one by coming up with their own in-game emotes.
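To illustrate what such an integration could look like on the game side, the following is a minimal TypeScript sketch. The interfaces, method names, and signatures here are illustrative assumptions only, not the actual Kinetix SDK API; developers should refer to Kinetix’s own documentation for the real interface.

```typescript
// Hypothetical sketch of an emote integration layer in a game client.
// Every type and signature below is an assumption for illustration;
// it does not reflect the real Kinetix SDK.

interface EmoteClip {
  id: string;          // identifier of a published emote
  durationMs: number;  // playback length reported by the platform
}

interface EmoteSdk {
  // Fetch an emote a player has generated and published.
  fetchEmote(emoteId: string): Promise<EmoteClip>;
  // Retarget and play the clip on an in-game avatar.
  playOnAvatar(avatarId: string, clip: EmoteClip): Promise<void>;
}

// Game-side helper: a player picks an emote from their inventory
// and it plays on their avatar in the current session.
async function triggerPlayerEmote(
  sdk: EmoteSdk,
  avatarId: string,
  emoteId: string
): Promise<void> {
  const clip = await sdk.fetchEmote(emoteId);
  await sdk.playOnAvatar(avatarId, clip);
  console.log(`Played emote ${clip.id} (${clip.durationMs} ms) on avatar ${avatarId}`);
}
```

The key design idea conveyed by the announcement is that the emote is stored in a game-agnostic format, so the same clip can be retargeted onto any avatar in any title that has integrated the SDK.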
Emotes have been used by MMOs like World of Warcraft for over two decades but, after rocketing into the mainstream with games like Fortnite and PUBG, are fast becoming a must-have feature for self-expression in gaming. UGC is core to the success of many gaming platforms such as Roblox, and is set to become the new standard for helping games generate sustainable, long-lasting revenue, as shown by Fortnite’s Creative 2.0 announcement. The crossover potential of emotes is endless, with videos tagged #emote having received more than 2.4 billion views on TikTok alone.
Kinetix CEO and co-founder Yassine Tahi said, “As we know, UGC is becoming the bedrock of modern gaming, and enabling gamers to generate their own content on the fly using AI will be a critical part of this. So for two years we’ve focused on perfecting our AI models and the animations generated by them; first using video inputs and now, incredibly excitingly, using just a simple text prompt. The clear early use case for this tech is in gaming. Emotes have become an essential part of users’ self-expression and a revenue driver for game-makers, and our AI-powered emotes tech is already being integrated by some of the best-known developers and virtual world builders.”
ZEPETO ecosystem global head Jay Lee said, “Kinetix’s Text2Emotes feature represents an exciting use case for generative AI in virtual worlds, with the potential to enable a new dimension of expression for ZEPETO users. When integrated, it will provide the ZEPETO community with an even more immersive social experience.”
Text2Emotes will be added to the Kinetix Studio and existing Kinetix SDK, which already offers a ready-made, fully customisable emote wheel and a constantly growing library of more than 1,000 emotes across PC, console, and mobile. The Kinetix SDK is currently being integrated by more than 10 video games and virtual worlds, including The Sandbox and PolyLand. Later this year, Text2Emotes will also be offered as an API (application programming interface).
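The API has not yet been published, but a text-to-emote request could plausibly look like the sketch below. The endpoint URL, request fields, and response shape are assumptions made purely for illustration and are not Kinetix’s documented interface.

```typescript
// Hypothetical sketch of a text-to-emote API call.
// The endpoint, payload fields, and response shape are illustrative
// assumptions only; the real Text2Emotes API may differ entirely.

interface GeneratedEmote {
  emoteId: string;     // identifier of the generated animation
  previewUrl: string;  // where a rendered preview could be fetched
}

async function generateEmoteFromText(
  prompt: string,
  apiKey: string
): Promise<GeneratedEmote> {
  const response = await fetch("https://api.example.com/v1/text2emotes", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ prompt }), // e.g. "I'm angry" or "griddy"
  });
  if (!response.ok) {
    throw new Error(`Emote generation failed: ${response.status}`);
  }
  return (await response.json()) as GeneratedEmote;
}
```

In such a flow, the returned emote identifier could then be published to a player’s inventory and played back in any integrated game, along the lines of the SDK sketch above.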