everies

AR that transforms surrounding objects into interactive characters.

What it does

With the everies app, powered by Gemini, the objects around you come to life as unique, playful, interactive characters called "everies." Simply scan your surroundings with your camera, and eyes will appear on the objects.
Tapping these eyes generates a character with a distinct personality, appearance, dialogue, and voice, all tailored to the specific object, environment, and context. You can also tap any object to generate an everies character, even if eyes don't initially appear.
Every character can hold a continuous conversation through your voice input, deepening the interaction. You can also save your favorite everies and memorable scenes.
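To illustrate how continuous, in-character conversation can be built on the Gemini API, here is a minimal Kotlin sketch using the Google AI client SDK's multi-turn chat. It assumes speech recognition has already turned the user's voice into text and that a character profile (like the JSON described below) is available; the model name, prompts, and function names are placeholders, not everies's actual implementation.

```kotlin
import com.google.ai.client.generativeai.GenerativeModel

// Hypothetical sketch: keep one chat per character so replies stay in persona
// across turns. Speech-to-text and text-to-speech are handled elsewhere.
suspend fun talkToEverie(
    apiKey: String,
    characterProfileJson: String,
    userUtterances: List<String>,
) {
    val model = GenerativeModel(modelName = "gemini-1.5-flash", apiKey = apiKey)
    val chat = model.startChat()

    // Prime the conversation with the generated character profile.
    chat.sendMessage("Stay in character. Your profile: $characterProfileJson")

    for (utterance in userUtterances) {
        val reply = chat.sendMessage(utterance).text
        println(reply) // in the app this would be spoken back to the user
    }
}
```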

everies’s one-of-a-kind ability to turn 'everything' into playful, interactive characters showcases what LLM technology can do while keeping the experience simple and user-friendly. The pipeline is automated end to end: the Gemini API handles character generation, returning each character as a JSON profile, while object detection runs on-device with TensorFlow Lite. Eye positions are determined at high frame rates by combining this on-device detection with AR, and ARCore anchors the characters' eyes and faces in place. Together, high-frame-rate detection and the multimodal LLM (Gemini) keep responses fast and the experience high quality.
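As a rough illustration of how such a pipeline can be wired together on Android, the hedged Kotlin sketch below runs an EfficientDet-Lite detector via the TensorFlow Lite Task Library, anchors a detected object's "eyes" with an ARCore hit test, and asks Gemini for a character profile as JSON. The model file name, Gemini model name, prompt wording, and JSON keys are illustrative assumptions, not everies's actual implementation.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import com.google.ai.client.generativeai.GenerativeModel
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.Detection
import org.tensorflow.lite.task.vision.detector.ObjectDetector

// 1) On-device object detection with an EfficientDet-Lite model
//    (file name is a placeholder for whatever model ships with the app).
fun buildDetector(context: Context): ObjectDetector {
    val options = ObjectDetector.ObjectDetectorOptions.builder()
        .setMaxResults(5)
        .setScoreThreshold(0.5f)
        .build()
    return ObjectDetector.createFromFileAndOptions(
        context, "efficientdet_lite0.tflite", options
    )
}

fun detectObjects(detector: ObjectDetector, cameraBitmap: Bitmap): List<Detection> =
    detector.detect(TensorImage.fromBitmap(cameraBitmap))

// 2) Pin the "eyes" to a detected object: hit-test the center of its bounding
//    box against the current ARCore frame and create an anchor there.
//    For simplicity this assumes the detector's coordinates already match the
//    screen; a real app would map bitmap coordinates to view coordinates first.
fun anchorEyes(frame: Frame, detection: Detection): Anchor? {
    val box = detection.boundingBox
    return frame.hitTest(box.centerX(), box.centerY())
        .firstOrNull()
        ?.createAnchor()
}

// 3) When the user taps the eyes, ask Gemini for a character profile as JSON.
//    Model name, prompt, and JSON keys are illustrative only.
suspend fun generateCharacter(
    apiKey: String,
    objectLabel: String,
    sceneHint: String,
): String? {
    val model = GenerativeModel(modelName = "gemini-1.5-flash", apiKey = apiKey)
    val prompt = """
        You are generating a playful character (an "everie") for the object
        "$objectLabel" seen in this scene: $sceneHint.
        Reply with JSON only, using the keys:
        {"name": string, "personality": string, "appearance": string,
         "greeting": string, "voice_style": string}
    """.trimIndent()
    return model.generateContent(prompt).text
}
```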

Currently in beta as a proof of concept, everies is aiming for a formal release through future updates, which will add continuous conversations with favorite everies and video recording.

Built with

  • Android
  • ARCore
  • TensorFlow Lite
  • EfficientDet model for Object Detection

Team

By

Team everies

From

Japan