Eva, a Robot with Feelings

Since May 2024 — Active prototype

A cardboard-bodied robot with LCD eyes, a live camera, and a local AI brain. Eva listens, speaks, and emotes in multiple languages — exploring how AI feels when it takes physical form.

1. The Starting Point

AI was moving fast — local models, better speech tools, cheap displays, tiny motors. We wondered: what if we combined them into a small desk robot with emotions?

2. What We Built

Eva is a playful prototype. She can:

  • Understand and speak English, French, and Russian

  • Handle accents naturally

  • See through a live camera and react

  • Blink and shift her eyes with dynamic animations

  • Move with small servo gestures

  • Run fully offline on a local LLM

3. How It Works

  • Brain: Python + a local LLM (a simplified loop is sketched after this list)

  • Voice: Multilingual STT + TTS

  • Vision: Live camera with recognition logic

  • Eyes: LCD displays with generated states

  • Gestures: Arduino-controlled servos

  • Body: Cardboard shell
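
To make the pipeline above concrete, here is a minimal sketch of one pass through a sense-think-act loop in Python. Every name in it (Reply, transcribe, think, speak, set_eyes, send_gesture) is an illustrative stand-in rather than Eva's actual code: a real build would swap in its speech-to-text and text-to-speech engines, the local LLM call, the LCD eye driver, and serial writes to the Arduino.

```python
"""Minimal sketch of Eva's loop: hear, think, emote, move, speak.

All helper functions are stand-ins, assumed for illustration only.
"""

from dataclasses import dataclass


@dataclass
class Reply:
    text: str      # what Eva will say
    emotion: str   # drives the LCD eye animation, e.g. "happy", "curious"
    gesture: str   # short servo gesture name, e.g. "nod", "tilt"


def transcribe(audio: bytes) -> str:
    """Stand-in for the multilingual speech-to-text stage."""
    return "Bonjour Eva, comment ça va ?"


def think(utterance: str) -> Reply:
    """Stand-in for the local LLM call that returns text plus an emotion tag."""
    return Reply(text="Ça va très bien, merci !", emotion="happy", gesture="nod")


def set_eyes(emotion: str) -> None:
    """Stand-in for pushing an eye animation to the LCD displays."""
    print(f"[EYES] showing '{emotion}' animation")


def send_gesture(gesture: str) -> None:
    """Stand-in for sending a gesture command to the Arduino over serial."""
    print(f"[SERVO] gesture '{gesture}'")


def speak(text: str) -> None:
    """Stand-in for text-to-speech playback."""
    print(f"[TTS] {text}")


def step(audio: bytes) -> None:
    """One pass through the loop: transcribe, reply, emote, move, speak."""
    heard = transcribe(audio)
    reply = think(heard)
    set_eyes(reply.emotion)
    send_gesture(reply.gesture)
    speak(reply.text)


if __name__ == "__main__":
    step(b"")  # fake audio buffer; a real build would capture from the microphone
```

Keeping every stage behind a small function like this is also what makes running fully offline straightforward: each stand-in can be replaced by a local model, a display driver, or a serial write without touching the rest of the loop.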

4. Status

For now, Eva is waiting in our digital garage for her next upgrade.
