Eva, a Robot with Feelings
Since May 2024 — Active prototype
A cardboard-bodied robot with LCD eyes, a live camera, and a local AI brain. Eva listens, speaks, and emotes in multiple languages — exploring how AI feels when it takes physical form.
1. The Starting Point
AI was moving fast — local models, better speech tools, cheap displays, tiny motors. We wondered: what if we combined them into a small desk robot with emotions?
2. What We Built
Eva is a playful prototype. She can:
- Understand and speak English, French, and Russian
- Handle accents naturally
- See through a live camera and react
- Blink and shift her eyes with dynamic animations
- Move with small servo gestures
- Run fully offline on a local LLM
3. How It Works
- Brain: Python + local LLM
- Voice: multilingual STT + TTS
- Vision: live camera with recognition logic
- Eyes: LCD displays with generated states
- Gestures: Arduino-controlled servos
- Body: cardboard shell
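The components above suggest a simple turn loop: transcribe speech, ask the local model for a reply plus an emotion tag, then drive the eyes and voice from that tag. Here is a minimal Python sketch of that wiring. All names here (`Reply`, `think`, `render_eyes`, `turn`, the `EYE_STATES` frames) are hypothetical stand-ins, not Eva's actual code; the real robot would swap in an STT/TTS engine, a local LLM call, the LCD eye driver, and the Arduino serial link.

```python
from dataclasses import dataclass

# Hypothetical eye frames; the real LCDs would show generated animations.
EYE_STATES = {"happy": "^ ^", "curious": "o O", "neutral": "o o"}


@dataclass
class Reply:
    text: str
    emotion: str  # key into EYE_STATES


def think(transcript: str) -> Reply:
    """Stand-in for the local LLM call: pick a reply and an emotion tag."""
    if "?" in transcript:
        return Reply("Let me think about that.", "curious")
    return Reply("Nice to hear!", "happy")


def render_eyes(emotion: str) -> str:
    """Map an emotion tag to an eye frame, falling back to neutral."""
    return EYE_STATES.get(emotion, EYE_STATES["neutral"])


def turn(transcript: str) -> tuple[str, str]:
    """One conversation turn: returns (text to speak, eye frame to draw)."""
    reply = think(transcript)
    return reply.text, render_eyes(reply.emotion)
```

The useful design point is that the LLM returns a structured reply (text plus emotion tag) rather than free text alone, so the eyes, voice, and servo gestures can all key off the same tag without parsing the sentence.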
4. Status
For now, Eva is waiting in our digital garage for her next upgrade.
Our projects

A small desk robot with LCD eyes, a camera, and a local AI brain that listens, talks back, and reacts with little gestures. Built to find out what AI feels like when it gets a face.

A large-format 3D printer we built because every off-the-shelf option was too loud, too expensive, or too big for an apartment. Prints up to 48×30×30 cm, runs quietly, and you can monitor it from your phone.

An assistant that handles the worst part of dealing with public services: the waiting. It calls phone lines, navigates the menus with a local LLM, and pings you the moment an appointment is finally yours.

Our first AI project, born out of business trips drowning in receipts. Snap a photo, the app reads it, sorts it, converts the currency, and builds a clean PDF report ready to send to your accountant.

A small platform a cup sits on that quietly tracks how full and how warm the drink is, and pings the other person in the apartment when a refill is due. Built around a small ritual at home.

A machine that paints AI-generated compositions on real canvas with real brushes. Built to ask whether something dreamed up by code and made by a robot arm can still feel like a real, expressive painting.

A lamp that lets Stacia share her mood with Mathieu without saying a word. She picks an emoji in the iOS app, and the lamp glows in the matching color across the room — half joke, half real.

A compact CNC built into a rolling tool cart, because every machine we looked at was too bulky for a small studio. It cuts wood, plastics, and PCBs from a browser, and it became our daily prototyping tool for over a year.

A cardboard home robot that uses LiDAR to map the apartment and figure out where it is, then announces its position out loud. Our way into indoor robot navigation, built from scratch to learn how it works.

A safety system we built when we got a kitten. BLE beacons mark off zones of the apartment we wanted him to avoid, a tag on his collar notices when he crosses one, and our phones and smartwatches buzz immediately.

Back when our bus stop had no schedule and the official site was useless, we built a live tracker for the one line we used every day. A simple map, real arrival times, no more guessing.