Can a machine appear alive through behavior alone?
AmViSH v1 was built in November 2024 by Aman Vishwas, then 14 years old, as a robotics prototype for a school summer assignment. The project explored how simple sensors, behavioral logic, and symbolic design elements could create the illusion of life in a robotic system.
The robot is a 15 cm tall cardboard-bodied prototype consisting of a head and chest connected by a thin cylindrical neck. While mechanically simple, the system focuses on behavioral expression rather than mechanical complexity.
At the center of the robot's design is a symbolic concept: a removable heart. The robot only becomes active when its heart is present.
The central idea behind AmViSH v1 is a physical life trigger.
The robot's chest contains a transparent circular chamber representing the heart. Inside this chamber is a hidden digital Hall-effect sensor connected to the Arduino controller.
The heart itself is a coin-shaped cardboard module containing a small magnet.
When the heart is placed into the chest cavity, the magnetic field activates the sensor and triggers the robot's Active Mode. When the heart is removed, the robot transitions into Alert Mode.
This design intentionally mirrors a biological metaphor: the robot is only “alive” when it has its heart.
Beyond symbolism, the heart module acts as a physical state switch that controls the robot's behavioral system.
When the heart module is inserted, the Hall sensor detects the magnetic field and the robot enters Active Mode.
In this state the robot expresses signs of life through visual behavior patterns.
The transparent chest window contains a ring of individually controlled LEDs positioned behind the surface. These LEDs activate sequentially in a looping pattern, producing a rotating energy-like animation that gives the appearance of a living technological core.
At the same time, the robot's eyes — constructed from LEDs behind a translucent sheet — begin a slow PWM fade pattern. The fading light creates the illusion of blinking awareness and attention.
Together, these behaviors simulate the presence of an active internal system.
In addition to visual behavior, the robot can speak using an external AI voice system. A smartphone running ChatGPT voice mode connects through Bluetooth to an internal amplifier module and speaker system inside the robot's body. This allows the robot to respond verbally using an assistant-style voice.
The combination of light animation, behavioral responses, and speech creates a convincing perception of an active machine.
Removing the heart module immediately transitions the robot into Alert Mode.
Without the heart, the robot visually communicates system distress.
The LED ring inside the chest switches from sequential animation to a synchronized blinking pattern. All LEDs flash on and off simultaneously, producing a strong warning signal.
A repeating audible beep alert is also generated through the speaker system.
At the same time, the eye lights shut down completely, removing the sense of awareness that existed in Active Mode.
This behavioral shift reinforces the symbolic design: without its heart, the robot is no longer alive.
AmViSH v1 was constructed using accessible prototyping components integrated inside a cardboard body structure.
The chest compartment houses the main electronics system, including the microcontroller, audio hardware, and LED drivers.
The robot is powered through a USB connection, simplifying the electrical design and allowing stable operation during testing and demonstrations.
The control system is based on an Arduino Uno, which manages the sensor input, LED animations, and audio alerts.
The software controlling the robot was written in Arduino C++, using a simple behavioral state machine.
The program relies on if–else logic combined with non-blocking timing techniques to manage LED animations without interrupting sensor monitoring.
AmViSH v1 served as the first experimental prototype exploring how behavioral design and simple sensor systems can create the perception of life in a robotic machine.
The project demonstrated that expressive behaviors — such as light patterns, symbolic components, and responsive feedback — can strongly influence how humans perceive intelligence and presence in machines.
Following this initial prototype, a second iteration of the system, AmViSH 2.0, was later developed to expand upon the ideas introduced in the first version.
While AmViSH v1 focused primarily on symbolic interaction and behavioral expression, later iterations of the project explore how embodied systems can combine hardware design, sensing, and intelligent behavior to create more advanced robotic experiences.
The AmViSH series continues as an ongoing exploration into how physical design, behavioral logic, and artificial intelligence can work together to produce machines that feel more lifelike to human observers.