Would You Understand

A two-person heartbeat-sharing device that creates physical empathy through real-time haptic feedback, exploring whether feeling another's heartbeat can build understanding when words fail.

Course: Independent Research (Delta Residency) | Timeline: 12 weeks | Team: Solo | Status: Functional Prototype (Dual-Person System)

Technologies: Python (OpenCV, NumPy, SciPy), Arduino Mega 2560, rPPG, haptic motors, HC-05 Bluetooth

The Question

Can physically feeling another person's heartbeat increase empathy and emotional connection when words fail?

In moments of emotional overwhelm, social stress, or communication barriers, we often retreat into silence. Yet our hearts continue to beat—a rhythm that speaks louder than any explanation. This project investigates whether transmitting one person's heartbeat as tactile vibrations to another can create a channel for understanding that transcends language and operates at a physiological level.

The Approach

Rather than displaying heartbeat data visually or aurally, I wanted to create a direct physical sensation—translating one person's heart rhythm into haptic vibrations felt by another. This required solving three major technical and conceptual challenges:

Challenge 1: Contactless Detection – Wired sensors on fingers or chest break intimacy and limit interaction. Solution: Implemented smartphone camera-based remote photoplethysmography (rPPG) that detects blood flow color changes in facial skin—no sensors needed.
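The core of rPPG is sampling the green channel (where hemoglobin absorption is strongest) inside the detected face region, frame by frame. A minimal sketch of that sampling step—the function name and synthetic frame are illustrative, not taken from the project code:

```python
import numpy as np

def roi_green_mean(frame, box):
    """Mean green-channel intensity inside a face bounding box.

    frame: HxWx3 uint8 BGR image (as delivered by OpenCV);
    box: (x, y, w, h) from the face detector.
    One such value per frame forms the raw rPPG time series.
    """
    x, y, w, h = box
    roi = frame[y:y + h, x:x + w]
    return float(roi[:, :, 1].mean())  # index 1 = green in BGR order

# Synthetic demo: a frame whose green channel is uniformly 120
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[:, :, 1] = 120
sample = roi_green_mean(frame, (100, 100, 200, 200))
```

Subtle periodic changes in this per-frame mean, driven by blood volume under the skin, carry the pulse signal.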

Challenge 2: Realistic Haptic Sensation – Raw sensor-to-motor mapping creates erratic, clinical-feeling vibrations. Solution: Developed temporal smoothing and beat interpolation to generate physiologically plausible "lub-dub" patterns, making the sensation feel like a real heartbeat rather than robotic pulses.
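One way to sketch the "lub-dub" shaping: instead of mapping BPM to a single pulse, each beat period is split into a strong first pulse, a short gap, a softer second pulse, and a rest. The function name, ratios, and PWM scale below are illustrative assumptions, not the project's exact constants:

```python
def lub_dub_schedule(bpm, dub_ratio=0.6, split_ratio=0.18):
    """Timing/intensity schedule for one 'lub-dub' beat cycle.

    Returns [(duration_ms, pwm_0_255), ...] covering one full beat:
    strong 'lub', brief gap, softer 'dub', then rest until the next
    beat. Ratios are plausible defaults, not measured values.
    """
    period_ms = 60000.0 / bpm
    lub_ms = 0.10 * period_ms          # strong first pulse
    gap_ms = split_ratio * period_ms   # S1-S2 separation
    dub_ms = 0.08 * period_ms          # softer second pulse
    rest_ms = period_ms - (lub_ms + gap_ms + dub_ms)
    return [
        (lub_ms, 255),
        (gap_ms, 0),
        (dub_ms, int(255 * dub_ratio)),
        (rest_ms, 0),
    ]

schedule = lub_dub_schedule(72)
```

On the Arduino side, the same schedule translates directly into `analogWrite()` calls separated by the listed durations; scaling durations with the live BPM keeps the pattern physiologically plausible as the heart rate changes.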

Challenge 3: Dual-Person Synchronization – Tracking and transmitting two people's heart rates simultaneously while maintaining low latency is non-trivial. Solution: Face cascade classifier with size-based sorting, dual independent signal buffers, and serial communication via Arduino.
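The size-based sorting step can be sketched as follows: keep the two largest detections (filtering out small background faces) and label them by horizontal position so each person stays bound to the same motor channel. The function and labels are illustrative:

```python
def assign_two_faces(detections):
    """Pick the two largest detected faces and label them by position.

    detections: list of (x, y, w, h) boxes from a Haar cascade.
    Returns {'left': box, 'right': box} (fewer entries if fewer
    faces). Sorting by area discards small background faces;
    sorting by x keeps person-to-motor assignment stable.
    """
    largest = sorted(detections, key=lambda b: b[2] * b[3], reverse=True)[:2]
    by_x = sorted(largest, key=lambda b: b[0])
    return dict(zip(['left', 'right'], by_x))

faces = assign_two_faces([(50, 80, 120, 120), (400, 90, 110, 110), (10, 10, 30, 30)])
```

Each labeled face then feeds its own independent signal buffer, so the two rPPG pipelines never contaminate each other.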

Technical Implementation

Hardware: Arduino Mega 2560, 2x Coin Vibration Motors (1027 type, 3V), MacBook Webcam (720p, 30fps), External 5V Power Supply, HC-05 Bluetooth Module

Software: Python 3.13 with OpenCV (face detection), NumPy (signal processing), SciPy (FFT), PySerial (Arduino communication); Arduino C++ with PWM motor control, dual-motor timing, and LED feedback
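The FFT stage of the pipeline can be sketched like this: detrend the buffered green-channel series, take the magnitude spectrum, and pick the peak inside a plausible heart-rate band. The band limits and function name below are illustrative assumptions:

```python
import numpy as np

def estimate_bpm(signal, fps=30.0, lo=0.7, hi=3.5):
    """Estimate heart rate from a green-channel time series via FFT.

    Subtracts the mean, computes the real FFT, and returns the peak
    frequency inside 0.7-3.5 Hz (42-210 BPM) scaled to beats per
    minute. Band limits are illustrative, not the project's exact
    values.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                     # remove DC / slow lighting drift
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    mag = np.abs(np.fft.rfft(x))
    band = (freqs >= lo) & (freqs <= hi)
    return 60.0 * freqs[band][np.argmax(mag[band])]

# Synthetic 10-second pulse at 1.2 Hz (72 BPM) sampled at 30 fps
t = np.arange(300) / 30.0
bpm = estimate_bpm(100 + 2.0 * np.sin(2 * np.pi * 1.2 * t))
```

A 10-second buffer at 30 fps gives 0.1 Hz frequency resolution (6 BPM bins), which is why temporal smoothing over successive estimates matters for the final output.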

Detection Accuracy: ±3 BPM (vs clinical reference)

Update Frequency: 2 Hz (BPM recalculation rate)

Latency: 1-2 sec (face detection to haptics)

Process & Iteration

Weeks 1-3: Researched rPPG algorithms, built Arduino motor tests, implemented basic face detection + FFT pipeline. Breakthrough: Discovered direct motor connection works with external power (no transistor needed).

Week 4 Critical Debugging: Serial communication breakdown—Python sending data but Arduino not receiving. Root cause: Arduino IDE Serial Monitor was locking the port. Solution: Close IDE completely before running Python.

Weeks 5-7: Modified code for independent dual-motor control. Added LED feedback system for silent operation confirmation. Implemented 3-second timeout to stop motors when no face detected.
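The 3-second timeout is a small but important safety behavior: vibration should never continue on a stale reading. A minimal host-side sketch (function name and loop wiring are illustrative):

```python
import time

def should_stop(last_face_time, now=None, timeout_s=3.0):
    """True when no face has been seen for `timeout_s` seconds.

    `last_face_time` is updated every frame a face is detected
    (e.g. via time.monotonic()). In the main loop this gates a
    'motors off' command to the Arduino.
    """
    now = time.monotonic() if now is None else now
    return (now - last_face_time) > timeout_s

# Deterministic checks with explicit timestamps
stale = should_stop(0.0, now=3.5)   # 3.5 s without a face
fresh = should_stop(0.0, now=2.0)   # face seen 2 s ago
```

Using a monotonic clock rather than wall-clock time avoids spurious stops if the system clock is adjusted mid-session.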

Weeks 8-10: Validated dual-motor system, tested with reference pulse oximeter (±3 BPM accuracy confirmed), prepared comprehensive documentation.

Weeks 11-12: Final testing with actual users, acquired HC-05 Bluetooth module for wireless next iteration, focused on UX rather than technical complexity.

Outcome & Reflection

Successfully created a functional prototype that generates immediate, intuitive heartbeat sensation. Key findings:

  • rPPG accuracy consistently within ±3 BPM of clinical reference
  • Real-time latency (1-2 sec) feels immediate and responsive to users
  • "Lub-dub" vibration pattern recognized as heartbeat immediately without explanation
  • Dual-person detection works reliably with two faces simultaneously

Key Insight: The technical challenge wasn't the algorithm—it was the interface. The biggest breakthroughs came from constraints that forced simplification: eliminating Serial Monitor dependency (LED feedback instead), auto-stopping when no face detected, simplifying data format for Arduino parsing. The system works best when it "disappears"—when users stop thinking about the technology and just experience the sensation.
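As an illustration of the "simplified data format" point, a hypothetical minimal wire format for the two BPM values might look like the pair below (the exact format used by the project is not documented here; this is an assumed example of the principle):

```python
def encode_bpm_pair(bpm_a, bpm_b):
    """Serialize two BPM values as one newline-terminated ASCII line.

    Hypothetical format 'A,B\n': trivial for the Arduino to parse
    with integer reads split on the comma, and readable by eye when
    debugging a raw serial dump.
    """
    return f"{int(round(bpm_a))},{int(round(bpm_b))}\n".encode('ascii')

def decode_bpm_pair(line):
    """Host-side reference parser (inverse of encode_bpm_pair)."""
    a, b = line.decode('ascii').strip().split(',')
    return int(a), int(b)

msg = encode_bpm_pair(71.6, 84.2)
```

Keeping the protocol to one short line per update is exactly the kind of simplification that lets the Arduino side stay dumb and reliable.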

Next Steps:

  • Conduct a structured user study with 10+ participant pairs
  • Implement HC-05 Bluetooth wireless communication
  • Design soft silicone/fabric enclosures for comfortable wear
  • Explore cross-distance heartbeat sharing (internet-based)
  • Develop an empathy measurement protocol (IRI scale, qualitative interviews)
  • Adapt for accessibility (non-visual perception for deaf and blind users)

Gallery

Dual-face detection screenshots, Arduino setup with motors, BPM data stream terminal output, LED confirmation flashing, close-up motor vibration patterns, user testing sessions

Documentation

GitHub Repository – Complete source code, setup instructions, troubleshooting guide

Technical Wiki – Detailed architecture, algorithm explanations, user testing protocol

References

  • Goldstein et al. (2017): Brain-to-brain coupling during handholding
  • Verkruysse et al. (2008): Remote photoplethysmographic imaging
  • Jakubiak & Feeney (2017): Affectionate touch and cardiovascular reactivity