BrainThy — Real-Time Muay Thai Neuro-Performance Engine

Inspiration

Combat sports like Muay Thai expose strengths and weaknesses instantly. But most training focuses only on physical metrics: speed, accuracy, reaction time, punches landed. What remains invisible is the mental layer — stress, focus, cognitive load, emotional control, decision-making under pressure.

We wanted to answer a question normally limited to neuroscience labs:

What actually happens inside your brain during a fight?

BrainThy brings together VR combat, real-time biosensing, and deep neuro-analysis to create a new category of athletic training: neuro-performance.

What It Does

BrainThy is an end-to-end neuro-adaptive combat training system that measures both physical and mental performance simultaneously.

When the user fights an AI Muay Thai agent inside a Unity-built VR arena (Varjo XR-3), the system:

1. Tracks Body and Combat Performance

  • Punches, kicks, blocks, dodges
  • Reaction time and timing precision
  • Defensive success rate and technique efficiency
  • Movement quality based on Unity collision and motion events
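
For concreteness, here is a sketch of what one timestamped combat event might look like by the time it reaches the Python backend. The field names are illustrative assumptions, not BrainThy's actual schema.

```python
# Hypothetical shape of a combat event emitted by the Unity event hooks.
# Field names are illustrative, not the project's actual schema.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class CombatEvent:
    kind: str           # "punch", "kick", "block", or "dodge"
    landed: bool        # whether the strike or block succeeded
    reaction_ms: float  # time from opponent cue to player response
    t_unix: float       # wall-clock timestamp for cross-stream alignment

event = CombatEvent(kind="punch", landed=True, reaction_ms=312.5, t_unix=time.time())
print(json.dumps(asdict(event)))  # serialized for the Python processing server
```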

2. Reads Brain and Physiology in Real Time (OpenBCI Galea)

We tap into multiple biosignal modalities:

  • EEG: focus, stress, cognitive load, flow state
  • EMG: muscle activation and fatigue
  • EDA: emotional arousal and anxiety
  • PPG: heart rate, HRV, SpO2
  • Eye Tracking: visual attention shifts, threat detection patterns
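
To make the EEG modality concrete, here is a minimal sketch of extracting band power with SciPy's Welch PSD; the beta/alpha ratio is one common rough proxy for engagement. The sampling rate, window length, and proxy choice are assumptions for illustration, not Galea-specific processing.

```python
# Minimal sketch: turning a raw EEG window into band-power features.
# Sampling rate, window length, and the beta/alpha proxy are assumptions.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed EEG sampling rate (Hz)

def band_power(x: np.ndarray, lo: float, hi: float, fs: int = FS) -> float:
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs < hi)
    return float(np.sum(psd[mask]) * (freqs[1] - freqs[0]))

window = np.random.randn(FS * 4)   # stand-in for 4 s of one EEG channel
alpha = band_power(window, 8, 13)  # relaxed/idling rhythm
beta = band_power(window, 13, 30)  # active engagement
print("beta/alpha (rough engagement proxy):", round(beta / alpha, 2))
```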

3. Streams All Data Live to a Web Dashboard

A React dashboard visualizes:

  • Real-time EEG bands and brain-state indicators
  • HRV curves and stress peaks
  • Muscle activation timelines
  • Cognitive-load graphs
  • Punch/block accuracy, efficiency, and timing
  • A combined Neuro-Performance Score
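
The writeup does not spell out the score's formula, so here is one plausible shape only: a weighted blend of normalized physical and mental sub-scores, with placeholder weights.

```python
# Illustrative sketch of a combined Neuro-Performance Score as a weighted
# blend of normalized sub-scores. Weights are placeholders, not the formula.
def neuro_performance_score(accuracy: float, reaction: float,
                            focus: float, stress: float) -> float:
    """All inputs in [0, 1]; stress is inverted so calmer fighters score higher."""
    score = (0.3 * accuracy            # punch/block accuracy
             + 0.2 * reaction          # normalized reaction speed
             + 0.3 * focus             # EEG-derived focus index
             + 0.2 * (1.0 - stress))   # EDA/HRV-derived stress index
    return round(100 * score, 1)

print(neuro_performance_score(accuracy=0.82, reaction=0.70, focus=0.65, stress=0.40))
```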

4. Records and Saves Every Session

Every training round is stored, timestamped, and indexed, allowing athletes to review:

  • Stress tolerance
  • Focus consistency
  • Reaction performance
  • Neuro-physical correlations
  • Growth over time
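
A minimal sketch of the storage step, assuming an append-only JSON-lines log keyed by timestamp; the actual file format and fields are not specified here.

```python
# Append-only session log: one JSON line per round, keyed by timestamp so
# rounds can be indexed and replayed later. The layout is an assumption.
import json
import pathlib
import time

SESSIONS = pathlib.Path("sessions.jsonl")

def save_round(summary: dict) -> None:
    record = {"t": time.time(), **summary}
    with SESSIONS.open("a") as f:
        f.write(json.dumps(record) + "\n")

save_round({"round": 1, "punch_accuracy": 0.78, "mean_rmssd_ms": 42.3})
```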

BrainThy turns combat analytics into neuroscience.

How We Built It

The system is composed of five major components:

1. Unity Combat Engine

  • Custom Muay Thai AI opponent
  • Player interaction, collisions, hit detection
  • Event hooks that timestamp combat actions
  • Integrated with Varjo XR-3 for high-fidelity VR
  • Mixed reality support for spatial awareness

2. OpenBCI Galea Sensor Integration

Galea provides EEG, EMG, EDA, PPG, eye tracking, and IMU data. The data pipeline: Galea → Galea GUI → Python processing server → Node.js WebSocket bridge → React dashboard.
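
On the Python end of that pipeline, one common way to receive streams published by a GUI is Lab Streaming Layer. The sketch below assumes an LSL stream of type "EEG" is available; whether BrainThy uses LSL or another transport is an assumption for illustration.

```python
# Sketch of stream ingestion on the Python side via Lab Streaming Layer.
# Assumes the upstream GUI publishes an LSL stream of type "EEG".
from pylsl import StreamInlet, resolve_stream  # pip install pylsl

streams = resolve_stream("type", "EEG")  # blocks until an EEG stream appears
inlet = StreamInlet(streams[0])

for _ in range(250):                     # read roughly 1 s at 250 Hz
    sample, timestamp = inlet.pull_sample()
    # sample: list of channel values; timestamp: LSL clock, useful for sync
```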

3. Python Backend (Signal Processing)

Python handles:

  • Raw EEG/EMG/EDA/PPG stream ingestion
  • Noise filtering and band extraction
  • Feature generation:
    • Alpha/Beta/Gamma ratios
    • Cognitive load estimate
    • Stress and arousal metrics
    • HRV (RMSSD, SDNN)
    • Muscle fatigue signatures
  • Data packaging + session indexing
  • Long-term storage
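
Of the features above, HRV is the easiest to make concrete. A minimal sketch, assuming a clean RR-interval series in milliseconds (peak detection from the raw PPG is omitted):

```python
# RMSSD and SDNN from RR intervals (ms). The RR series here is illustrative;
# in the real pipeline it would come from PPG peak detection.
import numpy as np

def rmssd(rr_ms: np.ndarray) -> float:
    """Root mean square of successive RR-interval differences."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

def sdnn(rr_ms: np.ndarray) -> float:
    """Standard deviation of the RR intervals."""
    return float(np.std(rr_ms, ddof=1))

rr = np.array([812, 790, 845, 801, 778, 830], dtype=float)  # example beats
print(f"RMSSD: {rmssd(rr):.1f} ms, SDNN: {sdnn(rr):.1f} ms")
```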

4. Node.js + WebSockets Layer

  • Lightweight, sub-100ms broadcast system
  • Streams real-time data to the React dashboard
  • Handles multi-client synchronization
  • Manages backpressure and rate control
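
On the Python side of this bridge, the handoff can be as small as a WebSocket publish. A sketch assuming the `websockets` package and a hypothetical `/ingest` endpoint on the Node server:

```python
# Push one processed feature packet to the Node.js bridge over a WebSocket.
# The URL and packet shape are assumptions for illustration.
import asyncio
import json
import websockets  # pip install websockets

async def publish(packet: dict, url: str = "ws://localhost:8080/ingest") -> None:
    async with websockets.connect(url) as ws:
        await ws.send(json.dumps(packet))

asyncio.run(publish({"t": 0.0, "alpha": 1.2, "beta": 2.9, "stress": 0.41}))
```

In a real sub-100ms stream, the connection would be held open and packets sent in a loop rather than reconnecting per message.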

5. React Frontend

  • Real-time EEG graphs
  • Stress and focus gauges
  • HRV timelines
  • Muscle activation charts
  • Combat stats overlay
  • Session review and playback

The full stack forms a continuous pipeline: VR → biosensing → analysis → visualization.

Challenges We Ran Into

  • Synchronizing EEG, EMG, EDA, PPG, eye tracking, and combat events in real time
  • Handling high-frequency EEG streams while maintaining VR frame stability
  • Creating a low-latency Python → Node → React WebSocket pipeline
  • Cleaning biosignals affected by movement (Muay Thai is high-impact)
  • Designing combat metrics that correlate meaningfully with mental states
  • Getting the Varjo XR-3, Galea, Unity, and the web pipeline to play together without conflicts
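
The first challenge is worth making concrete: every combat event has to be mapped onto a sample index in each biosignal stream. A sketch of nearest-sample alignment, assuming all clocks share a common epoch:

```python
# Map a combat event's wall-clock timestamp to the nearest EEG sample index.
# Assumes Unity and the biosignal stream share a synchronized clock.
import numpy as np

FS = 250                                     # assumed EEG sampling rate (Hz)
t0 = 1_700_000_000.0                         # stream start (unix seconds)
sample_times = t0 + np.arange(FS * 60) / FS  # one minute of sample timestamps

def nearest_sample(event_t: float) -> int:
    i = int(np.searchsorted(sample_times, event_t))
    i = min(i, len(sample_times) - 1)
    # step back if the previous sample is closer in time
    if i > 0 and event_t - sample_times[i - 1] < sample_times[i] - event_t:
        i -= 1
    return i

print(nearest_sample(t0 + 12.3456))  # sample index for a punch at t0 + 12.35 s
```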

Accomplishments We’re Proud Of

  • Built a functioning real-time neuro-combat pipeline
  • Achieved stable EEG + HRV + EMG streaming inside a live fight
  • Created a Muay Thai VR sparring partner with synchronized biosignal analytics
  • Designed the first prototype of a Neuro-Performance Score
  • Combined neuroscience, VR, and martial arts into one cohesive training system

What We Learned

  • Biosignals are noisy; preprocessing and filtering matter more than expected
  • EEG baselines vary widely between users; per-user normalization is challenging
  • HRV and EDA strongly correlate with perceived stress during combat
  • Eye tracking + EEG together give a much richer picture of panic, focus, and hesitation
  • Timing synchronization is the hardest part of VR + biosensing
  • Mental performance affects physical outcomes more than athletes realize

What's Next for BrainThy

1. Expanding Beyond Muay Thai

Next supported sports:

  • Boxing
  • MMA
  • Taekwondo
  • Wrestling
  • Fencing
  • Tactical, police, and military reaction drills

2. Adaptive AI Opponent

  • AI adjusts aggression based on stress
  • Focus-based difficulty scaling
  • Personalized challenge curves
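
As a sketch of what stress-driven scaling could look like (the thresholds and gains are placeholders, not a tuned policy):

```python
# Illustrative stress-adaptive difficulty: ease off aggression when stress
# spikes, ramp it up when the athlete is calm and focused. Placeholder values.
def opponent_aggression(stress: float, focus: float, base: float = 0.5) -> float:
    """stress and focus in [0, 1]; returns an aggression level in [0, 1]."""
    level = base + 0.4 * focus - 0.5 * max(0.0, stress - 0.6)
    return min(1.0, max(0.0, level))

print(opponent_aggression(stress=0.8, focus=0.7))  # backs off under high stress
```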

3. Personalized Neuro-Training Programs

  • Breathwork modules based on stress levels
  • Cognitive-load sprints
  • Reaction-time drills
  • Focus endurance sessions

4. Multi-Athlete Comparison

  • Compare yourself to your past
  • Compare to elite benchmarks
  • Style-based analytics

5. Full Cloud Sync

  • Auto-upload training sessions
  • Analysis for coaches and teams
  • Long-term neuro-performance tracking

BrainThy aims to redefine how athletes train by unifying mental and physical analytics into one system.
