Project Overview
Role: UX Engineer • Creative Technologist
Duration: April 2025 – Ongoing
Type: Personal Project / Prototype
Tools: HTML5, JavaScript, TensorFlow.js, Teachable Machine, Lottie
Platform: Web
I set out to design a playful, real-time experience that uses sound recognition and motion to bring the crowd to life—before the first act even hits the stage.
How It Works
By detecting specific crowd sounds—like claps or cheers—the app triggers real-time visual feedback and playful audio cues, inviting the audience to interact and shape the experience together.
- The app listens for real crowd sounds, like clapping or cheering, and responds instantly with vibrant animations.
- The more the audience participates, the faster and more intense the animations become.
- Includes glowing bars, expanding circles, and celebratory visual layers like confetti or pulse effects.
- Encourages collective interaction: everyone contributes to speeding up the visuals (see the sketch below for how sound events can map to animation energy).
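
To make that mapping concrete, here is a minimal sketch of the kind of logic involved. The `energy` score, class labels, increments, and decay rate are hypothetical illustrations, not the exact values used in the prototype.

```js
// Hypothetical "hype" score that detected crowd sounds feed into.
let energy = 0; // 0–100

function onCrowdSound(label, confidence) {
  if (confidence < 0.75) return;            // ignore uncertain detections
  if (label === 'Clap')  energy = Math.min(100, energy + 5);
  if (label === 'Cheer') energy = Math.min(100, energy + 10);
}

// Energy decays over time, so the crowd has to keep participating.
setInterval(() => { energy = Math.max(0, energy - 2); }, 500);

// Visuals read the score: higher energy means faster, more intense animation.
function animationSpeed() {
  return 1 + energy / 50; // 1× at rest, up to 3× at full energy
}
```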

How It Was Made
AI Sound Classifier
This browser-based experience was built with a focus on real-time interactivity and seamless performance across devices. I started by training a custom audio model using Teachable Machine, capturing crowd sounds like claps and cheers. The model was deployed using TensorFlow.js and runs entirely in the browser — no backend required.
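
For context, this is roughly how a Teachable Machine audio model is loaded and run in the browser with the TensorFlow.js speech-commands library. The model URL is a placeholder, and `onCrowdSound` is the hypothetical handler sketched earlier; treat this as a sketch rather than the exact production code.

```js
// Assumes tf.js and @tensorflow-models/speech-commands are loaded via <script> tags.
const MODEL_URL = 'https://teachablemachine.withgoogle.com/models/<model-id>/'; // placeholder

async function createRecognizer() {
  const recognizer = speechCommands.create(
    'BROWSER_FFT',                        // use the browser's native FFT for audio features
    undefined,
    MODEL_URL + 'model.json',
    MODEL_URL + 'metadata.json'
  );
  await recognizer.ensureModelLoaded();   // download weights before listening
  return recognizer;
}

async function startListening() {
  const recognizer = await createRecognizer();
  const labels = recognizer.wordLabels(); // e.g. ['Background Noise', 'Clap', 'Cheer']

  recognizer.listen(result => {
    // result.scores lines up with labels; act on the most confident class.
    const top = result.scores.indexOf(Math.max(...result.scores));
    onCrowdSound(labels[top], result.scores[top]);
  }, {
    probabilityThreshold: 0.75, // skip low-confidence frames
    overlapFactor: 0.5          // how frequently predictions fire
  });
}
```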

Interactive Visual Feedback
When the app detects a specific crowd sound, it triggers responsive animations and feedback. I used Lottie animations to bring a panda in a rocket to life, along with a glowing vertical “Loudest Crowd” meter that tracks current audience volume against past highs.
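
To show the Lottie side, a setup along these lines (using lottie-web) loads the rocket animation and ties its playback speed to the crowd's energy. The file path, container ID, and the `animationSpeed()` helper from the earlier sketch are assumptions for illustration.

```js
import lottie from 'lottie-web';

// Load the panda-in-a-rocket animation (path and container ID are placeholders).
const rocket = lottie.loadAnimation({
  container: document.getElementById('rocket'),
  renderer: 'svg',
  loop: true,
  autoplay: true,
  path: 'animations/panda-rocket.json'
});

// Re-check the crowd's energy every frame and speed the rocket up as it climbs.
(function syncRocket() {
  rocket.setSpeed(animationSpeed());
  requestAnimationFrame(syncRocket);
})();
```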
Dynamic Speech Feedback
The Speech Synthesis API adds voice-based prompts, nudging users to clap or cheer at key moments, while a canvas-based visualizer reacts to ambient noise in real time with animated bars and pulsing effects.
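
Both pieces build on standard browser APIs: speechSynthesis for the spoken prompts, and a Web Audio AnalyserNode feeding a canvas for the volume bars. The sketch below assumes a `<canvas id="visualizer">` element and simplified styling.

```js
// Spoken prompt via the Web Speech API.
function speakPrompt(text) {
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

// Canvas visualizer driven by microphone input through an AnalyserNode.
async function startVisualizer() {
  const canvas = document.getElementById('visualizer'); // placeholder ID
  const ctx = canvas.getContext('2d');

  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioCtx = new AudioContext();
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;
  audioCtx.createMediaStreamSource(stream).connect(analyser);

  const data = new Uint8Array(analyser.frequencyBinCount);
  (function draw() {
    requestAnimationFrame(draw);
    analyser.getByteFrequencyData(data);
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    const barWidth = canvas.width / data.length;
    data.forEach((value, i) => {
      const barHeight = (value / 255) * canvas.height;
      ctx.fillRect(i * barWidth, canvas.height - barHeight, barWidth - 1, barHeight);
    });
  })();
}

speakPrompt('Up next: Clap!');
startVisualizer();
```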

What the Crowd Sees, What It Does
This annotated interface shows how each UI element contributes to the interactive warm-up experience. From visual prompts to real-time feedback, every part of the design was crafted to keep the audience engaged and energized.

- Score Meter tracks crowd volume and cheers, adding momentum as users participate.
- Rocket Animation (powered by Lottie) reflects progress and builds anticipation for the launch moment.
- “Loudest Crowd” Leaderboard shows scores from previous venues, adding a competitive edge (see the sketch after this list).
- Call-to-Action Prompts (e.g., “Up next: Clap!”) guide the audience in real time, syncing behavior across the crowd.
- Visual Feedback Effects like pulsing circles, glowing bars, and confetti add excitement with every interaction.
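
Since the prototype runs without a backend, the “Loudest Crowd” comparison can be kept entirely client-side. Here is a minimal sketch, assuming past peaks are stored in localStorage under a hypothetical key; a true multi-venue leaderboard would need shared storage.

```js
// Compare tonight's peak against the stored record and update it if beaten.
function updateLoudestCrowd(currentPeak) {
  const best = Number(localStorage.getItem('loudestCrowd') || 0); // hypothetical key
  if (currentPeak > best) {
    localStorage.setItem('loudestCrowd', String(currentPeak));
    return { newRecord: true, best: currentPeak };
  }
  return { newRecord: false, best };
}
```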
Conclusion & Next Steps
Blurring the Line Between Audience and Show
This project was a creative exploration of how AI and interaction design can turn passive wait time into a collective moment of energy and excitement. By combining real-time sound recognition, expressive animations, and clear audience prompts, I created a browser-based experience that blurs the line between audience and performance.
The Crowd Wants More
Next, I’d love to expand this prototype by:
- Introducing team-based modes (e.g., left vs. right side of the venue).
- Exploring mobile sync, allowing users to join in from their phones.
- Experimenting with gesture recognition to complement voice input.
This is just the beginning—there’s so much potential to enhance live events through interactive tech that listens, responds, and amplifies the crowd.