I built VJam, a browser-based VJ app that turns your phone into a live visual controller.
What it does: Listens to music through the mic, detects BPM, and generates real-time beat-reactive visuals. 180+ presets, touch gestures, blend modes, text overlays, video/image layers — all in the browser.
Tech stack: Vanilla JS, Web Audio API (FFT + beat detection), p5.js for Canvas rendering, PWA (works offline after first load). No frameworks, no build step, no server — just static files on Vercel.
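For anyone curious about the beat-detection side: a common vanilla-JS approach is an energy threshold over the low FFT bins. This is a minimal hypothetical sketch (not VJam's actual code), assuming byte-frequency data of the kind `AnalyserNode.getByteFrequencyData` produces:

```javascript
// Hypothetical energy-based beat detector (illustration, not VJam's code).
// `bins` is a Uint8Array of frequency magnitudes (0-255), as produced by
// AnalyserNode.getByteFrequencyData() in the Web Audio API.
function createBeatDetector({ historySize = 43, sensitivity = 1.4 } = {}) {
  const history = []; // rolling window of recent low-end energy values
  return function detect(bins) {
    // Average the low-frequency bins, where kicks and bass live.
    const lowEnd = bins.slice(0, 32);
    const energy = lowEnd.reduce((sum, v) => sum + v, 0) / lowEnd.length;
    const avg = history.length
      ? history.reduce((s, v) => s + v, 0) / history.length
      : 0;
    history.push(energy);
    if (history.length > historySize) history.shift();
    // A beat is an energy spike well above the recent average.
    return history.length > 1 && energy > avg * sensitivity;
  };
}
```

In the browser you would call the returned function once per animation frame with fresh analyser data; `historySize` of ~43 frames is roughly one second at 43 fps of analysis.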
Why I built it: Professional VJ software (Resolume, VDMX) costs $200-400 and requires a laptop. I wanted something that works on a phone, connects to a projector via HDMI adapter, and costs less than a night out.
How it works: Open the URL on your phone, allow mic access, and BPM detection kicks in. Tap/swipe to control visuals. Plug an HDMI adapter into a projector for the full experience.
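Once beat onsets are coming in, turning them into a BPM reading can be as simple as taking the median inter-beat interval. A hypothetical sketch of that step (an illustration under my own assumptions, not VJam's implementation):

```javascript
// Hypothetical BPM estimator (illustration, not VJam's code).
// `onsets` is an array of beat timestamps in seconds, e.g. collected
// each time a beat detector fires on an audio frame.
function estimateBpm(onsets) {
  if (onsets.length < 2) return null; // need at least one interval
  // Intervals between consecutive beats, in seconds.
  const intervals = onsets.slice(1).map((t, i) => t - onsets[i]);
  // The median interval is robust to a missed or doubled beat.
  const sorted = [...intervals].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  const median = sorted.length % 2
    ? sorted[mid]
    : (sorted[mid - 1] + sorted[mid]) / 2;
  return 60 / median; // seconds per beat -> beats per minute
}
```

Using the median instead of the mean keeps one flam or dropped beat from yanking the tempo readout around.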
Free tier: 36 presets + auto mode + tap controls + blend/filter + BPM picker. Full version ($27 one-time) unlocks all 180 presets + text editing + video/image layers + scenes + grid.
Try it: https://vjam.vercel.app
2-min demo: https://www.youtube.com/watch?v=_nrSgubFBL8
Happy to answer questions about the Web Audio API, real-time canvas rendering on mobile, or anything else.