I've been building something. It started the way most side projects do. I wanted a tool that didn't exist, got annoyed, and opened a text editor.
RiffLogic is a guitar practice and learning tool that runs in the browser. You can explore scales and modes on an interactive fretboard, flip through chord voicings, load Guitar Pro tabs, and play along with YouTube backing tracks. All synced together. All in one place.
I've been playing guitar for the past year. I've also been writing software for several years. At some point those two worlds were bound to collide.

There are guitar tools out there. Songsterr exists. Ultimate Guitar exists. But nothing quite worked the way I wanted. I wanted to load my own tabs, hear them with decent-sounding instruments, and practice specific sections on loop without fighting the UI. I wanted the fretboard to light up and show me what's happening in real time. I wanted to pull up a YouTube track and have the tab follow along.
The Stack
No frameworks. Vanilla JavaScript, Web Audio API, and a whole lot of Canvas rendering. FluidSynth compiled to WASM handles the instrument sounds through a 247MB General MIDI soundfont. The fretboard is SVG. The tab viewer is Canvas. The server is a simple Express app that proxies YouTube audio. Prototype-ish.
The Hard Parts
The Dual Clock Problem. This one almost broke me. When you play a YouTube backing track alongside a Guitar Pro tab, you have two independent clocks. The Web Audio API has its own clock. The HTML audio element has its own clock. They drift apart over time. Slowly at first. Then enough to notice. Then enough to ruin the experience.
I tried everything. Polling with requestAnimationFrame. Periodic seeking to correct the drift. Audio analysis to detect where the beat actually lands. None of it worked reliably. The solution ended up being surprisingly simple in concept but tricky in practice. I flipped the clock hierarchy. Instead of the tab player running on its own clock and trying to match the audio, it reads time directly from the audio element. The audio becomes the single source of truth.
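That inversion is easy to sketch. Here's a minimal illustration of the idea (the names `audioTimeToBeat` and `startTabSync` are mine, not RiffLogic's actual API, and the constant-tempo mapping is a simplification): the tab player never advances its own timer; on every frame it derives its position from the audio element's `currentTime`, so drift cannot accumulate.

```javascript
// Pure mapping from audio time to tab position, assuming a constant tempo.
// (Real recordings aren't constant-tempo; see the checkpoint system below.)
function audioTimeToBeat(seconds, bpm) {
  return seconds * (bpm / 60);
}

// Browser-side loop: the <audio> element is the single source of truth.
// The tab cursor is recomputed from audioEl.currentTime every frame
// instead of being advanced by its own clock.
function startTabSync(audioEl, bpm, drawCursorAtBeat) {
  let rafId;
  function tick() {
    drawCursorAtBeat(audioTimeToBeat(audioEl.currentTime, bpm));
    rafId = requestAnimationFrame(tick);
  }
  rafId = requestAnimationFrame(tick);
  return () => cancelAnimationFrame(rafId); // call to stop syncing
}
```

Because the loop only ever reads from the audio element, pausing, seeking, or buffering stalls in the backing track are all reflected in the tab for free.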
But even that wasn't enough. Real recordings don't have perfect tempo. The guitarist speeds up in the chorus, drags a little in the verse. The theoretical tempo map and reality diverge. So I built a checkpoint system. You pause the track, press C where the audio actually is, and the system interpolates between your markers to warp the tab timeline. It's manual. It works.
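A piecewise-linear warp like that might look roughly like the sketch below (the checkpoint shape `{ audio, tab }` is my assumption, not RiffLogic's real data structure). Outside the marked region it falls back to a 1:1 mapping; between markers it interpolates.

```javascript
// Map a timestamp in the recording to a timestamp in the tab timeline,
// interpolating linearly between user-placed checkpoints.
// checkpoints: [{ audio: seconds, tab: seconds }, ...] sorted by audio time.
function warpAudioToTabTime(audioTime, checkpoints) {
  if (checkpoints.length === 0) return audioTime; // no correction yet
  let prev = checkpoints[0];
  if (audioTime <= prev.audio) {
    return prev.tab + (audioTime - prev.audio); // before first marker: 1:1
  }
  for (let i = 1; i < checkpoints.length; i++) {
    const next = checkpoints[i];
    if (audioTime <= next.audio) {
      // Linear interpolation between the two surrounding markers.
      const t = (audioTime - prev.audio) / (next.audio - prev.audio);
      return prev.tab + t * (next.tab - prev.tab);
    }
    prev = next;
  }
  return prev.tab + (audioTime - prev.audio); // past last marker: 1:1
}
```

So if the tab thinks a section is 8 seconds long but the recording takes 10, two markers around that section stretch the tab timeline by 25% across it and leave everything else untouched.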
Guitar Pro File Parsing. The GP format is not well documented. It's a ZIP file containing XML, but the XML schema has quirks that only make sense if you've stared at hex dumps for a while. Vibrato can live in two different places in the XML tree. Slide types are encoded as a bitmask. Ties that cross system line breaks need special arc rendering logic. I spent a lot of time comparing my output against Songsterr to figure out what I was getting wrong.
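To be clear about what's hedged here: the flag values below are made up, not the real GP bit assignments (which I'd have to reverse-engineer from files, exactly as described above). But the decoding pattern is the standard way to unpack a bitmask field like the slide encoding into something the renderer can use.

```javascript
// HYPOTHETICAL bit assignments, for illustration only; the actual GP
// slide bitmask layout must be reverse-engineered from real files.
const SLIDE_FLAGS = {
  SHIFT_SLIDE:          1 << 0,
  LEGATO_SLIDE:         1 << 1,
  SLIDE_IN_FROM_BELOW:  1 << 2,
  SLIDE_OUT_DOWNWARD:   1 << 3,
};

// Turn a raw integer mask into the list of named slide properties set on it.
function decodeSlideMask(mask) {
  return Object.entries(SLIDE_FLAGS)
    .filter(([, bit]) => (mask & bit) !== 0)
    .map(([name]) => name);
}
```

The nice property of keeping the flag table as data is that when staring at hex dumps reveals another bit's meaning, it's a one-line addition rather than a parser change.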
Canvas Performance. Rendering 200+ measures with all the annotations (bends, slides, hammer-ons, tuplet brackets, dynamics) is a lot of pixels. The trick was splitting it into a static canvas that renders once and an overlay canvas for the playback cursor. Pure math for layout computation, no DOM involvement. It's not perfect. Dense 32nd note passages on a lower end machine will still make you sweat. Good enough.
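The split can be sketched like this (the spacing constants and function names are illustrative, not the real renderer's): the expensive score render happens once on a static canvas, and only a thin overlay is cleared and redrawn per frame to move the cursor.

```javascript
// Pure layout math: x pixel offset of a beat. No DOM involvement, so it
// can be computed (and unit-tested) without a canvas at all.
function beatToX(beat, pixelsPerBeat, leftMargin) {
  return leftMargin + beat * pixelsPerBeat;
}

// Per-frame work touches only the overlay canvas: clear it, draw one thin
// cursor bar. The static canvas underneath (notes, bends, tuplet brackets,
// dynamics) is never repainted during playback.
function drawCursor(overlayCtx, beat, height) {
  overlayCtx.clearRect(0, 0, overlayCtx.canvas.width, height);
  const x = beatToX(beat, 40, 16); // assumed 40px/beat, 16px margin
  overlayCtx.fillRect(x, 0, 2, height);
}
```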
Audio Plumbing. The ScriptProcessorNode bridge is deprecated, but it's more stable across browsers than AudioWorklet. The 8192-sample buffer size is a compromise between glitches and latency. And caching the 247MB soundfont in IndexedDB so users don't re-download it every session was one of those "obvious in hindsight" solutions.
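The buffer-size tradeoff is easy to quantify: each callback hands off bufferSize / sampleRate seconds of audio, so bigger buffers mean fewer underrun glitches but more delay between pressing a note and hearing it. A quick sketch (assuming a typical 44.1 kHz sample rate):

```javascript
// Latency contributed by one audio callback's buffer, in milliseconds.
function bufferLatencyMs(bufferSize, sampleRate) {
  return (bufferSize / sampleRate) * 1000;
}

// At 44.1 kHz, an 8192-sample buffer is roughly 186 ms per callback,
// while a 1024-sample buffer would be about 23 ms but glitch more easily.
```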
Where It's Going
RiffLogic is a work in progress.
I want to improve the AudioWorklet integration so the audio pipeline is fully off the main thread. I want to add better visual feedback for practice sessions, like accuracy tracking and progress over time. The scale explorer could use more advanced theory (arpeggios, interval training). And the YouTube sync checkpoint system could potentially be automated with some clever audio analysis.
The Point
This project is the intersection of two things I care about deeply: music and software.
If you're a guitarist who has ever wished your practice tools were better, keep an eye on this one.
GitHub link: https://github.com/whleucka/rifflogic