The Unexpected Chaos of Streaming Video
Over the last few months, I built an end-to-end video streaming system — from encoding to delivery — because every other option was either broken, overpriced, or both. I had a client who was forced to switch streaming providers when their existing solution announced it was shutting down. The suggested replacement tanked performance so badly that videos stopped playing altogether, Chromecast support completely broke, and users were unsurprisingly upset.
Rather than gamble on another provider, they decided to cut down their reliance on third-party systems entirely — and asked me to build something custom. Along the way, I had to untangle a video library shaped by years of shifting file-naming conventions and write software that could accurately match everything back up. The real challenge was avoiding database corruption, because a run of bad matches could poison the entire system, and pulling that apart again would’ve been a full-blown forensic nightmare.
What started as a quick workaround turned into a crash course in everything that can go wrong — and a much bigger project than I ever expected.
CHAPTER 1: THE PLAN
The first real decision was picking a programming language for the backend — the part of the system that does all the real work. I chose a language called Go. It was created at Google by a team of veteran engineers — most notably Ken Thompson, co-creator of the Unix operating system, and Rob Pike, a longtime veteran of Bell Labs' Unix team.
Go was designed for fast, scalable systems that need to run a lot of tasks at once — quickly, reliably, and under heavy load. The technical term for juggling many tasks at the same time is concurrency. When you're dealing with 4K video uploads, simultaneous encodes, and real-time status updates, predictability matters, and Go handles this out of the box.
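To make that concrete, here's a tiny sketch (illustrative, not my production code) of the pattern Go gives you for free: kick off several encode jobs at once, each in its own goroutine, and wait for all of them to finish.

```go
package main

import (
	"fmt"
	"sync"
)

// encode stands in for a real encoding task.
func encode(file string) {
	fmt.Println("encoding", file)
}

func main() {
	files := []string{"a.mp4", "b.mp4", "c.mp4"}

	var wg sync.WaitGroup
	for _, f := range files {
		wg.Add(1)
		go func(name string) { // each encode runs concurrently
			defer wg.Done()
			encode(name)
		}(f)
	}
	wg.Wait() // block until every job reports done
}
```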
I looked at a few options. Python, Node.js, and Rust were all candidates, but Go hit a strategic middle ground between performance and simplicity that was really attractive. Once I settled on Go, I started building. First up: a server that could take video files, hand them off to FFmpeg, and return something streamable.
If you've never heard of FFmpeg, you've definitely used it. It's the backend workhorse behind most video software — compressing, converting, trimming, packaging. YouTube uses it. OBS uses it. Your favorite screen recorder probably does too.
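The real pipeline has far more knobs, but the core hand-off looks roughly like this: shell out to FFmpeg and ask for an HLS playlist, a widely supported streamable format. The flags and paths below are illustrative, not my exact production settings.

```go
package main

import (
	"log"
	"os/exec"
)

// transcodeToHLS takes an uploaded file and produces an HLS playlist
// plus segments that most modern players can stream.
func transcodeToHLS(input, outDir string) error {
	cmd := exec.Command("ffmpeg",
		"-i", input, // source video
		"-c:v", "libx264", // H.264 video, broadly compatible
		"-c:a", "aac", // AAC audio
		"-f", "hls", // package as HLS
		"-hls_time", "6", // ~6-second segments
		"-hls_playlist_type", "vod",
		outDir+"/index.m3u8",
	)
	return cmd.Run()
}

func main() {
	if err := transcodeToHLS("upload.mp4", "out"); err != nil {
		log.Fatal(err)
	}
}
```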
Next, I threw together a simple uploader site — just a drag-and-drop webpage to get videos into the pipeline.
CHAPTER 2: EARLY CHALLENGES
The first issue I encountered was uploading large video files over a typical internet connection. To avoid crashes or timeouts, the safest approach is to break the file into smaller pieces — like slicing up a loaf of bread — and sending them to the server one chunk at a time. In theory, that’s straightforward. In practice, keeping all those chunks in order, confirming they arrived intact, and stitching them back together can be a little tricky.
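Here's a rough sketch of the server side of that idea. The upload ID and chunk index travel in request headers (the header names are made up for illustration); chunks land as numbered part files, and a final pass stitches them together in order.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
	"strconv"
)

// handleChunk receives one slice of a large upload and writes it to a
// numbered part file. X-Upload-ID and X-Chunk-Index are hypothetical
// header names, not a standard.
func handleChunk(w http.ResponseWriter, r *http.Request) {
	id := r.Header.Get("X-Upload-ID")
	idx, err := strconv.Atoi(r.Header.Get("X-Chunk-Index"))
	if err != nil {
		http.Error(w, "bad chunk index", http.StatusBadRequest)
		return
	}

	f, err := os.Create(fmt.Sprintf("tmp/%s.part%06d", id, idx))
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	defer f.Close()

	if _, err := io.Copy(f, r.Body); err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
	}
}

// assemble stitches the numbered parts back into one file, in order.
// A missing part means the upload never finished.
func assemble(id string, total int) error {
	out, err := os.Create("uploads/" + id)
	if err != nil {
		return err
	}
	defer out.Close()

	for i := 0; i < total; i++ {
		part, err := os.Open(fmt.Sprintf("tmp/%s.part%06d", id, i))
		if err != nil {
			return err
		}
		if _, err := io.Copy(out, part); err != nil {
			part.Close()
			return err
		}
		part.Close()
	}
	return nil
}

func main() {
	http.HandleFunc("/chunk", handleChunk)
	http.ListenAndServe(":8080", nil)
}
```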
Then I started running into server timeouts — the point where the server just gives up and assumes you've disappeared. Uploading a multi-gig file from a hotel lobby or a spotty home connection is an easy way to hit that ceiling, so I needed something dynamic that could adjust the timeout based on file size.
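The fix, in sketch form: stop relying on one global timeout and give each upload a deadline proportional to its declared size. The throughput floor below is an assumption I picked for illustration, not a measured number.

```go
package main

import (
	"net/http"
	"time"
)

// withUploadDeadline scales the read deadline with the declared body
// size instead of relying on one fixed server-wide timeout.
// minBytesPerSec is an assumed worst case, not a measured figure.
func withUploadDeadline(h http.HandlerFunc) http.HandlerFunc {
	const minBytesPerSec = 50 * 1024 // ~50 KB/s: a very slow connection

	return func(w http.ResponseWriter, r *http.Request) {
		budget := 30 * time.Second // floor for small uploads
		if r.ContentLength > 0 {
			budget += time.Duration(r.ContentLength/minBytesPerSec) * time.Second
		}
		// Requires Go 1.20+: a read deadline for this request only.
		http.NewResponseController(w).SetReadDeadline(time.Now().Add(budget))
		h(w, r)
	}
}

func main() {
	http.HandleFunc("/chunk", withUploadDeadline(func(w http.ResponseWriter, r *http.Request) {
		// ... receive and store the chunk, as in the sketch above ...
	}))
	http.ListenAndServe(":8080", nil)
}
```

Go 1.20's http.NewResponseController is what makes the per-request deadline possible here; before that, you were mostly stuck with server-wide settings.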
Finally, I needed a way to communicate job handling. The server could process multiple videos at once — but to the end user, it looked like a black box. No progress indicators, no job status, no sense of what their upload was doing. So, I created a job queue — a way to track every file in the system and show which ones were uploading, processing, failed, and so on. Basically, a status dashboard that made the whole thing transparent for the folks I was building for.
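The tracker behind that kind of dashboard can be surprisingly small. A minimal sketch, assuming statuses live in memory and the UI polls a status endpoint:

```go
package main

import (
	"fmt"
	"sync"
)

// JobStatus mirrors what the dashboard shows for each file.
type JobStatus string

const (
	StatusUploading  JobStatus = "uploading"
	StatusProcessing JobStatus = "processing"
	StatusDone       JobStatus = "done"
	StatusFailed     JobStatus = "failed"
)

// Queue is a mutex-guarded map of job IDs to statuses. Handlers and
// workers update it; the status endpoint reads it.
type Queue struct {
	mu   sync.Mutex
	jobs map[string]JobStatus
}

func NewQueue() *Queue {
	return &Queue{jobs: make(map[string]JobStatus)}
}

// Set records where a job currently is in the pipeline.
func (q *Queue) Set(id string, s JobStatus) {
	q.mu.Lock()
	defer q.mu.Unlock()
	q.jobs[id] = s
}

// Snapshot returns a copy that's safe to serialize for the dashboard.
func (q *Queue) Snapshot() map[string]JobStatus {
	q.mu.Lock()
	defer q.mu.Unlock()
	out := make(map[string]JobStatus, len(q.jobs))
	for id, s := range q.jobs {
		out[id] = s
	}
	return out
}

func main() {
	q := NewQueue()
	q.Set("video-123", StatusUploading)
	q.Set("video-123", StatusProcessing)
	fmt.Println(q.Snapshot())
}
```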
CHAPTER 3: GOING SOLO
But all of that was nothing compared to the real problem: the client's entire library was built with Microsoft's Smooth Streaming — a proprietary format whose support on modern devices is steadily dropping. Because of this, and a broader desire to reclaim more direct control over their tech stack, we made the call to move everything out of Microsoft's cloud service and onto a dedicated storage server.
That left us with the biggest decision yet:
Option A — Keep the original Smooth Streaming files and convert them on the fly for modern devices. This is pretty much how Azure's suggested partner operated — the same one we'd had tons of problems with.
Or Option B — Re-encode every video into a universally compatible format, even if it meant reprocessing years of content.
Option A was clever and flexible, but it felt like a moving target and required more maintenance. Option B was slower and dumber, but predictable. No real-time processing, just clean, pre-encoded, stream-ready video.
So we went with the latter: short-term pain for longer-term simplicity.
CHAPTER 4: FILE MIGRATION HELL
Turns out, re-encoding thousands of videos takes time. Four months, to be exact. But it was mostly automated — outside of a few edge-case hiccups, it was rolling night and day on its own. At this point I really thought I was all the way out of the woods.
But then, the biggest curveball of the process hit me. You see, each folder on both the old and new storage servers uses a long, randomly generated string of letters and numbers to avoid naming collisions. The new storage server was supposed to be a straight copy of all of the files, so I (dangerously) assumed the folder names would be the same in both systems — which would’ve made updating each video a breeze once the re-encode process finished.
Imagine the drop in my stomach when I discovered that was completely wrong — every migrated folder had a different ID than its counterpart from the old system. This meant we had to craft a completely custom approach to figure out which files on the new storage server matched the videos currently playing on the website.
The first obvious step was looking at filenames. I could index every file on the new server and simply find the existing video with a matching name, and that actually worked for a good majority of items. But the server was also littered with files that shared the same name. So, I needed a different way to match those up, short of manually watching videos and updating database entries one at a time.
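That first pass is simple enough to sketch: walk the new server, group paths by base filename, and split the results into clean matches and name collisions. The mount point below is hypothetical.

```go
package main

import (
	"io/fs"
	"log"
	"path/filepath"
)

// indexByName walks the new storage server and groups file paths by
// base filename. Unique names can be matched to existing videos
// automatically; colliding names get flagged for a closer look.
func indexByName(root string) (unique map[string]string, dupes map[string][]string, err error) {
	byName := map[string][]string{}
	err = filepath.WalkDir(root, func(path string, d fs.DirEntry, walkErr error) error {
		if walkErr != nil || d.IsDir() {
			return walkErr
		}
		byName[d.Name()] = append(byName[d.Name()], path)
		return nil
	})
	if err != nil {
		return nil, nil, err
	}

	unique, dupes = map[string]string{}, map[string][]string{}
	for name, paths := range byName {
		if len(paths) == 1 {
			unique[name] = paths[0]
		} else {
			dupes[name] = paths
		}
	}
	return unique, dupes, nil
}

func main() {
	unique, dupes, err := indexByName("/mnt/new-storage") // hypothetical mount point
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("%d matched by name alone; %d names need a closer look", len(unique), len(dupes))
}
```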
For those, I pulled the video’s stream manifest to get its duration. If the duration of the existing video on the site matched one on the new server, I could reliably know which one to replace. But even that didn’t cover everything. Some duplicates had the same name and duration. For example, if someone tweaked the credits of a film, that wouldn’t change its runtime. And if they uploaded that new version with the same filename as the original… we’re screwed. Because these files had all been migrated, every creation date was just the date of the migration. So even a simple “which one is newer” approach was too risky.
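In my case the duration came out of the stream manifest, but the same idea works anywhere you can read metadata. As a stand-in, here's that check using ffprobe instead of manifest parsing:

```go
package main

import (
	"fmt"
	"log"
	"os/exec"
	"strconv"
	"strings"
)

// probeDuration asks ffprobe for a file's duration in seconds: a
// second signal for telling same-named files apart.
func probeDuration(path string) (float64, error) {
	out, err := exec.Command("ffprobe",
		"-v", "error",
		"-show_entries", "format=duration",
		"-of", "default=noprint_wrappers=1:nokey=1",
		path,
	).Output()
	if err != nil {
		return 0, err
	}
	return strconv.ParseFloat(strings.TrimSpace(string(out)), 64)
}

func main() {
	d, err := probeDuration("movie.mp4")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%.2f seconds\n", d)
}
```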
So, painfully, those had to be re-encoded again, using the highest-res version from the old streaming manifest. Not elegant. But better than data corruption.
CHAPTER 5: THE BREAKTHROUGH / CLEANUP / CLOSING
And suddenly, somehow — after months of watching a slow re-encode process, edge-case nightmares, and comparing folder names — everything was actually working. Chromecast was playing. All of our TV apps fell in line. No buffering, and no angry customers. Now we're focused on cleanup: because every video points to the new location, we can run a job to delete anything on the storage server that isn't tied to a database entry, since we know for sure that video isn't in use.
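That cleanup job is about as simple as it sounds. A sketch, assuming folder names are the IDs and the in-use set comes from a database query (both hypothetical here):

```go
package main

import (
	"log"
	"os"
	"path/filepath"
)

// cleanOrphans deletes any top-level storage folder whose ID isn't
// referenced by a database entry.
func cleanOrphans(root string, inUse map[string]bool) error {
	entries, err := os.ReadDir(root)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if e.IsDir() && !inUse[e.Name()] {
			log.Println("removing orphan:", e.Name())
			if err := os.RemoveAll(filepath.Join(root, e.Name())); err != nil {
				return err
			}
		}
	}
	return nil
}

func main() {
	// In practice inUse comes from the database; hardcoded here.
	inUse := map[string]bool{"abc123": true}
	if err := cleanOrphans("/mnt/new-storage", inUse); err != nil {
		log.Fatal(err)
	}
}
```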
So, it's not a billion-dollar streaming pipeline. But it's fast. It's cheap. And the client completely owns it.
Since that initial build-out, I've wanted hands-on experience with the pieces I never really got to touch — other team members handle the database and the website-specific API endpoints. So, in my free time, I've been rebuilding the entire video encode process from the ground up, this time paired with my own database design: how video objects are stored, how they relate to other structures like categories, and even an approach to users who can authenticate and watch uploaded content.
If you need some help with your streaming platform or website, click the button at the top of this page to send me an email.