It’s been a while since I’ve written anything that wasn’t a commit message or a Slack reply. So consider this post equal parts therapy and catch-up: a way to get my typing muscles warmed up again while sharing some of the fun things I’ve been building, and plan to build.
Six years ago, I joined dotbigbang to help build a platform for web-based games. I didn’t know then just how many strange, fascinating problems I’d get to solve, or how much I’d learn along the way.
It started with collisions. We needed something reliable and familiar, so I built a system around traditional bounds, with an API that behaved like Unity’s physics components. Sounds simple enough, but working in a dynamic language has its quirks. You learn quickly that type flexibility is a blessing and, occasionally, a curse.
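The heart of a traditional bounds system is a cheap overlap test. As a minimal sketch of the idea (the class and method names here are illustrative, not the actual platform API), two axis-aligned boxes intersect exactly when they overlap on every axis:

```typescript
// Hypothetical Unity-flavoured bounds sketch; not the real platform code.
class Vec3 {
  constructor(public x: number, public y: number, public z: number) {}
}

/** Axis-aligned bounding box, similar in spirit to Unity's Bounds. */
class Bounds {
  constructor(public center: Vec3, public extents: Vec3) {}

  /** Boxes intersect when their centre distance on every axis
   *  is within the sum of their half-sizes on that axis. */
  intersects(other: Bounds): boolean {
    return (
      Math.abs(this.center.x - other.center.x) <= this.extents.x + other.extents.x &&
      Math.abs(this.center.y - other.center.y) <= this.extents.y + other.extents.y &&
      Math.abs(this.center.z - other.center.z) <= this.extents.z + other.extents.z
    );
  }
}
```

In a dynamic language nothing stops a caller from passing a plain `{x, y}` object where a `Vec3` was expected, which is exactly the blessing-and-curse flexibility mentioned above.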
Once collisions were sorted, the next frontier was terrain. I built a meshing system for sculpting environments, but rather than manipulating polygons directly, we worked in the world of signed distance fields. Our sculpting tools felt like mesh sculpting at first glance, but under the hood they were altering distance values. We had additive and subtractive CSG-style brushes, and shaders that let you preview your brush strokes, peeling through layers of depth, before committing them.
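The elegance of SDF sculpting is that CSG-style brushes reduce to one-line distance combinators: union is a `min`, subtraction is a `max` against the brush's complement. A minimal sketch, assuming a spherical brush (the real tooling was considerably more involved):

```typescript
// Sketch of additive/subtractive CSG brushes over a signed distance field.
// Distances are negative inside a surface, positive outside.
type SDF = (x: number, y: number, z: number) => number;

/** Hypothetical spherical brush centred at (cx, cy, cz). */
const sphereBrush =
  (cx: number, cy: number, cz: number, r: number): SDF =>
  (x, y, z) =>
    Math.hypot(x - cx, y - cy, z - cz) - r;

/** Additive brush: CSG union is the minimum of the two distances. */
const add = (terrain: SDF, brush: SDF): SDF =>
  (x, y, z) => Math.min(terrain(x, y, z), brush(x, y, z));

/** Subtractive brush: carve by intersecting with the brush's complement. */
const subtract = (terrain: SDF, brush: SDF): SDF =>
  (x, y, z) => Math.max(terrain(x, y, z), -brush(x, y, z));
```

Because every stroke just produces a new distance function, previewing a stroke before committing it is as simple as evaluating the composed field without writing it back.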
To keep terrain interactions scalable for game objects, we built an SDF-based collision system specifically for objects moving across the surface. It was LOD-driven: distant objects would collide against the raw distance data, and only when you got close would we generate a small set of neighbouring polygons, using the same meshing mechanics as the terrain system to ensure collisions happened against the actual geometric surface.
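The far-LOD half of that scheme can be sketched directly: sample the raw field, estimate the surface normal from finite differences of the distance values, and push a penetrating object out along it. This is a simplified illustration, not the production system, and the function names are my own; the near-LOD polygon path is omitted.

```typescript
// Far-LOD SDF collision sketch: collide against raw distance data.
type SDF = (x: number, y: number, z: number) => number;

function resolveSdfCollision(
  sdf: SDF,
  pos: [number, number, number],
  radius: number,
  eps = 1e-3,
): [number, number, number] {
  const d = sdf(...pos);
  if (d >= radius) return pos; // no penetration, nothing to do

  // The field's gradient approximates the surface normal.
  const [x, y, z] = pos;
  let nx = sdf(x + eps, y, z) - sdf(x - eps, y, z);
  let ny = sdf(x, y + eps, z) - sdf(x, y - eps, z);
  let nz = sdf(x, y, z + eps) - sdf(x, y, z - eps);
  const len = Math.hypot(nx, ny, nz) || 1;
  nx /= len; ny /= len; nz /= len;

  // Push the object out by its penetration depth.
  const push = radius - d;
  return [x + nx * push, y + ny * push, z + nz * push];
}
```

For distant objects this approximation is indistinguishable from the real surface; only up close, where the mismatch between field and mesh becomes visible, is it worth paying for actual polygons.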
Then came the “doodads.” That’s what we called the vegetation and debris system, and it was all about adding life to the terrain. We painted density directly into the terrain data, then used blue-noise barycentric coordinate maps to scatter objects naturally across the surface. More blue noise drove the dithering between full 3D meshes and their billboard versions. Everything was GPU-instanced, so we could pack a surprising amount of detail into the scene without killing performance.
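The scattering step boils down to a threshold test: a candidate point survives when the painted density at that point beats a noise value. A minimal sketch of that idea, with a hash-based white noise standing in for the baked blue-noise maps the real system used (blue noise gives a much more even spacing, but the comparison is the same):

```typescript
/** Deterministic 2D hash in [0, 1) — a stand-in for a blue-noise texture. */
function hashNoise(x: number, y: number): number {
  const s = Math.sin(x * 127.1 + y * 311.7) * 43758.5453;
  return s - Math.floor(s);
}

interface Candidate {
  x: number;
  y: number;
  density: number; // painted into the terrain data, in [0, 1]
}

/** Keep candidates whose painted density exceeds the noise threshold. */
function scatter(candidates: Candidate[]): Candidate[] {
  return candidates.filter(c => c.density > hashNoise(c.x, c.y));
}
```

The same trick drives the mesh-to-billboard dithering: compare distance-based LOD weight against a per-instance noise value, and instances flip representation one at a time instead of popping all at once.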
In the end, I had to squeeze the terrain data down as much as possible so our games could run with far less memory. Interestingly, the terrains often ended up as small islands or other game elements, not exactly what they were originally designed for. The plan had always been to have a single main terrain, with everything else built from voxel game items that players constructed block by block.
Now, with dotbigbang closing its doors, I’m exploring the job market for my next gig and dusting off my C and Rust skills with some old hobby projects. Rust turned out to be surprisingly easy to get comfortable with, though to be fair, after so many years of programming, picking up a new language isn’t much of a struggle anymore.
The same goes for new programming paradigms. Whether it’s graphics, LLMs, or infrastructure, the underlying mechanics aren’t all that different. In the end, we’re all just trying to make things run within a desired time frame, applying a mix of old and new academic principles to get there.
Where will I end up next? Hard to say. What I do know is that I’ll keep spending my spare time tinkering with my hobby projects, no matter what my days are filled with: building graphics systems, training LLMs, synchronising cluster nodes to stream data to paying users, or something entirely unexpected.
What will it be?