I built a side project—in-product release notes—with NextJS and Supabase. Development was smooth. Production was not.
Pages that loaded instantly in dev took seconds in production. Before I’d even launched, I hit Vercel’s image processing limits and Supabase’s edge function ceiling. The “lightweight” side project was heading toward €50/month in hosting costs.
So I rewrote it: Go with HTML templates, HTMX, and Alpine, hosted on a €5/month VPS. I also built authentication from scratch.
This wasn’t because NextJS and Supabase are bad—they’re not. They just weren’t right for this project. Performance problems and unexpected costs were the practical reason to switch; curiosity about Go was the other.
Learning Go
Resources that worked well, in order:
Go’s directness was immediately apparent. Where JavaScript offers a dozen ways to do the same thing, Go typically presents one or two. This felt constraining at first, then liberating. The emphasis on “idiomatic” patterns—one right way to do things—made the language easy to pick up.
Rolling my own auth
Supabase Auth’s abstractions made debugging painful. So I built authentication from scratch.
Two resources made this straightforward:
The implementation: a session service, authentication middleware, rate limiting, and CSRF protection. More work than using a provider, but when something breaks, I know exactly where to look. I wrote more about this in Rolling Your Own Authentication in Go.
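To give a flavor of the session side, here is a minimal sketch of what a session-checking middleware can look like, assuming a session service with a `Validate` method and a cookie named `session_id`. The names and signatures are illustrative, not the project’s actual code.

```go
// Illustrative session middleware; names and signatures are assumptions,
// not the actual code from the project.
package auth

import (
	"context"
	"net/http"
)

// SessionValidator stands in for the session service.
type SessionValidator interface {
	// Validate returns the user ID for a valid session token.
	Validate(ctx context.Context, token string) (userID string, err error)
}

type contextKey string

const userIDKey contextKey = "userID"

// RequireSession redirects to /login unless the request carries a valid session cookie.
func RequireSession(sessions SessionValidator) func(http.Handler) http.Handler {
	return func(next http.Handler) http.Handler {
		return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			cookie, err := r.Cookie("session_id")
			if err != nil {
				http.Redirect(w, r, "/login", http.StatusSeeOther)
				return
			}
			userID, err := sessions.Validate(r.Context(), cookie.Value)
			if err != nil {
				http.Redirect(w, r, "/login", http.StatusSeeOther)
				return
			}
			// Expose the authenticated user to downstream handlers.
			ctx := context.WithValue(r.Context(), userIDKey, userID)
			next.ServeHTTP(w, r.WithContext(ctx))
		})
	}
}
```

Rate limiting and CSRF protection can slot in as additional middleware in the same chain.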
Project structure
In NextJS, client and server code blur together. In Go, the boundaries are clear: receive request, process, return response.
Standard structure:
- Handlers (HTTP requests)
- Services (business logic)
- Repositories (database)
- Models (data structures)
- Templates (HTML)
Libraries: Chi for routing, go-playground/validator for validation, and Gorilla schema for form handling.
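To make the layering concrete, here is a rough wiring sketch; `ReleaseRepo`, `ReleaseService`, and `ReleaseHandler` are hypothetical stand-ins, not the app’s real types.

```go
// Hypothetical wiring: repository → service → handler, routed with chi.
package main

import (
	"database/sql"
	"log"
	"net/http"

	"github.com/go-chi/chi/v5"
	"github.com/go-chi/chi/v5/middleware"
)

// Each layer only knows about the one directly below it.
type ReleaseRepo struct{ db *sql.DB }             // repository: database queries
type ReleaseService struct{ repo *ReleaseRepo }   // service: business logic
type ReleaseHandler struct{ svc *ReleaseService } // handler: HTTP in, HTML out

func (h *ReleaseHandler) List(w http.ResponseWriter, r *http.Request) {
	// In the real app this would call h.svc and render a template.
	w.Write([]byte("release list"))
}

func main() {
	var db *sql.DB // opened with sql.Open(...) in the real app

	h := &ReleaseHandler{svc: &ReleaseService{repo: &ReleaseRepo{db: db}}}

	r := chi.NewRouter()
	r.Use(middleware.Logger)
	r.Get("/releases", h.List)

	log.Fatal(http.ListenAndServe(":8080", r))
}
```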
A consistent pattern emerged for handlers: parse → decode → validate → business logic → response.
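Here is roughly what that pipeline looks like for a form submission, using gorilla/schema and go-playground/validator. The form fields and the `CreateRelease` service method are made up for illustration.

```go
// A sketch of parse → decode → validate → business logic → response.
package handlers

import (
	"context"
	"net/http"

	"github.com/go-playground/validator/v10"
	"github.com/gorilla/schema"
)

// ReleaseCreator stands in for the service layer; the method name is an assumption.
type ReleaseCreator interface {
	CreateRelease(ctx context.Context, title, body string) error
}

var (
	formDecoder = schema.NewDecoder()
	validate    = validator.New()
)

func init() {
	// Forms often carry extra fields (e.g. a CSRF token); don't treat them as errors.
	formDecoder.IgnoreUnknownKeys(true)
}

// CreateRelease handles the POST from a "new release" form.
func CreateRelease(svc ReleaseCreator) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		// Parse the form body.
		if err := r.ParseForm(); err != nil {
			http.Error(w, "bad request", http.StatusBadRequest)
			return
		}

		// Decode form values onto a struct (gorilla/schema).
		var form struct {
			Title string `schema:"title" validate:"required,max=120"`
			Body  string `schema:"body" validate:"required"`
		}
		if err := formDecoder.Decode(&form, r.PostForm); err != nil {
			http.Error(w, "bad request", http.StatusBadRequest)
			return
		}

		// Validate via struct tags (go-playground/validator).
		if err := validate.Struct(form); err != nil {
			http.Error(w, err.Error(), http.StatusUnprocessableEntity)
			return
		}

		// Business logic lives in the service layer.
		if err := svc.CreateRelease(r.Context(), form.Title, form.Body); err != nil {
			http.Error(w, "something went wrong", http.StatusInternalServerError)
			return
		}

		// Respond: redirect back to the list page.
		http.Redirect(w, r, "/releases", http.StatusSeeOther)
	}
}
```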
The hardest part was replacing React’s component model with Go templates. No state management, no component abstraction—just template partials and inheritance. It forced me to think differently about UI structure.
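For illustration, here is a tiny, self-contained version of the partials-and-blocks pattern with html/template; in the real project the templates live in their own files, and the names here are invented.

```go
// Base layout + partial + page override: the html/template take on "inheritance".
package main

import (
	"html/template"
	"log"
	"os"
)

// Normally these live in templates/*.html; inlined so the sketch runs as-is.
const base = `{{define "base"}}<html><body>
  {{template "nav" .}}
  {{block "content" .}}{{end}}
</body></html>{{end}}
{{define "nav"}}<nav>Release notes</nav>{{end}}`

const releasesPage = `{{define "content"}}<ul>{{range .}}<li>{{.}}</li>{{end}}</ul>{{end}}`

func main() {
	// Each page clones the base layout and overrides its "content" block.
	layout := template.Must(template.New("layout").Parse(base))
	releases := template.Must(template.Must(layout.Clone()).Parse(releasesPage))

	// In a handler this would write to the http.ResponseWriter instead of stdout.
	if err := releases.ExecuteTemplate(os.Stdout, "base", []string{"v1.2.0", "v1.1.0"}); err != nil {
		log.Fatal(err)
	}
}
```

There is no component state here; whatever a page needs gets passed in as data at render time.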
What surprised me
Speed. The first time I loaded the Go version, I thought it was broken—pages appeared so fast I assumed they hadn’t loaded. A dynamic app with auth, database queries, and business logic, responding instantly. I knew static sites could be fast. I didn’t expect this from a full application.
Simplicity. In NextJS, I was constantly thinking about client vs server components, server actions, data fetching strategies, client/server state interplay. In Go: receive request, process, return response. Debugging got easier. Development got faster.
AI compatibility. Go’s consistency made AI tools significantly more reliable. Point an LLM at a Go handler, ask it to create another one in the same style—it works almost every time. The lack of abstraction layers means less for the model to misunderstand.
On abstractions
Implementing features in Go—routing → parsing → validation → business logic → templates—made something clear: this is what NextJS does under the hood, just with layers of abstraction on top.
Client components, server components, server actions, hooks, context, reducers—they’re abstractions over request-response. Powerful for large teams building complex applications. Overkill for many projects.
Go templates have real limitations. No component model, no state management. Some interactions are harder to build. But the constraint forced me to understand web fundamentals more deeply—browser events, form submissions, AJAX—knowledge that transfers regardless of framework.
Not everything in Go-land is perfect. GORM (the ORM I used) occasionally behaved in surprising ways. Sometimes I wished I’d just written raw SQL.
Will I use React again? Yes—it’s the right tool for certain projects. But I’ll be more critical about whether its abstractions solve problems I actually have.
The result
The Go version is faster, cheaper (€5/month vs €50+), more maintainable, and easier to debug. When something breaks, I know where to look.
The migration started as a practical fix for performance and cost problems. It became a useful reminder that simpler tools often work better—and that understanding what happens beneath the abstractions makes you better at using them.