If you expose an API to the public internet, you need a basic guardrail against bursts. I like a token bucket with rate.NewLimiter (from golang.org/x/time/rate) because it's easy to reason about: a steady-state rate plus a burst capacity. The middleware checks Allow() and returns 429 Too Many Requests immediately when the limit is exceeded, which protects your database and downstream services.

The important detail is scope. A global limiter is useful for protecting the whole service, but per-client limiting is usually better, because one noisy caller shouldn't be able to exhaust everyone else's budget. In production I combine this with an IP-based key (or the authenticated principal) and a shared store if I need global enforcement across instances. Even as a purely local limiter, it's a helpful first line of defense that keeps your service from being overwhelmed by accidental client loops.
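
Here's roughly what that middleware looks like as a minimal sketch. The route, the 10 requests/second rate, and the burst of 20 are placeholder values, not recommendations; tune them to your service.

```go
package main

import (
	"net/http"

	"golang.org/x/time/rate"
)

// rateLimit wraps a handler with a shared token-bucket limiter.
// Requests that exceed the configured rate get a 429 immediately.
func rateLimit(limiter *rate.Limiter, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if !limiter.Allow() {
			http.Error(w, http.StatusText(http.StatusTooManyRequests), http.StatusTooManyRequests)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/api", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})

	// 10 requests/second steady state, bursts of up to 20.
	limiter := rate.NewLimiter(rate.Limit(10), 20)
	http.ListenAndServe(":8080", rateLimit(limiter, mux))
}
```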
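
And for per-client limiting, one simple approach is a map of limiters keyed by client IP. This is only a sketch: it never evicts idle entries, the helper names (clientLimiters, perClientRateLimit) are mine, and a single in-process map doesn't give you the cross-instance enforcement that a shared store would.

```go
package main

import (
	"net"
	"net/http"
	"sync"

	"golang.org/x/time/rate"
)

// clientLimiters hands out one token bucket per client key (here, the
// remote IP). Entries are never evicted in this sketch; a production
// version would expire idle clients.
type clientLimiters struct {
	mu       sync.Mutex
	limiters map[string]*rate.Limiter
	rate     rate.Limit
	burst    int
}

func newClientLimiters(r rate.Limit, burst int) *clientLimiters {
	return &clientLimiters{
		limiters: make(map[string]*rate.Limiter),
		rate:     r,
		burst:    burst,
	}
}

func (c *clientLimiters) get(key string) *rate.Limiter {
	c.mu.Lock()
	defer c.mu.Unlock()
	lim, ok := c.limiters[key]
	if !ok {
		lim = rate.NewLimiter(c.rate, c.burst)
		c.limiters[key] = lim
	}
	return lim
}

// perClientRateLimit keys the limiter on the client IP instead of
// sharing one bucket across all callers.
func perClientRateLimit(c *clientLimiters, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ip, _, err := net.SplitHostPort(r.RemoteAddr)
		if err != nil {
			ip = r.RemoteAddr // fall back to the raw value
		}
		if !c.get(ip).Allow() {
			http.Error(w, http.StatusText(http.StatusTooManyRequests), http.StatusTooManyRequests)
			return
		}
		next.ServeHTTP(w, r)
	})
}
```

If your service sits behind a proxy or load balancer, RemoteAddr will be the proxy's address, so you'd key on the authenticated principal or a trusted forwarded header instead.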