🚀

Goroutines: Lightweight Concurrency on a Different Level Than Ruby's Threads

Run tens of thousands of concurrent tasks with a single go keyword. No GIL.

Anyone who's dealt with concurrency in Ruby knows the GIL's limits: spawn 10 threads, and CPU-bound work still runs on only one of them at a time. Threads pay off only for IO-bound tasks.

Go doesn't have this problem. Goroutines achieve real parallelism.

Starting a goroutine

go doSomething()  // that's it

Similar to Ruby's Thread.new { do_something }, but goroutines aren't OS threads. They're lightweight green threads managed by the Go runtime. One goroutine starts with a ~2KB stack, while one OS thread typically reserves ~1MB, so memory usage is a small fraction of a thread's.

No GIL

Ruby's (CRuby) GIL allows only one thread to execute Ruby code at a time; other threads make progress only during IO waits. Go has no such constraint: goroutines run truly in parallel across CPU cores.

In Ruby, you spin up multiple Sidekiq processes to work around this. In Go, goroutines automatically utilize multiple cores.

Problem: Shared State

Like Ruby's Mutex, Go has sync.Mutex. But Go's mantra is "Don't communicate by sharing memory; share memory by communicating." That means channels, covered in the next post.

WaitGroup: Go's Answer to Ruby's Thread#join

Like Ruby's threads.each(&:join), Go uses sync.WaitGroup: wg.Add(1) → go func() { defer wg.Done(); ... }() → wg.Wait().

Ruby to Go

1. Ruby: Thread.new { } (no CPU parallelism due to the GIL) → Go: go func() (true parallelism)

2. Goroutine ~2KB vs OS thread ~1MB: tens of thousands can run simultaneously

3. Ruby: threads.each(&:join) → Go: sync.WaitGroup (Add/Done/Wait)

4. Ruby: Mutex.new → Go: sync.Mutex (exists, but channels are recommended)

Pros

  • True parallelism without a GIL: overwhelming performance vs Ruby for CPU-bound tasks
  • Goroutines are so lightweight that you have great freedom in designing your concurrency model

Cons

  • Race conditions are harder to debug than in Ruby: the -race flag (go run -race, go test -race) is essential
  • Goroutine leaks: goroutines that never finish keep eating memory

Use Cases

  • When converting Ruby Sidekiq workers to Go to reduce process count
  • When you need a server handling thousands of concurrent HTTP requests (Ruby is limited by process/thread count)
