So you've been writing Go code for a bit, and now you're ready to connect your app to a database. That's when you start hearing about "ORMs" and wonder if they're worth your time. If you're nodding along, you're in the right place.
In this guide, we'll break down everything you need to know about Golang ORMs - what they are, why you might want one, and how to pick the right one for your project. Let's cut through the noise and get you up to speed.
What Is an ORM and Why Should You Care?
ORM stands for Object-Relational Mapping. It's the translator between your Go code and your database.
Think of it this way: You write code with objects and methods, but your database speaks in tables and queries. An ORM bridges that gap, letting you work with database data using Go structs instead of writing raw SQL.
At the technical level, an ORM handles several critical tasks:
- Schema mapping: Translates Go struct definitions to database tables and columns
- Type conversion: Converts between Go types and SQL data types
- Query generation: Builds SQL statements from method calls
- Relationship management: Handles connections between related data (one-to-many, many-to-many)
- Result mapping: Populates Go structs with query results
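To demystify that last step, here's a minimal, stdlib-only sketch of how an ORM might use reflection and struct tags to populate a struct from a row. This is purely illustrative — real ORMs are far more sophisticated, and `mapRow` is a made-up helper, not any library's API:

```go
package main

import (
	"fmt"
	"reflect"
)

// User mirrors a "users" table; the db tags name the columns.
type User struct {
	ID    int    `db:"id"`
	Name  string `db:"name"`
	Email string `db:"email"`
}

// mapRow copies a column-name -> value map into a struct using its db tags.
// This is a toy version of the result-mapping step an ORM performs.
func mapRow(row map[string]any, dest any) {
	v := reflect.ValueOf(dest).Elem()
	t := v.Type()
	for i := 0; i < t.NumField(); i++ {
		col := t.Field(i).Tag.Get("db")
		if val, ok := row[col]; ok {
			v.Field(i).Set(reflect.ValueOf(val))
		}
	}
}

func main() {
	var u User
	mapRow(map[string]any{"id": 7, "name": "Ada", "email": "ada@example.com"}, &u)
	fmt.Printf("%+v\n", u)
}
```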
Here's how the code difference looks:
// Without an ORM, you'd write SQL:
rows, err := db.Query("SELECT id, name, email FROM users WHERE age > ?", 18)
if err != nil {
return err
}
defer rows.Close()
var users []User
for rows.Next() {
var user User
if err := rows.Scan(&user.ID, &user.Name, &user.Email); err != nil {
return err
}
users = append(users, user)
}
// With an ORM, you might do:
var users []User
db.Where("age > ?", 18).Find(&users)
The second option feels more natural in Go, right? That's the point.
The Real Benefits of Using an ORM
- Write less boilerplate - No more tedious SQL string building or manual scanning of results.
- Stick to Go syntax - Work with familiar structs instead of SQL, improving code readability.
- Catch errors earlier - With code-generating ORMs (like SQLBoiler or Ent), many mistakes, such as referencing a non-existent column, surface at compile time instead of runtime
- Switch databases more easily - Change from MySQL to PostgreSQL without rewriting queries, as the ORM handles dialect differences
- Consistent query patterns - Standardizes how your team interacts with the database
- Built-in protections - Most ORMs include measures against SQL injection and other common security issues
But ORMs aren't magic bullets. They add another layer to your app and can sometimes hide what's happening under the hood. For simple projects, you might not need one at all.
The biggest technical drawbacks include:
- Performance overhead: The abstraction layer adds some CPU and memory cost
- Learning curve: Each ORM has its own API and conventions to master
- Query optimization challenges: Complex queries might be harder to optimize than hand-written SQL
Popular Golang ORMs You Should Know
Let's check out the main players in the Golang ORM space:
GORM: The Heavyweight Champion
GORM is the most popular ORM in the Go ecosystem, and for good reason. It's feature-rich, well-documented, and actively maintained.
Here's a comprehensive example showing GORM's capabilities:
import (
"gorm.io/gorm"
"gorm.io/driver/postgres"
"time"
)
// Define models with relationships
type Product struct {
gorm.Model // Embeds ID, CreatedAt, UpdatedAt, DeletedAt
Code string `gorm:"size:100;not null;unique;index"`
Price float64 `gorm:"type:decimal(10,2);not null"`
Stock int `gorm:"default:0"`
Name string `gorm:"size:200"`
CategoryID uint
Category Category // Belongs to relationship
Tags []Tag `gorm:"many2many:product_tags;"` // Many-to-many
Reviews []Review // Has many relationship
}
type Category struct {
ID uint
Name string
Description string
}
type Tag struct {
ID uint
Name string
}
type Review struct {
ID uint
Content string
Rating int
ProductID uint // Foreign key
UserID uint
CreatedAt time.Time
}
// Connect to database
dsn := "host=localhost user=postgres password=postgres dbname=test port=5432"
db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
if err != nil {
panic("failed to connect database")
}
// Migrations - creates/updates tables based on struct definitions
db.AutoMigrate(&Product{}, &Category{}, &Tag{}, &Review{})
// Create records with associations
techCategory := Category{Name: "Tech", Description: "Electronics and gadgets"}
db.Create(&techCategory)
wirelessTag := Tag{Name: "Wireless"}
portableTag := Tag{Name: "Portable"}
db.Create(&wirelessTag) // Create each tag via a pointer so its ID gets populated
db.Create(&portableTag)
// Create product with relationships
product := Product{
Code: "HP2022",
Price: 499.99,
Stock: 100,
Name: "Headphones Pro",
CategoryID: techCategory.ID,
Tags: []Tag{wirelessTag, portableTag},
}
db.Create(&product)
// Complex querying with preloading
var products []Product
db.Debug(). // Shows SQL being executed
Preload("Category"). // Load associated category
Preload("Tags"). // Load associated tags
Preload("Reviews", "rating > ?", 3). // Load only good reviews
Where("price > ? AND stock > 0", 100).
Order("price DESC").
Limit(10).
Find(&products)
// Transactions
db.Transaction(func(tx *gorm.DB) error {
// Multiple operations in a transaction
if err := tx.Model(&product).Update("Stock", product.Stock - 1).Error; err != nil {
return err
}
// Create order record
// ...
return nil
})
GORM gives you tons of features right out of the box:
- Automatic migrations: Create and update tables from struct definitions
- Hooks/callbacks: Run code before/after create, update, delete, etc.
- Eager loading: Load related data efficiently with Preload
- Scopes: Reusable query parts (like db.Scopes(ForActiveUsers, OrderByLatest))
- Soft delete: Keeps deleted records in DB but hides them from queries
- Associations: Handle relationships with minimal code
- Advanced query builder: Complex WHERE clauses, JOINs, and subqueries
- Raw SQL support: When you need custom queries
If you're coming from Ruby on Rails or Laravel, GORM will be familiar to you. But it's still very Go-like in its approach.
SQLBoiler: The Performance King
SQLBoiler takes a different approach. Instead of defining models in Go and generating tables, it looks at your existing database and generates type-safe Go code.
// First, define your DB schema
// Then generate models with sqlboiler command
// Then use the generated code:
product, err := models.FindProduct(ctx, db, 1)
product.Price = 200
product.Update(ctx, db, boil.Infer())
SQLBoiler is blazing fast because it generates code rather than using reflection at runtime. The downside? Less flexibility for changing your models on the fly.
SQLX: The Lightweight Contender
SQLX isn't a full ORM - it's more like the standard database/sql package with extra conveniences. But it's worth mentioning because it's a good middle ground.
import "github.com/jmoiron/sqlx"
type User struct {
ID int `db:"id"`
Name string `db:"name"`
Email string `db:"email"`
}
// Query directly into a struct
var user User
err := db.Get(&user, "SELECT * FROM users WHERE id = $1", 1)
If you want something lightweight that still saves you from writing tons of boilerplate, SQLX is worth a look.
Ent: The New Kid on the Block
Ent is a newer entity framework developed at Facebook. It uses a code-generation approach with a strong focus on type safety.
// Define a schema
type User struct {
ent.Schema
}
func (User) Fields() []ent.Field {
return []ent.Field{
field.String("name"),
field.String("email").Unique(),
}
}
// Then use the generated client
u, err := client.User.Create().
SetName("Jake").
SetEmail("jake@example.com").
Save(ctx)
Ent shines for complex data models with lots of relationships. It forces you to think about your schema up front, which can prevent headaches down the road.
Making Your First ORM Connection: A Quick Start Guide
Let's get our hands dirty with a basic GORM setup, since it's the most widely used:
1. Install the packages
go get -u gorm.io/gorm
go get -u gorm.io/driver/sqlite # Or another DB driver
2. Set up your model
package main
import (
"gorm.io/gorm"
"gorm.io/driver/sqlite"
)
type User struct {
gorm.Model // Adds ID, CreatedAt, UpdatedAt, DeletedAt
Name string
Age int
Email string `gorm:"uniqueIndex"`
}
func main() {
// Open connection to database
db, err := gorm.Open(sqlite.Open("test.db"), &gorm.Config{})
if err != nil {
panic("failed to connect database")
}
// Auto migrate schema
db.AutoMigrate(&User{})
// Create a user
db.Create(&User{Name: "John", Age: 18, Email: "john@example.com"})
// Read the user back
var user User
db.First(&user, "email = ?", "john@example.com")
// Update
db.Model(&user).Update("Age", 21)
// Delete
db.Delete(&user)
}
And you've got a working ORM setup! Pretty straightforward, right?
ORM Best Practices You Should Follow
ORMs can make your life easier, but they come with some challenges. Here are some tips to keep you on track:
Know When to Drop Down to SQL
ORMs are great for CRUD operations and simple queries. But for complex reports or performance-critical sections, sometimes raw SQL is best.
When should you consider raw SQL?
- Complex joins or subqueries: When you need to join 3+ tables with specific conditions
- Analytical queries: For reports involving aggregations, window functions, or CUBE/ROLLUP
- Performance hotspots: Sections of code executed thousands of times per minute
- Database-specific features: When you need specialized features like PostgreSQL's JSONB operations
Most Go ORMs let you execute raw SQL when needed:
type Result struct {
Name string
Count int
}
var results []Result
db.Raw(`
SELECT name, COUNT(*) as count
FROM users
WHERE last_login > ?
GROUP BY name
HAVING COUNT(*) > 5
ORDER BY count DESC
`, time.Now().AddDate(0, -1, 0)).Scan(&results)
Look Out for N+1 Query Problems
This is a classic ORM trap that can destroy your application's performance. Say you want to load all users and their posts:
// BAD: This will make a separate query for each user's posts
// If you have 1000 users, this runs 1001 queries!
var users []User
db.Find(&users)
for _, user := range users {
var posts []Post
db.Where("user_id = ?", user.ID).Find(&posts)
// Do something with posts
}
What's happening under the hood:
- First query: SELECT * FROM users (gets 1000 users)
- Then 1000 more queries: SELECT * FROM posts WHERE user_id = ? (one for each user)
The solution is to use eager loading:
// GOOD: This loads everything in just two queries
var users []User
db.Preload("Posts").Find(&users)
Now it runs:
SELECT * FROM users
SELECT * FROM posts WHERE user_id IN (1,2,3,...,1000)
This can be the difference between a 5-second and a 50-millisecond response time.
Use Transactions for Multiple Operations
If you're updating multiple records that need to succeed or fail together, wrap them in a transaction. This ensures data consistency even if something fails midway through.
Transactions guarantee ACID properties:
- Atomicity: All operations complete successfully or none do
- Consistency: Database moves from one valid state to another
- Isolation: Transactions don't interfere with each other
- Durability: Completed transactions persist even after system failures
Here's how to implement them:
// Start explicit transaction
tx := db.Begin()
defer func() {
if r := recover(); r != nil {
tx.Rollback()
}
}()
// Perform multiple operations
if err := tx.Model(&user).Update("balance", user.Balance - 100).Error; err != nil {
tx.Rollback()
return err
}
if err := tx.Model(&recipient).Update("balance", recipient.Balance + 100).Error; err != nil {
tx.Rollback()
return err
}
// Commit when everything's successful
return tx.Commit().Error
// Or use GORM's transaction helper:
db.Transaction(func(tx *gorm.DB) error {
// Update user
if err := tx.Model(&user).Update("balance", user.Balance - 100).Error; err != nil {
return err // Automatic rollback on error
}
// Update recipient
if err := tx.Model(&recipient).Update("balance", recipient.Balance + 100).Error; err != nil {
return err // Automatic rollback on error
}
return nil // Automatic commit on success
})
Don't Blindly Trust User Input
ORMs help prevent SQL injection, but you still need to be careful. SQL injection vulnerabilities can occur in unexpected places:
// DANGEROUS: Don't use string concatenation for column names
column := request.URL.Query().Get("sort")
db.Order(column + " DESC").Find(&users)
// A malicious user could set sort to "name; DROP TABLE users;"
Instead, implement proper validation:
// BETTER: Whitelist allowed columns with a map for O(1) lookup
allowedColumns := map[string]bool{
"name": true,
"created_at": true,
"email": true,
}
if allowedColumns[column] {
db.Order(column + " DESC").Find(&users)
} else {
// Default to a safe sorting option
db.Order("created_at DESC").Find(&users)
}
Also watch out for these pitfalls:
- Dynamic table names (always validate against a whitelist)
- Raw WHERE clauses constructed from user input
- LIMIT and OFFSET values (validate they're positive integers)
- JSON field paths in advanced database queries
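For the LIMIT/OFFSET case, validation is only a few lines of plain Go. The helper name here is mine, not from any library:

```go
package main

import (
	"fmt"
	"strconv"
)

// parseLimit validates a user-supplied LIMIT value: non-numeric or
// out-of-range input falls back to safe bounds instead of reaching SQL.
func parseLimit(raw string, def, max int) int {
	n, err := strconv.Atoi(raw)
	if err != nil || n < 1 {
		return def
	}
	if n > max {
		return max
	}
	return n
}

func main() {
	fmt.Println(parseLimit("25", 10, 100))   // in range: 25
	fmt.Println(parseLimit("-5", 10, 100))   // negative: default 10
	fmt.Println(parseLimit("9999", 10, 100)) // too big: capped at 100
	fmt.Println(parseLimit("abc", 10, 100))  // non-numeric: default 10
}
```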
Choosing the Right ORM: Decision Factors
So how do you pick the right ORM for your project? Consider these factors:
Project Size and Complexity
- Small project or microservice? Consider SQLX or even just database/sql
- Medium app with straightforward needs? GORM is a solid choice
- Complex domain with lots of relationships? Ent might be worth the learning curve
Performance Requirements
- Need absolute max performance? SQLBoiler or SQLX will be faster than GORM
- Average app? Any ORM will likely be fine - your bottleneck will be elsewhere
Team Experience
- New to Go? GORM has the gentlest learning curve
- Experienced team? Consider the code-gen approach of Ent or SQLBoiler for type safety
Database Flexibility
- Might switch databases later? GORM has the best cross-database support
- Staying with one DB? You can pick a more specialized option
When to Skip ORMs Entirely
Sometimes, you're better off without an ORM. Consider these scenarios:
- Super simple app with just a few queries? Raw SQL might be cleaner.
- High-performance data processing where every millisecond counts? Go direct.
- Unusual database operations that don't map well to CRUD? Raw SQL gives more control.
Wrapping Up
ORMs in Go strike a nice balance - they remove tedious boilerplate without hiding too much from you. For most projects, they're a net win.
If you're just starting out:
- Try GORM for a quick win - it's well-documented and friendly
- Build something small but real - a todo app or blog backend
- Once comfortable, experiment with other ORMs to find your perfect fit
The beauty of Go's ecosystem is that switching between different database approaches isn't too painful. The syntax differs, but the concepts remain similar.
FAQs
Are ORMs slower than raw SQL in Go?
Yes, ORMs generally add some overhead compared to raw SQL. However, for most applications, this performance difference isn't significant enough to matter. Modern ORMs like SQLBoiler and Ent minimize this gap through code generation. Unless you're building high-throughput systems where every millisecond counts, the convenience of an ORM usually outweighs the slight performance hit.
Can I use multiple ORMs in the same project?
Technically yes, but it's rarely a good idea. Using multiple ORMs creates inconsistency in your codebase and adds unnecessary complexity. Instead, consider using a single ORM that allows you to drop down to raw SQL when needed for performance-critical operations.
How do ORMs handle database migrations in Go?
It varies by ORM:
- GORM offers Auto-Migration that syncs your struct definitions with database tables
- Ent provides a migration package to generate and run schema changes
- SQLBoiler expects your database schema to exist already, as it generates code based on it
For production environments, you might want to use dedicated migration tools like golang-migrate regardless of which ORM you choose.
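For reference, a typical golang-migrate workflow looks like this. The directory path and connection string are placeholders for your project:

```shell
# Create a pair of up/down migration files (db/migrations is an example path)
migrate create -ext sql -dir db/migrations -seq create_users_table

# Apply all pending migrations against the database
migrate -path db/migrations \
  -database "postgres://user:pass@localhost:5432/app?sslmode=disable" up
```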
Do I need to define database indexes when using an ORM?
Absolutely. ORMs don't automatically optimize your database schema. You should still think about performance and define appropriate indexes. Most Go ORMs provide ways to specify indexes in your models:
// GORM example
type User struct {
gorm.Model
Name string `gorm:"index"`
Email string `gorm:"uniqueIndex"`
}
How do ORMs handle database transactions in Go?
Most Go ORMs provide transaction support through an API similar to this:
// Start transaction
tx := db.Begin()
// Do operations
if err := tx.Create(&user).Error; err != nil {
tx.Rollback()
return err
}
if err := tx.Create(&profile).Error; err != nil {
tx.Rollback()
return err
}
// Commit if everything is OK
return tx.Commit().Error
GORM even offers a transaction function that handles the commit/rollback automatically based on whether your function returns an error.
Can ORMs work with NoSQL databases in Go?
Traditional ORMs are designed for relational databases. For NoSQL databases like MongoDB or Redis, you'll want to use database-specific packages or more generic "ODMs" (Object-Document Mappers). Some options include:
- mongo-go-driver for MongoDB
- redigo or go-redis for Redis
- gocb for Couchbase
How do I handle database connection pooling with Go ORMs?
Most Go ORMs handle connection pooling for you through their underlying database drivers. You can usually configure pool settings when initializing your DB connection:
sqlDB, err := db.DB() // Get the underlying *sql.DB from GORM
if err != nil {
panic(err)
}
sqlDB.SetMaxIdleConns(10)
sqlDB.SetMaxOpenConns(100)
sqlDB.SetConnMaxLifetime(time.Hour)
Should beginners start with an ORM or raw SQL in Go?
For beginners building real applications, starting with an ORM like GORM can help you get productive quickly. However, it's worth learning the basics of SQL and understanding what your ORM is doing under the hood. This knowledge will be invaluable when you need to optimize queries or debug issues.