
Cache vs Database: When to Use What

Problem

Deciding when to add a cache layer (Redis) versus querying the database directly for frequently accessed data.

Constraints

  • Need fast response times for user-facing queries
  • Data changes infrequently (user profiles, content metadata)
  • High read volume on credit balances
  • Budget constraints (cache adds cost)

Options Comparison

Database Only

Pros

  • Single source of truth
  • No cache invalidation complexity
  • Always fresh data
  • Simpler architecture
  • Lower infrastructure cost

Cons

  • Slower for frequent reads
  • Higher database load
  • More expensive at scale (DB connections)
  • Limited by DB connection pool

Best For

  • Low read volume
  • Data changes frequently
  • Simple applications
  • When consistency is critical

Worst For

  • High read volume
  • Expensive queries
  • Slow database
  • When latency matters

Scaling Characteristics

Reads: Fair
Writes: Good
Horizontal: Fair

Cache + Database

Pros

  • Dramatically faster reads (sub-millisecond)
  • Reduces database load
  • Better user experience
  • Cost effective at scale
  • Enables higher throughput

Cons

  • Cache invalidation complexity
  • Stale data risk
  • Additional infrastructure
  • More moving parts

Best For

  • High read volume
  • Data changes infrequently
  • Expensive queries
  • When performance matters

Worst For

  • Frequently changing data
  • When consistency is critical
  • Simple, low-traffic apps

Scaling Characteristics

Reads: Excellent
Writes: Good
Horizontal: Excellent

Decision Framework

Consider:

  • Read/write ratio
  • Data freshness requirements
  • Query cost
  • Traffic volume
  • Consistency needs

Recommendation

Use a cache when read volume is high, data changes infrequently, or queries are expensive. Stick with the database alone when data changes frequently or strict consistency is critical. A read-path sketch follows below.
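
To make the cache side of this recommendation concrete, here is a minimal cache-aside read sketch in Python with redis-py. The key format, the 300-second TTL, and the fetch_profile_from_db stub are illustrative assumptions, not the actual AuthorAI code.

  import json

  import redis

  r = redis.Redis(decode_responses=True)

  PROFILE_TTL_SECONDS = 300  # assumed TTL: bounds how stale a rarely changing profile can be

  def fetch_profile_from_db(user_id: str) -> dict:
      # Placeholder for the real (expensive) database query.
      return {"id": user_id, "name": "example"}

  def get_profile(user_id: str) -> dict:
      """Cache-aside read: try Redis first, fall back to the database, then populate the cache."""
      key = f"profile:{user_id}"
      cached = r.get(key)
      if cached is not None:
          return json.loads(cached)  # cache hit: fast path, no database round trip
      profile = fetch_profile_from_db(user_id)  # cache miss: pay the query cost once
      r.set(key, json.dumps(profile), ex=PROFILE_TTL_SECONDS)
      return profile

The TTL doubles as a safety net: even if an invalidation is missed, a stale entry expires on its own within that window.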

Reasoning

For AuthorAI, I cache user credit balances and session data in Redis. Credit balances change infrequently (only on purchase or usage), while reads happen on every API request. The cache cuts database load by 80%+ and improves response times. Credit deductions, however, always hit the database first for consistency and only then update the cache.
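
A sketch of that write path, under the same assumptions (redis-py, illustrative key names and TTL); db_deduct_credits is a placeholder for the real transactional query, not the production persistence layer.

  import redis

  r = redis.Redis(decode_responses=True)

  CREDITS_TTL_SECONDS = 60  # assumed TTL; the database stays the source of truth

  # Placeholder persistence layer; in production this would be a transactional SQL update.
  _balances = {"user-1": 100}

  def db_deduct_credits(user_id: str, amount: int) -> int:
      _balances[user_id] = _balances.get(user_id, 0) - amount
      return _balances[user_id]

  def deduct_credits(user_id: str, amount: int) -> int:
      """Deduct credits in the database first; refresh the cache only after the write succeeds."""
      new_balance = db_deduct_credits(user_id, amount)  # authoritative write
      r.set(f"credits:{user_id}", new_balance, ex=CREDITS_TTL_SECONDS)  # keep reads fast and fresh
      return new_balance

Refreshing (or simply deleting) the key only after the commit avoids the failure mode where a cached balance outlives the write that changed it.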

Scaling Considerations

Database: add read replicas, but accept that replication lag bounds how fresh replica reads can be. Cache: horizontal scaling is straightforward with Redis Cluster. Cache hit ratio is the key metric; aim for 80%+ for caching to pay off.
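
One way to track that metric, assuming redis-py and the server-wide counters Redis exposes under INFO stats:

  import redis

  r = redis.Redis(decode_responses=True)

  def cache_hit_ratio() -> float:
      """Server-wide hit ratio from Redis keyspace counters: hits / (hits + misses)."""
      stats = r.info("stats")
      hits = stats["keyspace_hits"]
      misses = stats["keyspace_misses"]
      total = hits + misses
      return hits / total if total else 0.0

  if __name__ == "__main__":
      # Compare against the ~80% target mentioned above.
      print(f"cache hit ratio: {cache_hit_ratio():.2%}")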