
Feature/unified cache #1897


Open · wants to merge 4 commits into development

Conversation

Shiva205101

🚀 Summary

This PR introduces a pluggable caching module for the GoFr framework. It lets applications inject and use either an in-memory or a Redis-backed cache implementation across application layers, with room for future observability and middleware extensions.

This implements Phase 1 of the "Unified Cache Layer" as proposed in the GoFr Summer of Code framework extensions.


🎯 Motivation

GoFr previously lacked a unified caching abstraction, leaving developers to hand-roll inconsistent caching logic. This PR standardizes cache usage via:

  • A consistent Cache interface
  • Centralized injection using app.Container.Set("cache", ...)
  • Swappable backend support (in-memory or Redis)
  • Clean handler/service/store integration without tight coupling

🧩 What's Implemented

  • Cache interface: defines Get, Set, Delete, and WrapQuery (see the sketch after this list)
  • In-memory implementation with TTL & thread safety
  • Redis-based implementation using go-redis/v9
  • Container-based cache injection via Container.Set()
  • Developer-friendly usage examples and structure
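
For reference, here is a minimal sketch of the interface shape implied by the list above. The exact definition lives in pkg/cache/cache.go; the string-only value type and time.Duration TTL below are assumptions drawn from the usage examples in this PR:

package cache

import "time"

// Cache is the pluggable abstraction introduced by this PR.
// Backends (in-memory, Redis) implement it and are registered on the
// application container at startup.
type Cache interface {
    // Get returns the cached value for key, or an error on a miss.
    Get(key string) (string, error)

    // Set stores value under key with the given TTL.
    Set(key, value string, ttl time.Duration) error

    // Delete removes key from the cache.
    Delete(key string) error

    // WrapQuery runs fn on a miss and caches its result (Phase 2 preview).
    WrapQuery(key string, ttl time.Duration, fn func() (string, error)) (string, error)
}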

📦 Example Usage

Registering the cache in main.go

cache := inmemory.New() // or redis.New(redisClient)
app.Container.Set("cache", cache)

Using it in application logic (service or store layer)

func (s *UserService) GetCachedUser(id string) (string, error) {
    cache := s.cache

    val, err := cache.Get("user:" + id)
    if err == nil {
        return val, nil
    }

    // Fallback to DB or compute logic
    val = "some-value"
    _ = cache.Set("user:"+id, val, 5*time.Minute)
    return val, nil
}
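
For completeness, the s.cache field used above is assumed to be wired in at construction time. The constructor below is a hypothetical illustration of that wiring, not part of this PR:

// UserService depends only on the cache.Cache interface, so the
// backend (in-memory or Redis) can be swapped without code changes.
type UserService struct {
    cache cache.Cache
}

// NewUserService injects whichever Cache was registered in main.go.
func NewUserService(c cache.Cache) *UserService {
    return &UserService{cache: c}
}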

Optional: WrapQuery (Phase 2 Preview)

result, err := cache.WrapQuery("user:123", 5*time.Minute, func() (string, error) {
    return fetchUserFromDB()
})
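
Conceptually, WrapQuery is the cache-aside pattern: check the cache, call the loader on a miss, and write the result back. The helper below is an illustrative sketch of that pattern (assuming the Cache interface above), not the actual Phase 2 implementation:

// wrapQuery returns the cached value for key if present; otherwise it
// calls fn, caches the result with the given TTL, and returns it.
func wrapQuery(c Cache, key string, ttl time.Duration, fn func() (string, error)) (string, error) {
    if val, err := c.Get(key); err == nil {
        return val, nil // cache hit
    }

    val, err := fn()
    if err != nil {
        return "", err // do not cache failed lookups
    }

    _ = c.Set(key, val, ttl) // best-effort write-back
    return val, nil
}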

🧪 Tests

  • ✅ Manual tests for in-memory and Redis integration
  • ✅ Sample usage validated in Handler → Service → Store stack

📂 Affected Files

pkg/cache/cache.go           # Interface definition
pkg/cache/inmemory.go        # In-memory backend
pkg/cache/redis.go           # Redis backend
pkg/cache/container.go       # Optional helper for container registration
examples/main.go             # Demonstration of registration and use

📊 Observability (Planned)

  • Prometheus metrics: cache_hits_total, cache_misses_total, cache_errors_total
  • OpenTelemetry spans: cache.Get, cache.Set, cache.WrapQuery
  • Logging middleware: WithLogging(cache, logger) wrapper
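
For illustration, the planned WithLogging middleware could decorate any Cache implementation along these lines. This is a hypothetical sketch using the standard library logger; the final names and signatures may differ:

package cache

import (
    "log"
    "time"
)

// loggingCache decorates another Cache and logs every operation.
type loggingCache struct {
    next   Cache
    logger *log.Logger
}

// WithLogging wraps an existing Cache with logging middleware.
func WithLogging(next Cache, logger *log.Logger) Cache {
    return &loggingCache{next: next, logger: logger}
}

func (l *loggingCache) Get(key string) (string, error) {
    val, err := l.next.Get(key)
    l.logger.Printf("cache.Get key=%s hit=%t", key, err == nil)
    return val, err
}

func (l *loggingCache) Set(key, value string, ttl time.Duration) error {
    l.logger.Printf("cache.Set key=%s ttl=%s", key, ttl)
    return l.next.Set(key, value, ttl)
}

func (l *loggingCache) Delete(key string) error {
    l.logger.Printf("cache.Delete key=%s", key)
    return l.next.Delete(key)
}

func (l *loggingCache) WrapQuery(key string, ttl time.Duration, fn func() (string, error)) (string, error) {
    l.logger.Printf("cache.WrapQuery key=%s", key)
    return l.next.WrapQuery(key, ttl, fn)
}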

🗒 Related

Closes issue: TBD by maintainers
Proposal: Unified Cache Layer – Phase 1 (Application-level injection)


🙌 Notes for Reviewers

  • This PR focuses solely on Phase 1: registration and basic cache operations.
  • Context is intentionally not passed to cache operations for simplicity.
  • Backend choice (Redis vs Memory) is fully up to the app developer.
  • All implementations are thread-safe.

Future PRs will extend this with query wrapping logic, metrics, and trace hooks.

Note: The Container is a struct. We use app.Container.Set("cache", cache) to register dependencies.
