Cache Backends¶
The caching system supports multiple backend implementations, each optimized for different use cases and deployment scenarios.
Overview¶
The two-layer caching architecture uses different backends for local and shared caching:
- Local Cache: Fast in-memory storage within each application instance
- Shared Cache: Persistent storage shared across all application instances
Redis Cache¶
RedisCache¶
The standard Redis backend for shared caching across multiple application instances.
Features:
- Persistent storage that survives application restarts
- Shared between all application instances
- Built-in serialization with pickle
- Automatic key expiration
- Key tracking for bulk operations
Configuration:
```json
{
  "CACHE_TYPE": "RedisCache",
  "CACHE_REDIS_URL": "redis://localhost:6379",
  "CACHE_REDIS_DB": 0,
  "CACHE_DEFAULT_TIMEOUT": 300,
  "CACHE_KEY_PREFIX": "alan_cache_"
}
```
When to use:
- Production deployments with multiple instances
- Data that needs to persist across restarts
- Sharing cache between different services
RedisCacheNoExceptions¶
Enhanced Redis backend that handles serialization failures gracefully.
Features:
- Automatic cache key deletion on deserialization errors
- Prevents application crashes from corrupted cache data
- Same performance as standard RedisCache
- Automatic function key tracking
Benefits:
- Robust handling of schema changes
- Prevents pickle errors from breaking the application
- Self-healing cache behavior
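The self-healing behavior can be illustrated with a minimal sketch. This is not the actual `RedisCacheNoExceptions` implementation; the in-memory dict merely stands in for a Redis connection:

```python
import pickle


class SelfHealingCache:
    """Illustrative sketch: drop a cache key whenever its stored
    bytes can no longer be unpickled, instead of raising."""

    def __init__(self):
        self._store = {}  # stands in for a Redis connection

    def set(self, key, value):
        self._store[key] = pickle.dumps(value)

    def get(self, key):
        raw = self._store.get(key)
        if raw is None:
            return None
        try:
            return pickle.loads(raw)
        except Exception:
            # Corrupted or schema-incompatible entry: delete it so the
            # caller falls through to recomputation instead of crashing.
            del self._store[key]
            return None
```

A stale entry written by an older code version simply behaves like a cache miss on the next read.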
RedisCacheAtomic¶
Redis backend with atomic write operations to prevent race conditions.
Features:
- Uses Redis WATCH/MULTI/EXEC for atomic operations
- Prevents cache corruption from concurrent writes
- Automatic retry on watch failures
- Supports both at_least_once and at_most_once semantics
Configuration:
```python
@cached_for(
    hours=1,
    atomic_writes="at_most_once",  # or "at_least_once"
)
def expensive_operation():
    return compute_result()
```
When to use:
- Functions with expensive side effects
- Singleton initialization
- Critical data that must be computed exactly once
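The optimistic-locking pattern behind WATCH/MULTI/EXEC can be sketched without a live Redis server. Here a version counter stands in for WATCH, and a failed `exec_set` plays the role of EXEC aborting under concurrent modification; all names are illustrative, not the backend's API:

```python
class OptimisticStore:
    """Stand-in for Redis optimistic locking: WATCH records a version,
    and the commit (EXEC) fails if a concurrent write bumped it."""

    def __init__(self):
        self._data = {}
        self._versions = {}

    def watch(self, key):
        # Like WATCH: remember the version our computation is based on.
        return self._versions.get(key, 0)

    def exec_set(self, key, value, watched_version):
        # Like MULTI/EXEC: commit only if the key is unchanged.
        if self._versions.get(key, 0) != watched_version:
            return False  # analogous to a WatchError
        self._data[key] = value
        self._versions[key] = watched_version + 1
        return True


def atomic_set(store, key, compute, max_retries=3):
    """Retry loop mirroring the backend's automatic retry on watch failures."""
    for _ in range(max_retries):
        version = store.watch(key)
        value = compute()  # expensive work happens outside the transaction
        if store.exec_set(key, value, version):
            return value
    raise RuntimeError("cache write kept conflicting; giving up")
```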
Simple Cache¶
SimpleCache¶
In-memory cache backend for local storage within a single application instance.
Features:
- Ultra-fast memory-based storage
- LRU eviction policy
- Configurable size limits
- Thread-safe operations
Configuration:
```json
{
  "LOCALCACHE_TYPE": "SimpleCache",
  "LOCALCACHE_DEFAULT_TIMEOUT": 300,
  "LOCALCACHE_THRESHOLD": 500
}
```
`LOCALCACHE_THRESHOLD` sets the maximum number of cached items.
Performance characteristics:
- Fastest cache access (no network overhead)
- Limited by available memory
- Lost on application restart
- Not shared between instances
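A minimal sketch of the behavior described above: a thread-safe in-memory store with per-entry expiry, capped at a threshold with LRU eviction. The names and eviction details are illustrative, not the actual SimpleCache implementation:

```python
import threading
import time
from collections import OrderedDict


class TinyLocalCache:
    """Sketch of a SimpleCache-style store: thread-safe, size-capped,
    evicting the least recently used entry once the threshold is hit."""

    def __init__(self, threshold=500, default_timeout=300):
        self._items = OrderedDict()  # key -> (expires_at, value)
        self._threshold = threshold
        self._default_timeout = default_timeout
        self._lock = threading.Lock()

    def set(self, key, value, timeout=None):
        timeout = self._default_timeout if timeout is None else timeout
        with self._lock:
            if key not in self._items and len(self._items) >= self._threshold:
                self._items.popitem(last=False)  # evict least recently used
            self._items[key] = (time.monotonic() + timeout, value)
            self._items.move_to_end(key)

    def get(self, key):
        with self._lock:
            item = self._items.get(key)
            if item is None:
                return None
            if item[0] < time.monotonic():
                self._items.pop(key, None)  # expired: behave like a miss
                return None
            self._items.move_to_end(key)  # mark as recently used
            return item[1]
```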
SimpleCacheNoSerializer¶
SimpleCache variant that stores objects without serialization.
Features:
- Direct object storage (no pickle overhead)
- Perfect for ORM objects and complex data structures
- Fastest possible cache performance
- Prevents serialization/deserialization issues
When to use:
- Caching ORM model instances
- Complex objects that shouldn't be serialized
- Maximum performance requirements
- Local-only data
Configuration:
```python
@cached_for(
    minutes=30,
    local_ram_cache_only=True,
    no_serialization=True,
)
def get_user_model(user_id: int):
    return User.query.get(user_id)  # stores the ORM object directly
```
NullCache¶
Backend that disables caching entirely.
Features:
- All operations are no-ops
- Functions always execute
- Zero performance overhead
- Useful for development and testing
Configuration:
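No backend-specific options are needed; a plausible configuration, assuming the same keys as the other backends:

```json
{
  "CACHE_TYPE": "NullCache"
}
```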
When to use:
- Development environments
- Testing scenarios
- Debugging cache-related issues
- Temporary cache disabling
FileSystemCache¶
File-based caching backend for persistent local storage.
Features:
- Persistent storage on local filesystem
- Survives application restarts
- Not shared between instances
- Slower than memory-based caches
Configuration:
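A plausible configuration, assuming Flask-Caching-style keys; the `CACHE_DIR` path and threshold value are illustrative:

```json
{
  "CACHE_TYPE": "FileSystemCache",
  "CACHE_DIR": "/var/cache/myapp",
  "CACHE_THRESHOLD": 500,
  "CACHE_DEFAULT_TIMEOUT": 300
}
```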
When to use:
- Single-instance deployments
- Development environments
- When Redis is not available
- Persistent cache without Redis overhead
Backend Selection Guide¶
Production Recommendations¶
Multi-instance deployment:
```json
{
  "CACHE_TYPE": "shared.caching.backends.rediscache.RedisCacheNoExceptions",
  "LOCALCACHE_TYPE": "SimpleCache"
}
```
Single-instance deployment:
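A plausible setup for a single instance, assuming the same keys as the example above (the `CACHE_DIR` path is illustrative):

```json
{
  "CACHE_TYPE": "FileSystemCache",
  "CACHE_DIR": "/var/cache/myapp",
  "LOCALCACHE_TYPE": "SimpleCache"
}
```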
Development Recommendations¶
With Redis available:
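A plausible development setup, assuming the same keys as the Redis example above:

```json
{
  "CACHE_TYPE": "RedisCache",
  "CACHE_REDIS_URL": "redis://localhost:6379",
  "LOCALCACHE_TYPE": "SimpleCache"
}
```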
Without Redis:
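A plausible fallback using only in-memory storage; note that nothing is persisted or shared in this mode (whether SimpleCache is accepted as the shared-layer backend is an assumption):

```json
{
  "CACHE_TYPE": "SimpleCache",
  "LOCALCACHE_TYPE": "SimpleCache"
}
```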
Testing Recommendations¶
Disable caching:
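A plausible configuration disabling both layers, assuming the same keys as the examples above:

```json
{
  "CACHE_TYPE": "NullCache",
  "LOCALCACHE_TYPE": "NullCache"
}
```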
Or programmatically:
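One possible programmatic approach, sketched here as a helper that overrides both layers before the app initializes its caches; `make_test_config` is a hypothetical name, and the keys mirror the JSON examples above:

```python
def make_test_config(base_config: dict) -> dict:
    """Return a copy of the config with both cache layers disabled."""
    return {
        **base_config,
        "CACHE_TYPE": "NullCache",
        "LOCALCACHE_TYPE": "NullCache",
    }
```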
Custom Backends¶
You can implement custom cache backends by extending the base classes:
Custom Redis Backend¶
```python
from typing import Any

from shared.caching.backends.rediscache import RedisCacheNoExceptions


class MyCustomRedisCache(RedisCacheNoExceptions):
    def set(self, key: str, value: Any, timeout: int | None = None) -> Any:
        # Custom logic before setting
        result = super().set(key, value, timeout)
        # Custom logic after setting
        return result
```
Custom SimpleCache Backend¶
```python
from typing import Any

from shared.caching.backends.simplecache import SimpleCache


class MyCustomSimpleCache(SimpleCache):
    def get(self, key: str) -> Any:
        # Custom retrieval logic
        return super().get(key)
```
Usage¶
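A custom backend can presumably be selected by dotted path, mirroring the production example above; `myapp.caching` is a placeholder module:

```json
{
  "CACHE_TYPE": "myapp.caching.MyCustomRedisCache"
}
```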
Backend Performance Comparison¶
| Backend | Speed | Persistence | Shared | Memory Usage |
|---|---|---|---|---|
| SimpleCache | Very fast | No | No | High |
| SimpleCacheNoSerializer | Fastest | No | No | Highest |
| RedisCache | Fast | Yes | Yes | Low |
| FileSystemCache | Slow | Yes | No | Low |
| NullCache | N/A | No | No | None |
Monitoring and Debugging¶
Backend Status¶
Check which backends are active:
```python
from shared.caching.cache import alan_cache

print(f"Shared cache: {type(alan_cache.shared_cache.cache)}")
print(f"Local cache: {type(alan_cache.local_cache.cache)}")
print(f"Local no-serializer cache: {type(alan_cache.local_cache_no_serializer.cache)}")
```
Backend Configuration¶
View current backend configuration:
```python
from shared.caching.cache import alan_cache

# Check cache configuration
print(alan_cache.shared_cache.config)
print(alan_cache.local_cache.config)
```
Performance Metrics¶
Monitor backend-specific metrics:
```python
from shared.caching.cache import alan_cache

# Redis backend metrics
if hasattr(alan_cache.shared_cache.cache, "info"):
    redis_info = alan_cache.shared_cache.cache.info()
    print(f"Redis used memory: {redis_info.get('used_memory_human')}")

# SimpleCache metrics (reaches into a private attribute)
if hasattr(alan_cache.local_cache.cache, "_cache"):
    cache_size = len(alan_cache.local_cache.cache._cache)
    print(f"Local cache entries: {cache_size}")
```