CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379/1',
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
            'PASSWORD': 'your-redis-password',
            'SOCKET_CONNECT_TIMEOUT': 5,  # seconds to wait when opening a connection
            'SOCKET_TIMEOUT': 5,          # seconds to wait on reads/writes
            'CONNECTION_POOL_KWARGS': {'max_connections': 50},
        },
        'KEY_PREFIX': 'myapp',  # namespace keys to avoid collisions with other apps
        'VERSION': 1,
    }
}
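The password above is only a placeholder; in practice I load secrets from the environment rather than hardcoding them in settings. A minimal sketch, assuming a REDIS_PASSWORD environment variable (the variable name is my own choice, not part of the config above):

import os

CACHES['default']['OPTIONS']['PASSWORD'] = os.environ.get('REDIS_PASSWORD', '')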
# Use Redis for sessions
SESSION_ENGINE = 'django.contrib.sessions.backends.cache'
SESSION_CACHE_ALIAS = 'default'
from django.core.cache import cache
from django.views.decorators.cache import cache_page


def get_popular_products():
    """Cache an expensive query (cache-aside pattern)."""
    cache_key = 'popular_products'
    products = cache.get(cache_key)
    if products is None:
        from products.models import Product
        products = list(Product.objects.filter(
            featured=True
        ).order_by('-sales')[:10])
        cache.set(cache_key, products, 3600)  # Cache for 1 hour
    return products
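The closing summary mentions cache.get_or_set(), which collapses the manual get/set above into a single call. A sketch under the same assumptions (Product model, 'popular_products' key); the function name is illustrative:

def get_popular_products_via_get_or_set():
    """Same cache-aside logic, using cache.get_or_set() with a callable default."""
    from products.models import Product
    return cache.get_or_set(
        'popular_products',
        lambda: list(Product.objects.filter(featured=True).order_by('-sales')[:10]),
        timeout=3600,  # cache for 1 hour
    )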
# Cache view for 15 minutes
@cache_page(60 * 15)
def product_list(request):
    # View implementation
    pass
# Invalidate cache
def clear_product_cache():
    cache.delete('popular_products')
    cache.delete_pattern('product:*')  # django-redis extension: delete keys matching a pattern
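Pattern deletes are one option; the summary below also mentions versioning. A common form of it keeps a version counter in the cache and folds it into every key, so bumping the counter orphans the old entries and their TTLs clean them up. A sketch with illustrative key names ('catalog_version', 'catalog:vN') that are not from the code above:

def catalog_cache_key():
    # The counter lives in the cache itself; readers build keys from it.
    version = cache.get_or_set('catalog_version', 1, timeout=None)
    return f'catalog:v{version}'

def invalidate_catalog():
    # Bump the counter; readers move to new keys, old entries expire via their TTLs.
    cache.add('catalog_version', 1, timeout=None)  # ensure the counter exists
    cache.incr('catalog_version')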
Redis provides fast in-memory caching for Django. I configure it as the cache backend and also use it for session storage. I cache expensive querysets, computed values, and API responses. The cache.get_or_set() method simplifies the cache-aside pattern shown above. For cache invalidation, I use key versioning or targeted deletes, and I set TTLs according to how quickly the underlying data changes. Redis also enables rate limiting, message queues, and pub/sub; a minimal rate-limiting sketch follows below. For distributed systems, Redis centralizes the cache across all application servers. Together, these techniques dramatically improve response times for read-heavy workloads.
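Since rate limiting comes up above, here is a minimal fixed-window sketch built on the same cache client. The limit, window, and key format are my own assumptions, and it ignores edge cases such as the key expiring between the two calls:

def allow_request(user_id, limit=100, window=60):
    """Fixed-window limiter: at most `limit` requests per `window` seconds."""
    key = f'ratelimit:{user_id}'
    cache.add(key, 0, timeout=window)  # create the counter with its TTL if missing
    return cache.incr(key) <= limit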