Use structured JSON logging so that cloud log platforms can search, filter, and alert on individual fields instead of parsing free-form text.
Code Snippet
import json
import logging
import sys
from datetime import datetime, timezone

class StructuredLogger:
    def __init__(self, service_name: str):
        self.service = service_name
        self.logger = logging.getLogger(service_name)
        self.logger.setLevel(logging.INFO)
        if not self.logger.handlers:
            # Emit bare JSON lines to stdout, where container platforms and
            # log agents (CloudWatch agent, Datadog agent, etc.) collect them.
            handler = logging.StreamHandler(sys.stdout)
            handler.setFormatter(logging.Formatter("%(message)s"))
            self.logger.addHandler(handler)

    def _log(self, level: str, message: str, **kwargs):
        log_entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(timespec="seconds"),
            "level": level,
            "service": self.service,
            "message": message,
            **kwargs,  # arbitrary structured fields, e.g. order_id, user_id
        }
        self.logger.log(getattr(logging, level), json.dumps(log_entry))

    def info(self, message: str, **kwargs):
        self._log("INFO", message, **kwargs)

    def error(self, message: str, **kwargs):
        self._log("ERROR", message, **kwargs)

# Usage
logger = StructuredLogger("order-service")
logger.info("Order created", order_id="12345", user_id="u-789", amount=99.99)
# {"timestamp": "2024-01-15T10:30:00+00:00", "level": "INFO", "service": "order-service", "message": "Order created", "order_id": "12345", "user_id": "u-789", "amount": 99.99}
Why This Helps
- Easy to query in CloudWatch Logs Insights, Datadog, and similar log platforms
- Enables correlating a single request across services via shared fields (for example a correlation ID)
- Machine-parseable, so alerts and dashboards can key off individual fields (see the sketch below)
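To make the last bullet concrete, here is a minimal sketch of an alerting-style check that scans a file of JSON log lines and counts ERROR events per service. The app.log file name and the threshold of 10 are assumptions for illustration; in practice you would run the equivalent query inside your log platform.

import json
from collections import Counter

errors_per_service = Counter()
with open("app.log") as log_file:  # hypothetical file of one-JSON-object-per-line logs
    for line in log_file:
        event = json.loads(line)
        if event.get("level") == "ERROR":
            errors_per_service[event.get("service", "unknown")] += 1

# Assumed threshold: flag any service with more than 10 errors in the file.
for service, count in errors_per_service.items():
    if count > 10:
        print(f"ALERT: {service} logged {count} errors")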
How to Test
- Verify that each emitted line parses as valid JSON and carries the expected fields (see the test sketch below)
- Run sample queries in your log aggregation tool to confirm the fields are indexed and filterable
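A minimal pytest sketch for the first bullet, assuming the class above is importable from a module named structured_logger (a hypothetical name) and that output goes to stdout as in the snippet:

import json

from structured_logger import StructuredLogger  # hypothetical module name


def test_emits_valid_json(capsys):
    logger = StructuredLogger("test-service")
    logger.info("Order created", order_id="12345")

    line = capsys.readouterr().out.strip()
    event = json.loads(line)  # fails the test if the output is not valid JSON
    assert event["level"] == "INFO"
    assert event["service"] == "test-service"
    assert event["order_id"] == "12345"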
When to Use
All cloud-native applications; essential for microservices and distributed systems, where a single request leaves log lines in several services that need to be stitched back together.
Performance/Security Notes
Include a correlation ID in every entry so requests can be traced across services (a minimal sketch follows below), and consider OpenTelemetry if you want traces and metrics alongside logs. Because structured fields are easy to index and export, be careful not to log secrets, tokens, or personal data.
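As a rough sketch of the correlation-ID idea (the names here, such as start_request, are assumptions rather than part of the snippet above), a contextvars variable can carry a per-request ID that each log call then includes:

import contextvars
import uuid

# One correlation ID per request, stored in a context variable so it follows
# the request through threads and async tasks.
_correlation_id = contextvars.ContextVar("correlation_id", default="-")

def start_request() -> str:
    """Call once per incoming request, e.g. in middleware."""
    cid = str(uuid.uuid4())
    _correlation_id.set(cid)
    return cid

# Pass it on every log call (or merge it inside StructuredLogger._log):
#   logger.info("Order created", correlation_id=_correlation_id.get())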
Try this tip in your next project and share your results in the comments!