Cache geocoding API results in Redis (#4653) #4688
Conversation
Rounds coordinates to ~1 km precision (2 decimal places) for the cache key. Same-location lookups across sessions and instances now hit Redis instead of making fresh Google Maps API calls. The Geocoding API is not covered by GCP credits, so this is a direct cost reduction. Resolves the existing TODO comment in the code.

Fixes #4653

Co-Authored-By: Claude Opus 4.6 <[email protected]>
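A minimal sketch of the read-through cache this describes, under stated assumptions: the client construction, helper names (`_cache_key`, `geocode_cached`, `call_google_maps_api`), and return shape are illustrative rather than the project's actual code; only the 2-decimal rounding and the 48h TTL come from this PR.

```python
import json

import redis

CACHE_TTL_SECONDS = 48 * 60 * 60  # 48h TTL, matching the ex=172800 used in this PR

# Hypothetical client; the real backend presumably has its own Redis helper.
r = redis.Redis()


def _cache_key(latitude: float, longitude: float) -> str:
    # Round to 2 decimal places (~1 km) so nearby lookups share one entry.
    return f"geocode:{latitude:.2f}:{longitude:.2f}"


def call_google_maps_api(latitude: float, longitude: float) -> dict | None:
    # Stand-in for the real Google Maps reverse-geocoding request.
    return {"latitude": latitude, "longitude": longitude, "address": "stub"}


def geocode_cached(latitude: float, longitude: float) -> dict | None:
    """Serve geocoding data from Redis when possible, otherwise call the API."""
    key = _cache_key(latitude, longitude)

    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)

    result = call_google_maps_api(latitude, longitude)
    if result is not None:
        r.set(key, json.dumps(result), ex=CACHE_TTL_SECONDS)
    return result
```

Rounding before building the key is what lets lookups for the same location, from different sessions or instances, collapse onto a single Redis entry.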
Code Review
This pull request introduces Redis caching for geocoding API results, which is a great improvement for performance and cost savings. The implementation is straightforward. My review includes a few suggestions to improve robustness and maintainability, mainly around error handling for Redis operations and simplifying data serialization using Pydantic's built-in methods. These changes will make the caching mechanism more resilient and easier to maintain.
    except Exception:
        pass
Silently catching all exceptions with except Exception: pass is risky as it can hide critical issues with Redis connectivity or data corruption. It's better to log the exception. This allows for monitoring and debugging of the caching layer while still allowing the function to gracefully fall back to the API call. Please also add import logging at the top of the file.
    -except Exception:
    -    pass
    +except Exception as e:
    +    logging.warning('Redis cache get failed for key %s: %s', cache_key, e)
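As a follow-up to this suggestion, a small sketch of what the guarded read might look like pulled into a helper; the function name and logger setup are assumptions, not the project's code.

```python
import json
import logging

logger = logging.getLogger(__name__)


def cache_get_json(r, cache_key: str):
    """Fetch and decode a cached JSON value; return None on any failure."""
    try:
        raw = r.get(cache_key)
        if raw is not None:
            return json.loads(raw)
    except Exception as e:
        # Logged rather than swallowed, so Redis outages stay visible.
        logger.warning("Redis cache get failed for key %s: %s", cache_key, e)
    return None  # caller falls back to the Google Maps API
```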
    cache_data = {
        'google_place_id': geo.google_place_id,
        'latitude': geo.latitude,
        'longitude': geo.longitude,
        'address': geo.address,
        'location_type': geo.location_type,
    }
    r.set(cache_key, json.dumps(cache_data), ex=172800)
Instead of manually constructing the cache_data dictionary, you can use the Pydantic model's built-in .dict() method to serialize the geo object. This is more concise and robust, as it will automatically adapt if the Geolocation model is updated in the future.
    r.set(cache_key, json.dumps(geo.dict()), ex=172800)
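For context, .dict() is the Pydantic v1 serializer; the author later adopted model_dump(), its Pydantic v2 equivalent. A minimal round-trip sketch, assuming a Geolocation model shaped like the cached fields above (the field defaults are assumptions):

```python
import json

from pydantic import BaseModel


class Geolocation(BaseModel):
    # Assumed shape, mirroring the fields cached in the snippet above.
    google_place_id: str | None = None
    latitude: float
    longitude: float
    address: str | None = None
    location_type: str | None = None


geo = Geolocation(latitude=37.77, longitude=-122.42, address="San Francisco")

# Serialize for the cache: model_dump() on Pydantic v2, geo.dict() on v1.
payload = json.dumps(geo.model_dump())

# On a cache hit, rebuild the model from the stored JSON.
restored = Geolocation(**json.loads(payload))
assert restored == geo
```

Rebuilding the model from the cached JSON also gives a natural place to catch schema mismatches and fall through to a fresh API call, which later commits in this PR add.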
    except Exception:
        pass
Similar to the cache read, silently ignoring exceptions on cache write is not ideal. Logging the error will provide visibility into any issues with writing to Redis, without breaking the request flow. Please ensure import logging is present at the top of the file.
    except Exception as e:
        logging.warning('Failed to cache geocode for key %s: %s', cache_key, e)
We need to be more precise about the local cache. Let's go with 100 m instead of 1 km. Should we?
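For reference on what those precisions mean on the ground (a quick back-of-the-envelope check, not project code): one degree of latitude is roughly 111 km, so two decimals is about 1.1 km and three decimals about 111 m, with longitude spacing shrinking by cos(latitude).

```python
import math

KM_PER_DEG_LAT = 111.32  # approximate kilometres per degree of latitude

for places in (2, 3):
    step = 10 ** -places                        # size of one rounding bucket, in degrees
    lat_m = step * KM_PER_DEG_LAT * 1000        # metres of latitude per bucket
    lon_m = lat_m * math.cos(math.radians(45))  # metres of longitude at 45° latitude
    print(f"{places} decimals: ~{lat_m:.0f} m latitude, ~{lon_m:.0f} m longitude at 45°")
```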
- Change cache key rounding from .2f (~1km) to .3f (~100m) per review
- Replace silent except-pass with logging.warning for cache failures
- Use model_dump() instead of manual dict construction (Pydantic v2)
- Add 9 unit tests covering cache hits, misses, precision, and Redis failures

Co-Authored-By: Claude Opus 4.6 <[email protected]>
Done: pushed with 100 m precision (.3f coordinate rounding).

Changes:
- Cache key rounding .2f → .3f (~1 km → ~100 m)
- logging.warning on cache failures instead of silent except-pass
- model_dump() for Pydantic v2 serialization
- 9 new unit tests (cache hits, misses, precision, Redis failures)

All tests pass.
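An illustrative pytest sketch of the kind of coverage described above; the helpers here are local stand-ins written for the example, not the repo's actual geocoding module or test file.

```python
import json
import logging


def make_key(lat: float, lng: float) -> str:
    # Mirrors the post-review key format: 3 decimal places (~100 m).
    return f"geocode:{lat:.3f}:{lng:.3f}"


def cached_geocode(r, lat, lng, api_call):
    """Tiny stand-in for the cached lookup: Redis first, API on miss or error."""
    key = make_key(lat, lng)
    try:
        raw = r.get(key)
        if raw is not None:
            return json.loads(raw)
    except Exception as e:
        logging.warning("Redis cache get failed for key %s: %s", key, e)
    return api_call(lat, lng)


class BrokenRedis:
    def get(self, key):
        raise ConnectionError("redis down")


def test_nearby_points_share_a_cache_key():
    # Coordinates that differ only past the third decimal collapse together.
    assert make_key(37.77490, -122.41940) == make_key(37.77501, -122.41939)


def test_distant_points_get_different_cache_keys():
    assert make_key(37.775, -122.419) != make_key(40.713, -74.006)


def test_redis_failure_falls_back_to_api():
    result = cached_geocode(BrokenRedis(), 37.775, -122.419,
                            lambda lat, lng: {"address": "fallback"})
    assert result == {"address": "fallback"}
```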
Address tester gaps:
- API status != OK returns None
- Missing/empty place_id returns None
- Empty/missing types gives location_type=None
- Invalid JSON in cache falls through to API with warning
- Schema mismatch in cache falls through to API with warning

15 tests total, all pass.

Co-Authored-By: Claude Opus 4.6 <[email protected]>
- Use place.get('place_id') instead of place['place_id'] to handle
missing key gracefully (returns None instead of KeyError)
- Add test for missing place_id key in API response
16 tests total, all pass.
Co-Authored-By: Claude Opus 4.6 <[email protected]>
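A small sketch of the safer extraction these commits describe, assuming the Google geocoding response shape (a list of results, each carrying place_id and types); the function name and returned fields are illustrative, not the project's actual parser.

```python
def parse_first_place(results: list[dict]) -> dict | None:
    """Pull the cached fields out of the first geocoding result, tolerating
    missing keys instead of raising KeyError."""
    if not results:
        return None
    place = results[0]
    place_id = place.get("place_id")  # None instead of KeyError when absent
    if not place_id:
        return None
    types = place.get("types") or []
    return {
        "google_place_id": place_id,
        "location_type": types[0] if types else None,
    }
```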
Tests against real Redis + Google Maps API:
- First call hits API, caches with 48h TTL
- Second call serves from cache (faster)
- Cache key uses 3 decimal precision (~100m)
- Distant coords produce different keys
- Auto-cleanup of test cache keys

Run: cd backend && source env && pytest tests/integration/test_geocoding_cache_live.py -v -s

Co-Authored-By: Claude Opus 4.6 <[email protected]>
lgtm
Summary
Cache Google Maps geocoding API results in Redis to reduce API costs ($1.1K/mo).
Changes:
- Geocoding results cached in Redis with ~100m precision (.3f coordinate rounding) and 48h TTL
- Safe place.get('place_id') access (prevents KeyError on malformed responses)
- Cache failures reported via logging.warning() (no silent swallowing)
- model_dump() for Pydantic v2 serialization

Closes #4653
Test plan
Risks/edge cases
by AI for @beastoin