Add redis and in-memory cache #37
A question about caching failed GET requests.
Also, what about adding a Redis instance to the docker compose for local development?
Co-authored-by: Leandro <alfetopito@users.noreply.github.com>
This PR adds our own caching logic.
Caching
First of all, it adds the `caching` Fastify plugin to make it easy to define the `cache-control` headers. It also adds a `cache` object to the `fastify` instance which allows getting/saving objects from/to an abstract cache. This means the main caching logic does not depend on any particular cache implementation.
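The abstract cache can be pictured as a minimal get/set store with per-entry TTLs. A rough sketch of that idea (the class and method names here are illustrative, not the PR's actual API):

```javascript
// Minimal in-memory cache with per-entry TTL.
// Illustrative only: the real code delegates to an abstract cache
// that may be backed by Redis or by memory.
class InMemoryCache {
  constructor() {
    this.store = new Map();
  }

  // Save a value for ttlMs milliseconds.
  set(key, value, ttlMs) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  // Return the value, or undefined if missing or expired.
  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazily evict expired entries
      return undefined;
    }
    return entry.value;
  }
}
```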
Redis vs In-memory
The environment variables now allow you to specify `REDIS_ENABLED=true` (false by default). EDIT: Now it's enabled if you specify the `REDIS_HOST` environment variable.

When Redis is enabled, the caching will happen in Redis. Otherwise, an in-memory cache is used (not suitable for large applications in production).
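Per the edit above, the backend selection boils down to whether `REDIS_HOST` is set. A sketch of that decision (the function name is illustrative):

```javascript
// Illustrative sketch: Redis caching is enabled whenever REDIS_HOST is set;
// otherwise the app falls back to the in-memory cache.
function cacheBackend(env) {
  return env.REDIS_HOST ? 'redis' : 'in-memory';
}
```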
Redis integration
Additionally, I added the redis plugin, which lets us access the Redis logic through the convenient `fastify.redis` object. This would allow us to use more advanced Redis features when implementing any of the services.

TODO: Pending: do not load it if Redis is disabled (`REDIS_ENABLED=false`).
Bff cache
Additionally, I've written a plugin so we can have our own logic to automatically cache things based on the `cache-control` directive.

It should be very easy to use now. Each endpoint defines its own caching policy using the `cache-control` directive. Let's see it with these 2 endpoints:
Both endpoints return the same data after 2 seconds. The difference is that `/hello-cached` returns a `cache-control` header. This header is observed by the `bffCache` plugin, which will cache the content for the instructed time. For subsequent requests it will return the cached content, and it keeps track of the TTL at the time of the request so it can inform clients about it.
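That TTL bookkeeping can be sketched as: parse `max-age` from the original response, and on a cache hit re-emit a `cache-control` header with the time remaining. All names below are illustrative, not the plugin's actual internals:

```javascript
// Illustrative sketch of the bffCache idea: cache a response according to
// its max-age, and report the *remaining* freshness on later hits.
const cache = new Map();

// Extract max-age (in seconds) from a cache-control header, if any.
function maxAgeSeconds(cacheControl) {
  const match = /max-age=(\d+)/.exec(cacheControl || '');
  return match ? Number(match[1]) : null;
}

// Store a response whose cache-control header asks for caching.
function storeResponse(key, body, cacheControl, now = Date.now()) {
  const maxAge = maxAgeSeconds(cacheControl);
  if (maxAge === null) return; // no directive: do not cache
  cache.set(key, { body, expiresAt: now + maxAge * 1000 });
}

// On a hit, return the body plus a cache-control header carrying the TTL
// left at the time of *this* request, so clients see the real freshness.
function serveFromCache(key, now = Date.now()) {
  const entry = cache.get(key);
  if (!entry || now >= entry.expiresAt) return null; // miss or stale
  const remaining = Math.ceil((entry.expiresAt - now) / 1000);
  return { body: entry.body, cacheControl: `public, max-age=${remaining}` };
}
```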
Test
Test hello endpoint
Execute several times:
curl -i http://localhost:3010/examples/hello
You should notice a 2-second delay in the response every single time. Also, there are no `cache-control` or `x-bff-cache` headers.
Test hello-cached endpoint
Execute this command:
curl -i http://localhost:3010/examples/hello-cached
Observe how after the first request, subsequent requests are served from the cache for 10 seconds. After that, you should experience the delay again. Additionally, the `cache-control` header should inform about the TTL so clients know when the resource is no longer considered fresh.

Credentials
Added some environment variables to control the Redis connection.
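For local development that might look like this (only `REDIS_HOST` is confirmed by this description; the other variable names are hypothetical):

```shell
# Hypothetical local configuration.
# Setting REDIS_HOST is what enables the Redis cache backend.
REDIS_HOST=localhost
REDIS_PORT=6379        # illustrative, not confirmed by the PR
REDIS_PASSWORD=secret  # illustrative, not confirmed by the PR
```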
Follow ups
After this PR we still have some work to do: