Add redis and in-memory cache #37

Merged: 17 commits into main, Jun 19, 2024
Conversation

anxolin
Contributor

@anxolin anxolin commented Jun 17, 2024

This PR adds our own caching logic.

Caching

First of all, it adds the Fastify caching plugin, which makes it easy to define cache-control headers.
It also adds a cache object to the Fastify instance, which allows getting/saving objects from/to an abstract cache.

This keeps the main caching logic independent of any particular cache implementation.
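To illustrate the idea of an abstract cache, here is a minimal sketch of a get/save store with TTL expiry. This is hypothetical illustration code, not the actual abstract-cache API the plugin uses:

```javascript
// Minimal sketch of a get/save cache abstraction with TTL expiry.
// Hypothetical: the real plugin delegates to an abstract-cache-compatible store.
class InMemoryCache {
  constructor() {
    this.store = new Map()
  }

  // Save a value under a key with a TTL in seconds.
  set(key, value, ttlSeconds) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 })
  }

  // Get a value, or null if it is missing or expired.
  get(key) {
    const entry = this.store.get(key)
    if (!entry) return null
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key)
      return null
    }
    return entry.value
  }
}
```

Because callers only see `get`/`set`, the backing store (in-memory, Redis, or anything else) can be swapped without touching the caching logic.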

Redis vs In-memory

The environment now allows specifying:
REDIS_ENABLED=true (false by default)

EDIT: it is now enabled when the REDIS_HOST environment variable is set.

When Redis is enabled, caching happens in Redis. Otherwise, an in-memory cache is used (not suitable for large applications in production).
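The backend selection described above can be sketched as a small helper. The function names are illustrative, not the PR's actual code; the selection rule (Redis when REDIS_HOST is set, in-memory otherwise) follows the description above:

```javascript
// Hypothetical sketch of choosing the cache backend from the environment.
// Per the PR description, Redis is enabled when REDIS_HOST is set.
function isRedisEnabled(env) {
  return Boolean(env.REDIS_HOST)
}

function selectCacheBackend(env) {
  return isRedisEnabled(env) ? 'redis' : 'in-memory'
}
```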

Redis integration

Additionally, I added the Redis plugin, which exposes the Redis client through the convenient fastify.redis object.

This gives us access to more advanced uses of Redis when implementing any of the services.

TODO: don't load it when REDIS_ENABLED=false

Bff cache

Additionally, I've written a plugin so we can have our own logic to automatically cache responses based on the cache-control directive.

It should be very easy to use now. Each endpoint defines its own caching policy using the cache-control directive.

Let's see it with these 2 endpoints:

```javascript
fastify.get('/hello', async function (request, reply) {
  await sleep(2000)
  reply.send({ hello: 'world' })
})

fastify.get('/hello-cached', async function (request, reply) {
  await sleep(2000)
  console.log('Setting cache header')
  reply.header(CACHE_CONTROL_HEADER, getCacheControlHeaderValue(10))
  reply.send({ hello: 'world' })
})
```

Both endpoints return the same data after 2 seconds. The difference is that /hello-cached returns a cache-control header. This header is observed by the bffCache plugin, which caches the response content for the instructed time.

Additionally, subsequent requests return the cached content, and the plugin keeps track of the remaining TTL at the time of each request so it can inform clients about it.
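Conceptually, the plugin's behavior boils down to: parse max-age out of the response's cache-control header and, if present, store the payload for that long. A simplified, hypothetical sketch (the real plugin hooks into Fastify's response lifecycle instead of being called directly):

```javascript
// Extract the max-age value (in seconds) from a cache-control header,
// or return null when the header carries no max-age directive.
function parseMaxAge(cacheControl) {
  const match = /max-age=(\d+)/.exec(cacheControl || '')
  return match ? Number(match[1]) : null
}

// Simplified sketch of the bffCache idea: cache the payload only when
// the response declares a max-age. Returns whether it was cached.
function maybeCacheResponse(url, headers, payload, store) {
  const maxAge = parseMaxAge(headers['cache-control'])
  if (maxAge === null) return false // no directive: do not cache
  store.set(url, { payload, storedAt: Date.now(), ttl: maxAge })
  return true
}
```

This matches the two endpoints above: /hello sets no header and is never cached, while /hello-cached declares max-age=10 and gets cached for 10 seconds.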

Test

Test hello endpoint

Execute several times:
curl -i http://localhost:3010/examples/hello

You should notice a 2-second delay in the response every single time.

Also, there are no cache-control or x-bff-cache headers.

Test hello-cached endpoint

Execute this command:
curl -i http://localhost:3010/examples/hello-cached

Observe how after the first request, subsequent requests are served from the cache for 10 seconds. After that, you should experience the delay again. Additionally, the cache-control header should inform about the remaining TTL so clients know when the resource is no longer considered fresh.
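The shrinking max-age reported to clients can be sketched as a tiny helper (illustrative names, not the PR's code): the remaining freshness is the original TTL minus the entry's age, floored at zero.

```javascript
// Hypothetical sketch: compute the max-age to report for a cached entry.
// storedAtMs/nowMs are millisecond timestamps; ttlSeconds is the original TTL.
function remainingMaxAge(storedAtMs, ttlSeconds, nowMs) {
  const elapsedSeconds = Math.floor((nowMs - storedAtMs) / 1000)
  return Math.max(ttlSeconds - elapsedSeconds, 0)
}
```

For example, an entry cached with max-age=10 would be reported as max-age=6 when requested 4 seconds later.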


Credentials

Added some environment variables to control the Redis connection:

REDIS_HOST=true
REDIS_PORT=6379
REDIS_USER=redis_user
REDIS_PASSWORD=redis_user

Follow ups

After this PR we still have some work to do:

  • Persistence
  • Schedule backup/dump of Redis
  • Infrastructure setup


vercel bot commented Jun 17, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

1 Ignored Deployment
Name Status Preview Updated (UTC)
bff_legacy ⬜️ Ignored (Inspect) Visit Preview Jun 19, 2024 6:42am

@anxolin anxolin marked this pull request as draft June 17, 2024 08:24

socket-security bot commented Jun 17, 2024

New and removed dependencies detected. Learn more about Socket for GitHub ↗︎

Package New capabilities Transitives Size Publisher
npm/@fastify/caching@8.3.0 None +7 133 kB matteo.collina
npm/@fastify/redis@6.2.0 Transitive: network +8 886 kB matteo.collina
npm/abstract-cache-redis@2.0.0 Transitive: network +10 478 kB jsumners
npm/fastify-caching@6.3.0 None +10 146 kB jsumners
npm/redis-errors@1.2.0 None 0 8.85 kB bridgear

🚮 Removed packages: npm/ieee754@1.2.1, npm/is-core-module@2.12.1, npm/path-parse@1.0.7

View full report↗︎

@anxolin anxolin requested a review from a team June 17, 2024 11:26
@anxolin anxolin marked this pull request as ready for review June 17, 2024 11:26
Contributor

@alfetopito alfetopito left a comment

A question about caching failed GET requests.

Also, what about adding a Redis instance to the docker compose for local development?

@anxolin anxolin requested a review from a team June 17, 2024 12:56
@anxolin anxolin requested a review from a team June 19, 2024 06:42
@anxolin anxolin merged commit 662d335 into main Jun 19, 2024
5 checks passed
@alfetopito alfetopito deleted the redis-cache branch June 19, 2024 12:41