Race condition for cache invalidation
MarshallOfSound opened this issue
Refs: https://github.com/electron/update-server/blob/master/index.js#L81
Imagine this scenario, where R{X} denotes request number {X}: R1 is the first request, R2 is the second request, and so on.
- R1: Cache miss, start github API request
- R2: Cache miss, start github API request
- R1: Github API responds
- R3: Cache hit
- R2: Github API responds
- R4: Cache hit
Notice that because R2 arrived after R1's GitHub API call started but before it returned, a second GitHub API call was made for the same data. We probably need to use locks in Redis so that this sequence works like:
- R1: Cache miss, start github API request
- R2: Cache miss, request locked, waiting
- R1: Github API responds
- R2: request unlocked, Cache hit
- R3: Cache hit
- R4: Cache hit
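The sequence above can be sketched as request coalescing: concurrent cache misses for the same key share one in-flight upstream request instead of each calling the GitHub API. This is a minimal single-process sketch only; a deployment with multiple server instances would need a distributed lock (e.g. redlock over Redis) to get the same guarantee across processes. `fetchFromGitHub` is a hypothetical stand-in for the real API call.

```javascript
const cache = new Map();      // key -> cached value
const inFlight = new Map();   // key -> pending Promise for a request in progress

async function getRelease(key, fetchFromGitHub) {
  // R3, R4: cache hit, return immediately
  if (cache.has(key)) return cache.get(key);

  // R2: a request for this key is already in flight, wait on it
  if (inFlight.has(key)) return inFlight.get(key);

  // R1: cache miss, start the upstream request and record it as in flight
  const promise = fetchFromGitHub(key)
    .then((value) => {
      cache.set(key, value);
      return value;
    })
    .finally(() => inFlight.delete(key));

  inFlight.set(key, promise);
  return promise;
}
```

With this in place, any number of simultaneous misses for one key result in exactly one upstream call; the in-process maps would be replaced by Redis reads and a Redis-held lock in the multi-instance case.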
For popular apps this race condition will be hit quite easily; for example, the GPMDP update server processes approximately 5 requests per second, 24/7.
Thank you for reporting this! I added a Redis lock as suggested (redlock); could you give this another look please?