I'm having a hard time understanding jitter as it relates to my other params. According to `Base`, `DEFAULT_JITTER = False`, and it doesn't seem like `DEFAULT_JITTER` is ever updated when I create a new `Limiter()`. I know I can set it as an arg with `Limiter(jitter=X)`.
I also saw in `Base` the definition for `_get_sleep_duration`:

```python
def _get_sleep_duration(
    consume: Tokens,
    tokens: Tokens,
    rate: Tokens,
    jitter: Jitter = DEFAULT_JITTER,
    units: UnitsInSecond = MS_IN_SEC
) -> Duration:
    """Increase contention by adding jitter to sleep duration"""
    duration: Duration = (consume - tokens) / rate

    match jitter:
        case int() | float():
            return duration - jitter

        case bool() if jitter:
            amount: Duration = random() / units
            return duration - amount

    return duration
```
It's clear that `jitter` can be a `bool`, `int`, or `float`, but I don't really understand how I should size it with respect to my `consume` or `rate` values. I don't want to add so much jitter that it overpowers my sleep duration and I end up with a negative number.
Let's say I have these params:

```python
rate = 20       # replenish 20 tokens per second
consume = 1     # because I want my function to be allowed to run
                # a maximum of 20 times per second
capacity = 300  # because I want my function to be able to run for
                # 15 seconds maximum before it gets limited (function
                # can't get called more than 300 times per minute max)
```
But according to the `_get_sleep_duration` calculation, my `duration = (1 - 300) / 20 = -14.95`, so I'm already at a negative number before any jitter is applied.
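To double-check that arithmetic, here's the same `(consume - tokens) / rate` formula with my numbers plugged in (a standalone sketch, assuming `tokens` means my full capacity of 300):

```python
consume = 1
tokens = 300  # assumption: tokens starts at my full capacity
rate = 20

# Same math as the first line of _get_sleep_duration
duration = (consume - tokens) / rate
print(duration)  # -14.95
```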
- Am I thinking about my params correctly?
- What's the right way to add jitter as a param?