dgilland / pydash

The kitchen sink of Python utility libraries for doing "stuff" in a functional way. Based on the Lo-Dash Javascript library.

Home Page: http://pydash.readthedocs.io


pydash.debounce is throttling the function instead of debouncing it

theQuazz opened this issue

Expected Behaviour

When called, the debounced function invokes func only after wait milliseconds have elapsed.

Current Behaviour

When called, the debounced function invokes func immediately, not after wait milliseconds.

Example

import time
import pydash

t = time.time()
func = pydash.debounce(time.time, 10000)
print(func() - t) # 2.86102294921875e-05
time.sleep(5)
print(func() - t) # 2.86102294921875e-05
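
For comparison (my own addition, not part of the original report), calling pydash.throttle with the same arguments produces the same output pattern, which underlines that debounce is currently behaving like a throttle:

import time
import pydash

t = time.time()
throttled = pydash.throttle(time.time, 10000)
print(throttled() - t)  # executes immediately, same as debounce above
time.sleep(5)
print(throttled() - t)  # same value again: no new execution within the wait period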

Reason

As the code below, taken from the source, shows, there is no deferred execution. That is correct behaviour for a throttle function, but not for a debounce function, at least as defined by the namesake Lodash project.

class Debounce(object):
    """Wrap a function in a debounce context."""

    def __init__(self, func, wait, max_wait=False):
        self.func = func
        self.wait = wait
        self.max_wait = max_wait

        self.last_result = None

        # Initialize last_* times to be prior to the wait periods so that func
        # is primed to be executed on first call.
        self.last_call = pyd.now() - self.wait
        self.last_execution = pyd.now() - max_wait if pyd.is_number(max_wait) else None

    def __call__(self, *args, **kwargs):
        """
        Execute :attr:`func` if function hasn't been called within last :attr:`wait` milliseconds
        or in last :attr:`max_wait` milliseconds.

        Return results of last successful call.
        """
        present = pyd.now()

        if (present - self.last_call) >= self.wait or (
            self.max_wait and (present - self.last_execution) >= self.max_wait
        ):
            self.last_result = self.func(*args, **kwargs)
            self.last_execution = present

        self.last_call = present

        return self.last_result
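
For reference, a true (trailing-edge) debounce defers execution until the wait period has elapsed with no further calls. Below is a minimal sketch using threading.Timer; it is illustrative only, not pydash's implementation, and it does not return the function's result the way Debounce does:

import threading

def debounce(func, wait):
    """Trailing-edge debounce sketch: func runs only after `wait` ms
    have passed without another call. Not pydash's code."""
    timer = None
    lock = threading.Lock()

    def debounced(*args, **kwargs):
        nonlocal timer
        with lock:
            if timer is not None:
                timer.cancel()  # a newer call resets the wait period
            timer = threading.Timer(wait / 1000.0, func, args, kwargs)
            timer.start()

    return debounced

A full fix would also need to preserve pydash's existing behaviour of returning the last result and honouring max_wait.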

Thanks for reporting! I'm open to a PR to fix this if you are so inclined.