ScoopInstaller / Excavator

🕳️ This container runs the updating services for all scoop manifest repos (deprecated)


The remote server returned an error: (429) Too Many Requests.

rasa opened this issue

The extras log is showing

The remote server returned an error: (429) Too Many Requests.

when hitting SourceForge. Perhaps we should add a rate limiter/exponential backoff function?
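
For illustration, here is a minimal sketch of what an exponential backoff wrapper could look like in PowerShell. Invoke-WithBackoff and the example URL are hypothetical, not existing Excavator code; the delay doubles after each 429 until a retry cap is hit.

# Retry an action, with exponentially growing delays after 429 responses.
function Invoke-WithBackoff {
    param(
        [scriptblock] $Action,
        [int] $MaxRetries = 5,
        [int] $InitialDelaySec = 2
    )
    $delay = $InitialDelaySec
    for ($i = 0; $i -le $MaxRetries; $i++) {
        try {
            return & $Action
        }
        catch [System.Net.WebException] {
            $status = [int]$_.Exception.Response.StatusCode
            if ($status -ne 429 -or $i -eq $MaxRetries) { throw }
            Write-Host "429 received; retrying in $delay seconds..."
            Start-Sleep -Seconds $delay
            $delay *= 2   # exponential backoff
        }
    }
}

# Usage (hypothetical URL):
# Invoke-WithBackoff { Invoke-WebRequest 'https://downloads.sourceforge.net/...' }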

This is now also affecting GitHub releases.

https://scoop.r15.ch/main/mud-20200403-120001.log

Example code for using the Last-Modified or ETag headers to avoid hitting the API rate limit:

$ua = 'Scoop/1.0 (+http://scoop.sh/)'
$uri = 'https://api.github.com/repos/derailed/k9s/releases/latest'

$username = '<github username>'   # informational; the token alone authenticates
$token = "<personal access token with 'public_repo' scope>"

# Load the validators cached from the previous run (missing on first run)
$etag = Get-Content "$PSScriptRoot\etag.txt" -ErrorAction SilentlyContinue
Write-Host "Old: $etag"

$date = Get-Content "$PSScriptRoot\date.txt" -ErrorAction SilentlyContinue
Write-Host "Old: $date"

try {
    $request = [System.Net.HttpWebRequest]::Create($uri)
    # User-Agent and If-Modified-Since are restricted headers in .NET, so
    # they must be set via the dedicated properties, not Headers.Add()
    $request.UserAgent = $ua
    # $request.Headers.Add('If-None-Match', $etag)
    if ($date) { $request.IfModifiedSince = [DateTime]::Parse($date) }
    $request.Headers.Add('Authorization', "token $token")
    $request.AllowAutoRedirect = $false
    $response = $request.GetResponse()
}
catch [System.Net.WebException] {
    # GetResponse() throws on any non-2xx status, including 304 Not Modified;
    # rethrow anything that isn't a protocol-level response
    if ($null -eq $_.Exception.Response -or $_.Exception.Status -ne [System.Net.WebExceptionStatus]::ProtocolError) {
        throw
    }
    Write-Host $_.Exception.Response.StatusCode
    $response = $_.Exception.Response
}

# Cache the new validators for the next run
$etag = $response.Headers['ETag']
Write-Host "New: $etag"
$etag | Set-Content "$PSScriptRoot\etag.txt"

$date = $response.Headers['Last-Modified']
Write-Host "New: $date"
$date | Set-Content "$PSScriptRoot\date.txt"

$response.Close()
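
(Per GitHub's REST API documentation, a conditional request that comes back 304 Not Modified does not count against the rate limit, which is what makes caching the ETag/Last-Modified values worthwhile here.)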

As we’re seeing more and more 429 errors in https://scoop.r15.ch/main/mud-20200823-140001.log, we need to flesh out the work started in ScoopInstaller/Scoop#3912 and add rate-limiting logic when hitting github.com. Here’s how I did it in Python, and it seems to work well:
https://github.com/rasa/scoop-directory/blob/5cfcaabfd6387f422f7b49afe7978d63b8216a2e/maintenance/github-crawler.py#L376
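
For comparison, here is a rough PowerShell sketch of the same idea (not a port of the linked Python code): read GitHub's documented X-RateLimit-Remaining and X-RateLimit-Reset response headers and sleep until the quota resets when it runs low. Wait-GitHubRateLimit and the threshold are illustrative.

# Pause when GitHub's advertised remaining quota drops below a threshold.
function Wait-GitHubRateLimit {
    param(
        [System.Net.WebResponse] $Response,
        [int] $MinRemaining = 10
    )
    $remaining = $Response.Headers['X-RateLimit-Remaining']
    $reset     = $Response.Headers['X-RateLimit-Reset']
    if (-not $remaining -or -not $reset) { return }

    if ([int]$remaining -le $MinRemaining) {
        # X-RateLimit-Reset is a Unix timestamp (seconds, UTC)
        $resetAt = [DateTimeOffset]::FromUnixTimeSeconds([long]$reset).UtcDateTime
        $wait = ($resetAt - [DateTime]::UtcNow).TotalSeconds
        if ($wait -gt 0) {
            Write-Host "Rate limit nearly exhausted; sleeping until reset."
            Start-Sleep -Seconds ([int]$wait + 1)
        }
    }
}

# Usage: call after each API response, e.g.
# Wait-GitHubRateLimit -Response $response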

Again: The remote server returned an error: (429) Too Many Requests.

Yeah, this is a real issue.

Fixed