composer / composer

Dependency Manager for PHP

Home Page: https://getcomposer.org/

zlib_decode(): data error

Ugoku opened this issue · comments

I realise this issue has been raised before (#3006, #3270), but no replies were given there and those reports are many months old. For a while now I've been getting the error mentioned in the issue title when doing a composer require xxx/yyy or composer install.
Stack trace from the "very verbose" mode is as follows:

  [ErrorException]
  zlib_decode(): data error



Exception trace:
 () at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Util/RemoteFilesystem.php:218
 Composer\Util\ErrorHandler::handle() at n/a:n/a
 zlib_decode() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Util/RemoteFilesystem.php:218
 Composer\Util\RemoteFilesystem->get() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Util/RemoteFilesystem.php:83
 Composer\Util\RemoteFilesystem->getContents() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Repository/ComposerRepository.php:587
 Composer\Repository\ComposerRepository->fetchFile() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Repository/ComposerRepository.php:296
 Composer\Repository\ComposerRepository->whatProvides() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/Pool.php:191
 Composer\DependencyResolver\Pool->computeWhatProvides() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/Pool.php:180
 Composer\DependencyResolver\Pool->whatProvides() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/RuleSetGenerator.php:221
 Composer\DependencyResolver\RuleSetGenerator->addRulesForPackage() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/RuleSetGenerator.php:293
 Composer\DependencyResolver\RuleSetGenerator->addRulesForJobs() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/RuleSetGenerator.php:333
 Composer\DependencyResolver\RuleSetGenerator->getRulesFor() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/Solver.php:172
 Composer\DependencyResolver\Solver->solve() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Installer.php:505
 Composer\Installer->doInstall() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Installer.php:230
 Composer\Installer->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Command/RequireCommand.php:158
 Composer\Command\RequireCommand->execute() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Command/Command.php:257
 Symfony\Component\Console\Command\Command->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:874
 Symfony\Component\Console\Application->doRunCommand() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:195
 Symfony\Component\Console\Application->doRun() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Console/Application.php:146
 Composer\Console\Application->doRun() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:126
 Symfony\Component\Console\Application->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Console/Application.php:82
 Composer\Console\Application->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/bin/composer:43
 require() at C:\ProgramData\ComposerSetup\bin\composer.phar:25

Any thoughts on why this happens and/or how to fix it?
I'm only getting this on one of my systems, and it worked fine in the past; I'm not sure what changed since. I have just reinstalled Composer, so I'm using the latest version. Specs of my system: Windows 8.1 x64, PHP 5.6.5 x64.

Edit: when I require a new dependency on another machine and then run composer install on my own system, that works just fine.

Edit 2: I just noticed Composer is connecting to http://packagist.org, despite having openssl enabled.

commented

Please share more useful information. What are you installing, for example (your composer.json), etc.?

It doesn't matter what I try to install; everything fails. But in today's case, I ran composer require knplabs/knp-snappy
My composer.json (stripped to its bare essentials, but still failing):

{
  "require": {
    "knplabs/knp-snappy": "^0.4.0",
    "h4cc/wkhtmltopdf-amd64": "0.12.0"
  },
  "autoload": {
    "psr-4": {
      "Application\\": "Application"
    }
  }
}

commented

Can you try running composer config --global repositories.packagist.allow_ssl_downgrade false and then try to install or update again?

That composer.json works fine for me.

@Ugoku - installing with debug verbosity - composer install -vvv will output something like this:

Downloading https://api.github.com/repos/h4cc/wkhtmltopdf-amd64/zipball/ebaa56a8ba0ae70a147ee70d544abf743db290d3
    Downloading: 100%
Writing /Users/user/.composer/cache/files/h4cc/wkhtmltopdf-amd64/ebaa56a8ba0ae70a147ee70d544abf743db290d3.zip into cache

See if the URL is working or the zip file is valid.
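(If you want to check a cached archive directly, something along these lines works; a sketch only, requiring the zip extension. The path is the one from the cache line above and will differ per machine; on Windows it lives under the AppData Composer cache instead.)

<?php
// Sketch: verify a dist archive that Composer wrote into its cache.
$zip = new ZipArchive();
$status = $zip->open(
    '/Users/user/.composer/cache/files/h4cc/wkhtmltopdf-amd64/ebaa56a8ba0ae70a147ee70d544abf743db290d3.zip',
    ZipArchive::CHECKCONS // run the built-in consistency checks while opening
);
echo $status === true ? "zip OK\n" : "zip error code $status\n";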

commented

The difference is that you are on OS X and he is on Windows. Also, he is falling back to http while you are using https, which is why I am asking him to try my config change.

Why would the OS matter when you're talking URLs and files? How do you know he's using http?

commented

The OS difference should not matter, but "should not" and "does not" are not exactly the same, and there is no guarantee. It is, after all, Windows.

As for how I know that he is using http (not https): because he said so.

@alcohol: after changing that config setting and running composer update -vvv again, I get exactly the same error as in my opening post :(

In the verbose output, it says this (in order):

Downloading https://packagist.org/packages.json
(some "Reading xxx from cache" stripped)
Reading C:/Users/Sander/AppData/Local/Composer/repo/https---packagist.org/provider-symfony$process.json from cache
Reading C:/Users/Sander/AppData/Local/Composer/repo/https---packagist.org/provider-cilex$cilex.json from cache
Reading C:/Users/Sander/AppData/Local/Composer/repo/https---packagist.org/provider-erusev$parsedown.json from cache
Reading C:/Users/Sander/AppData/Local/Composer/repo/https---packagist.org/provider-monolog$monolog.json from cache
Downloading http://packagist.org/p/symfony/config$b1c50ac649cbb1d1952965df19066f264a56e227694a6105e42a853499b64e77.json
Downloading http://packagist.org/p/symfony/config$b1c50ac649cbb1d1952965df19066f264a56e227694a6105e42a853499b64e77.json
Downloading http://packagist.org/p/symfony/config$b1c50ac649cbb1d1952965df19066f264a56e227694a6105e42a853499b64e77.json
Reading C:/Users/Sander/AppData/Local/Composer/repo/https---packagist.org/provider-symfony$config.json from cache
Reading C:/Users/Sander/AppData/Local/Composer/repo/https---packagist.org/provider-symfony$config.json from cache
zlib_decode(): data error
http://packagist.org could not be fully loaded, package information was loaded from the local cache and may be out of date
(some more Reading xxx from cache)
Downloading http://packagist.org/p/zendframework/zend-cache$7aff615084f7758fef3e699ffeacd289c0f39b13a27577645960c13b94955d14.json
(Bunch of Downloading/Reading lines stripped)
Downloading http://packagist.org/p/guzzle/batch$49f7767b8e6b6c1211f57539881d28350e506bae2c4c21df5e22be9a4838cf30.json



  [Composer\Downloader\TransportException]
  The "http://packagist.org/p/guzzle/batch$49f7767b8e6b6c1211f57539881d28350e506bae2c4c21df5e22be9a4838cf30.json" fil
  e could not be downloaded: failed to open stream: HTTP request failed!



Exception trace:
 () at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Util/RemoteFilesystem.php:267
 Composer\Util\RemoteFilesystem->get() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Util/RemoteFilesystem.php:83
 Composer\Util\RemoteFilesystem->getContents() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Repository/ComposerRepository.php:587
 Composer\Repository\ComposerRepository->fetchFile() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Repository/ComposerRepository.php:296
 Composer\Repository\ComposerRepository->whatProvides() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/Pool.php:191
 Composer\DependencyResolver\Pool->computeWhatProvides() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/Pool.php:180
 Composer\DependencyResolver\Pool->whatProvides() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/RuleSetGenerator.php:221
 Composer\DependencyResolver\RuleSetGenerator->addRulesForPackage() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/RuleSetGenerator.php:330
 Composer\DependencyResolver\RuleSetGenerator->getRulesFor() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/DependencyResolver/Solver.php:172
 Composer\DependencyResolver\Solver->solve() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Installer.php:505
 Composer\Installer->doInstall() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Installer.php:230
 Composer\Installer->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Command/UpdateCommand.php:140
 Composer\Command\UpdateCommand->execute() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Command/Command.php:257
 Symfony\Component\Console\Command\Command->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:874
 Symfony\Component\Console\Application->doRunCommand() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:195
 Symfony\Component\Console\Application->doRun() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Console/Application.php:146
 Composer\Console\Application->doRun() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:126
 Symfony\Component\Console\Application->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/src/Composer/Console/Application.php:82
 Composer\Console\Application->run() at phar://C:/ProgramData/ComposerSetup/bin/composer.phar/bin/composer:43
 require() at C:\ProgramData\ComposerSetup\bin\composer.phar:25

After which Composer exits.

Notice the first line says "Downloading https://packagist.org/packages.json" and subsequent cache readings use the repo/https---packagist.org folder, but then it switches to "Downloading http://packagist.org/p/symfony/config$b1c50ac649cbb1d1952965df19066f264a56e227694a6105e42a853499b64e77.json"

@nevvermind: To clarify my HTTP/HTTPS statement: when I run composer update (NOT in verbose mode), the error is "http://packagist.org could not be fully loaded, package information was loaded from the local cache and may be out of date". Looking in the Composer source, this is not a hardcoded error but really is the URL that is used.

If I disable the openssl extension for PHP and remove the allow_ssl_downgrade part from config.json, I get the same error as described in my opening post.

commented

Weird. Did you try running the command after doing a composer clear-cache?

I had not, but have now.
It's downloading a bunch of provider*.json files over HTTPS, until it gets to http://packagist.org/p/sebastian/global-state$b53387feba200d8b5e8edceb38f7ee998e77dcd375fcc64bf23bc511961af794.json which, as you can see, is loaded over HTTP. All files after that are downloaded over HTTP until it stops at http://packagist.org/p/symfony/finder$5faa85a87ad45dd50d43d33c64a3e2088fddc9373b2b582289bbc99fe95b3495.json to give the zlib error.

@Ugoku do you have a network proxy or something that could modify the http connections in some way? Because that could cause the zlib failure if the response body is fiddled with and it isn't valid gzip anymore.

Not that I know of... :P it works fine for my colleague on the same network, so there's no network proxy that does that. I also don't have the problem with any other program/website; it's only Composer.

Just tried this:
Running curl -H "Accept-Encoding: gzip" -I http://packagist.org/p/symfony/finder$5faa85a87ad45dd50d43d33c64a3e2088fddc9373b2b582289bbc99fe95b3495.json gives a 302 Found with a redirect to the HTTPS equivalent, which in turn gives a 404 Not Found.
I'm guessing that has something to do with it?

You get that redirect because the file doesn't exist anymore, so it hits symfony and symfony forces https and redirects, but then you get a 404 if you follow that redirect.

Could you install composer from source and check what's going on in the RemoteFilesystem when it fails with that zlib error? It's kinda hard for us to work on this if we can't reproduce it :/

Sorry, what do you mean by "install composer from source" ?

commented
cd /tmp
git clone git@github.com:composer/composer.git
cd composer
curl -sS https://getcomposer.org/installer | php
php composer.phar install
cd ..
/tmp/composer/bin/composer require -vvv xxx/yyy

You can modify the actual source files in /tmp/composer/src now and add more debugging statements wherever you require them. Every time you call /tmp/composer/bin/composer it will execute Composer with those modified source files, instead of running from a Phar binary.

is there a conclusion yet? I had the exact same problem here, which is killing me...~~~ help!

Thanks @alcohol, I had to tweak it a bit (I'm on Windows :) ) to get it running, I generated composer.bat in the composer/bin folder with the following content:

@ECHO OFF
php "%~dp0composer" %*

Going to debug now, I'll keep you updated.

-edit-
Okay, first thing I noticed:

[ErrorException]
Undefined variable: PHP_VERSION_ID

Which is quite odd, but not the source of the problem.

If I write $result to a file (before zlib_decode on line 218) it has the correct gzip header (0x1F, 0x8B) but decompressing the file (using PHP's zlib_decode, WinRAR, or 7-Zip) does not work. WinRAR and 7-Zip report the uncompressed file size as 2.504.067.759 bytes (2.5 GB!) whereas the actual file (https://packagist.org/p/symfony/finder$577ee87ebd6ba36e3412bda5920ca2713d3265429fd8523f0c1704025f9abb89.json, downloaded in a browser) is only 683 KB.

Would it help if I gave you the gzip file?
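(For anyone wanting to repeat this debug step: the patch described above amounts to something like the following, inserted just before the existing zlib_decode() call in src/Composer/Util/RemoteFilesystem.php, around line 218 of the trace. The dump path and filename are hypothetical.)

// $result already holds the raw, still-compressed response body at this point,
// as described above; dump it to a uniquely named file for later inspection.
file_put_contents(sys_get_temp_dir() . '/composer-body-' . uniqid() . '.gz', $result);
echo 'Dumped ' . strlen($result) . " compressed bytes\n";
// ...followed by the existing zlib_decode($result) call, which is what raises
// the "data error" when the body is truncated.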

commented

That is weird. http://php.net/phpversion mentions nothing about that constant being OS dependent or anything. Merely that PHP_VERSION_ID is available as of PHP 5.2.7.

@woodytang and I had the same issue on a Mac, and I've tested it on a Vagrant machine: same problem.

{
    "name": "laravel/laravel",
    "description": "The Laravel Framework.",
    "keywords": ["framework", "laravel"],
    "license": "MIT",
    "type": "project",
    "require": {
        "laravel/framework": "5.0.*",
        "xuma/laravel-amaran": "~1.0@dev",
        "baum/baum": "~1.1",
        "laracasts/presenter": "0.1.*",
        "nicolaslopezj/searchable": "1.*",
        "jenssegers/date": "~2.0",
        "guzzlehttp/guzzle": "6.0.1",
        "symfony/dom-crawler": "3.0.*@dev",
        "symfony/css-selector": "3.0.*@dev"



    },
    "require-dev": {
        "phpunit/phpunit": "~4.0",
        "phpspec/phpspec": "~2.1",
        "laracasts/generators": "dev-master",
        "barryvdh/laravel-ide-helper": "~2.0",
        "fzaninotto/faker": "1.5.*@dev",
        "codesleeve/laravel-stapler": "dev-master",
        "offline/persistent-settings": "dev-master",
        "way/generators": "dev-feature/laravel-five-stable"
    },
    "autoload": {
        "classmap": [
            "database"
        ],
        "psr-4": {
            "App\\": "app/"
        }
    },
    "autoload-dev": {
        "classmap": [
            "tests/TestCase.php"
        ]
    },
    "scripts": {
        "post-install-cmd": [
            "php artisan clear-compiled",
            "php artisan optimize"
        ],
        "post-update-cmd": [
            "php artisan clear-compiled",
            "php artisan ide-helper:generate",
            "php artisan optimize"
        ],
        "post-create-project-cmd": [
            "php -r \"copy('.env.example', '.env');\"",
            "php artisan key:generate"
        ]
    },
    "config": {
        "preferred-install": "dist"
    },

    "repositories": [
        {
            "packagist": false
        },
        {
            "type": "composer",
            "url": "http://packagist.cn"
        }
    ]

}

I've run it without the repositories part and it seemed to work fine. @woodytang tried it again with the repositories part and it worked too...

Maybe a problem with the server connection or something?

@Ugoku it'd be interesting to get the file yes. Then I can also download it here and see if there is any difference.

@Ugoku ok so I compared yours and my cleanly downloaded one with curl -H 'Accept-Encoding:gzip' https://... - yours is exactly the same but truncated at 30377 bytes while mine is 41493 bytes.

The question is: why would it be truncated without the file_get_contents call failing?

Did some more testing, every time Composer hits the zlib_decode code block, I output the URL and write the contents to a file with a unique filename. Result:

http://packagist.org/p/provider-2015-04$299ad23cd43cfbd7b6750ca1946f5d32297ca20ddefe30f59288b45aa0ddde0a.json
http://packagist.org/p/provider-latest$c902af54b316e276d0f67ae7397acbd9f95e297085a53786c7379559fae2bedd.json
http://packagist.org/p/symfony/finder$32c5233e919bcf365383bc5f656630fe7291654fcba9a8da89f2df5545b93194.json
http://packagist.org/p/symfony/finder$32c5233e919bcf365383bc5f656630fe7291654fcba9a8da89f2df5545b93194.json
http://packagist.org/p/symfony/finder$32c5233e919bcf365383bc5f656630fe7291654fcba9a8da89f2df5545b93194.json

It downloads the finder file 3 times; is that normal? All of them are corrupt and of different sizes (23085 bytes for the first, 18705 for the second, 20165 for the third).
At first glance in a hex editor, they all look identical, except for the file size of course.
The other files (provider-2015-04 and provider-latest) are larger in size, so it's not a size issue, and they decompress just fine.

That's because we retry 3 times in case of a failed download, so yes, when it starts going belly up it downloads the file again and again and then fails. They all look the same because they are the same, except they're truncated at different points, wherever the connection got dropped.

Could you print a timestamp on every request before/after the file_get_contents? nginx will close the connection after 20 seconds if the body isn't fully sent to the client. Apart from that, I can't think of what could go wrong on our end.
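(The timing instrumentation asked for here is just a couple of lines around the download call; a sketch, with $fileUrl and $context standing in for whatever URL and stream-context variables are in scope in RemoteFilesystem.)

$before = microtime(true);
$result = file_get_contents($fileUrl, false, $context); // the download that sometimes comes back truncated
$after  = microtime(true);
printf("%s\nBefore: %.4f\nAfter: %.4f\n\n", $fileUrl, $before, $after);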

Ah, thanks for the clarification on the 3 downloads. Here's the full log of all file_get_contents calls when I run it now, with before/after microtime output:

./composer.json
Before: 1434543152.6749
After: 1434543152.676

C:/Users/Sander/AppData/Roaming/Composer/composer.json
Before: 1434543152.9187
After: 1434543152.92

Loading composer repositories with package information
https://packagist.org/packages.json
Before: 1434543153.3466
After: 1434543154.6304

Updating dependencies (including require-dev)
http://packagist.org/p/provider-latest$df5bc15ec6faf00d160fe6d132679e7ffc29c7227d9d629716c32fe02fdf21c1.json
Before: 1434543154.9412
After: 1434543155.1317

http://packagist.org/p/symfony/console$4ffb81d50a0e96344723a85c7c48e85719b27528c97365855690209a0e94538c.json
Before: 1434543156.8442
After: 1434543156.9064

http://packagist.org/p/symfony/console$4ffb81d50a0e96344723a85c7c48e85719b27528c97365855690209a0e94538c.json
Before: 1434543157.0114
After: 1434543157.085

http://packagist.org/p/symfony/console$4ffb81d50a0e96344723a85c7c48e85719b27528c97365855690209a0e94538c.json
Before: 1434543157.1905
After: 1434543157.2729

They all download in < 0.1 second and the entire update takes less than 5 seconds, so nowhere near the 20 second timeout :(

commented

@Seldaek can you share your nginx configuration? Just out of curiosity.

Nope, nothing odd in the headers as far as I can see. First one works, second one fails:

http://packagist.org/p/provider-latest$cf9f4ef31fc663282a889059600faa4bb57f47fed42f41fee943c597028d1363.json
Array
(
[0] => HTTP/1.1 200 OK
[1] => Server: nginx
[2] => Date: Wed, 17 Jun 2015 12:33:10 GMT
[3] => Content-Type: application/json
[4] => Last-Modified: Wed, 17 Jun 2015 12:32:14 GMT
[5] => Connection: close
[6] => Vary: Accept-Encoding
[7] => ETag: W/"558168ce-a1e58"
[8] => Content-Encoding: gzip
)
http://packagist.org/p/symfony/console$4ffb81d50a0e96344723a85c7c48e85719b27528c97365855690209a0e94538c.json
Array
(
[0] => HTTP/1.1 200 OK
[1] => Server: nginx
[2] => Date: Wed, 17 Jun 2015 12:33:12 GMT
[3] => Content-Type: application/json
[4] => Last-Modified: Wed, 17 Jun 2015 08:12:16 GMT
[5] => Connection: close
[6] => Vary: Accept-Encoding
[7] => ETag: W/"55812be0-ad219"
[8] => Content-Encoding: gzip
)

Disabling gzip works fine :| no corrupted files anymore..... So while it's a lot slower, I can at least use Composer again!

commented

I think @Seldaek would have preferred you showed the headers of the file that gets repeated multiple times. You are now showing the headers for two different files.

@Ugoku the main difference with gzip is that it switches to chunked encoding and removes the Content-Length header, as it compresses the stream while sending it. So yeah, the Content-Length might help PHP know when to stop reading and not stop early for some obscure reason. Although it still doesn't explain why it works with some files and not with others.
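(A quick way to see that difference is to request the same URL once with and once without Accept-Encoding: gzip and compare the response headers; a sketch only, using packages.json since the provider URLs rotate.)

<?php
$url = 'http://packagist.org/packages.json';

foreach (['gzip', null] as $encoding) {
    $header = "Connection: close\r\n";
    if ($encoding !== null) {
        $header .= "Accept-Encoding: $encoding\r\n";
    }
    $context = stream_context_create([
        'http' => ['header' => $header, 'ignore_errors' => true],
    ]);

    file_get_contents($url, false, $context);

    // $http_response_header is filled in by the http stream wrapper after the call.
    echo 'Accept-Encoding: ' . ($encoding ?: 'none') . "\n";
    print_r($http_response_header);
    echo "\n";
    // With gzip the body is compressed while being sent, so no Content-Length is
    // announced; without it, a Content-Length header is normally present.
}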

@alcohol the second one is the file that gets repeated 3 times. Headers are identical for each time:
http://packagist.org/p/symfony/console$4ffb81d50a0e96344723a85c7c48e85719b27528c97365855690209a0e94538c.json
Array
(
[0] => HTTP/1.1 200 OK
[1] => Server: nginx
[2] => Date: Wed, 17 Jun 2015 12:33:12 GMT
[3] => Content-Type: application/json
[4] => Last-Modified: Wed, 17 Jun 2015 08:12:16 GMT
[5] => Connection: close
[6] => Vary: Accept-Encoding
[7] => ETag: W/"55812be0-ad219"
[8] => Content-Encoding: gzip
)
http://packagist.org/p/symfony/console$4ffb81d50a0e96344723a85c7c48e85719b27528c97365855690209a0e94538c.json
Array
(
[0] => HTTP/1.1 200 OK
[1] => Server: nginx
[2] => Date: Wed, 17 Jun 2015 12:33:12 GMT
[3] => Content-Type: application/json
[4] => Last-Modified: Wed, 17 Jun 2015 08:12:16 GMT
[5] => Connection: close
[6] => Vary: Accept-Encoding
[7] => ETag: W/"55812be0-ad219"
[8] => Content-Encoding: gzip
)
http://packagist.org/p/symfony/console$4ffb81d50a0e96344723a85c7c48e85719b27528c97365855690209a0e94538c.json
Array
(
[0] => HTTP/1.1 200 OK
[1] => Server: nginx
[2] => Date: Wed, 17 Jun 2015 12:33:12 GMT
[3] => Content-Type: application/json
[4] => Last-Modified: Wed, 17 Jun 2015 08:12:16 GMT
[5] => Connection: close
[6] => Vary: Accept-Encoding
[7] => ETag: W/"55812be0-ad219"
[8] => Content-Encoding: gzip
)

commented

It literally requests all 3 of them in the same second? Weird, I can't imagine why the request would terminate so suddenly and quickly.

Yes, indeed that's exactly what happens... in this comment above I used microtime before and after each file_get_contents and they all fail within 0.1 second (small files, 120 Mb connection, so that seems to make sense)

@Ugoku How about removing context? I mean: what happens if you create a separate script and just run file_get_contents? This way you can isolate your problem to the specific code that fails.

@frederikbosch without context you don't get gzipped data though..

@Seldaek Lol, I did not mean the stream context but the general composer application context. @Ugoku should be able to post a tiny script that exposes his problem. Then we are able to use that script to try to reproduce his problem.

At this stage we are guessing and giving suggestions based on thought. But there are too many external factors involved. By moving the failing lines to a separate script, these factors can be eliminated. Then we can solve a specific problem.

@Ugoku do you run some type of antivirus software or firewall?

Do you access the internet through a http proxy?

@staabm: yes, I am running both antivirus and firewall normally, but disabling them has no effect. I am not accessing internet through a proxy.

@frederikbosch: I will get to that today, I'll let you know.

@Ugoku I believe this gist should cover the context composer sets up.

Sorry, didn't get around to this yesterday. Thanks for the script @slbmeh, but when I use that I get a non-gzipped file back :/

I could try a clean install of WAMP and Composer, see if that helps...

@Ugoku It would be interesting to see and know why this fails. Could you share the gist that is returning the non-gzipped file? And maybe you could figure out which parameters have changed compared to the original script in composer.

Sure, anything I can do to help! :)

Here it is
I took @slbmeh's gist and changed it to the URL that was failing for me.

I also tried changing $options to match the options Composer uses, which are as follows:

Array
(
    [http] => Array
        (
            [protocol_version] => 1.1
            [header] => Array
                (
                    [0] => User-Agent: Composer/source (Windows NT; 6.2; PHP 5.6.5)
                    [1] => Accept-Encoding: gzip
                    [2] => Connection: close
                )
            [ignore_errors] => 1
        )
)

But that just does the same

@Ugoku That gist returns a 404 for me. Is the URL correct? Maybe it changed.

Yes, the URL appears to change every so often. I'm guessing everything between $ and .json is a hash for caching?

Current URL is http://packagist.org/p/symfony/finder$a4bc31ef5d391952af0e7202b072f6d587ae89623277f89aefad0f9d49a9bc2c.json

@Ugoku Try this script: https://gist.github.com/frederikbosch/27b0620e153d3d9b3116. The http header in the stream context must be a string. Tiny mistake by @slbmeh I guess. I also commented out the content type. Now at least you should have the correct headers. If I take strlen() of the content, it says int(41528). How about yours?

Nope: int(31837)... If I run the script a couple of times, the file size varies every time but the end result is the same: a corrupt file.
From stream_get_meta_data, $headers['eof'] is false and unread_bytes is > 0, if that helps.
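(For readers without access to the gists: a minimal sketch of the kind of test script being discussed here. It uses the URL and Composer-style headers quoted earlier in the thread and may differ from the real gists in detail; the URL has since rotated.)

<?php
$url = 'http://packagist.org/p/symfony/finder$a4bc31ef5d391952af0e7202b072f6d587ae89623277f89aefad0f9d49a9bc2c.json';

$context = stream_context_create([
    'http' => [
        'protocol_version' => 1.1,
        'header' => "User-Agent: Composer/source (Windows NT; 6.2; PHP 5.6.5)\r\n"
                  . "Accept-Encoding: gzip\r\n"
                  . "Connection: close\r\n",
        'ignore_errors' => true,
    ],
]);

$stream = fopen($url, 'rb', false, $context);
$before = stream_get_meta_data($stream);   // note 'unread_bytes' before reading the body

$content = stream_get_contents($stream);

$after = stream_get_meta_data($stream);    // note 'eof' and 'unread_bytes' after reading
fclose($stream);

var_dump($before['unread_bytes'], $after['eof'], strlen($content));
var_dump(strlen(zlib_decode($content)));   // raises "data error" when $content is truncated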

@Ugoku Interesting... could you share the exact output from this file https://gist.github.com/frederikbosch/af1e6225a56b6bbbe465? I am interested in the header results before and after fetching the stream contents.

commented

Weird, your 'unread_bytes' count (before the stream_get_contents() call) is completely different. Can you repeat the process several times and see if it is different for each iteration?

commented

Oh it is different for me too (7917), but consistently the same on multiple iterations.

No problem, here is Iteration / Before / After / Content size (fwiw)
1 / 1185 / 0 / 31837
2 / 1185 / 0 / 18705
3 / 1185 / 0 / 21625
4 / 1185 / 0 / 27457
5 / 1185 / 0 / 14325
So yeah, unread_bytes is consistent, but strlen($content) is not.

@Ugoku @alcohol Good to know. This might not even be an actual PHP problem. We should find out whether all data actually gets to your computer. @Ugoku Do you have some experience with a tcpdump variant on Windows? Maybe WinDump, or similar. In the past there was something like Ethereal, I believe it is now called Wireshark.

If the data does get to your computer, there is a problem getting the correct data to PHP. If not, there is a problem with the request to the packagist server (made by PHP) or with the packagist server's response.

Could you run the script again and enable the tcp inspection?

commented

Weird, the strlen($content) value is also consistent for me (41528).

Actually, I've just run the iteration a lot more times, and I did notice that unread_bytes sometimes fluctuates to a different value for me. But the end result is always the same. So even when unread_bytes fluctuated, the strlen($content) value remained identical.

/tmp $ tail -1 test.php
printf("%d:%d\n", $headersBeforeContents['unread_bytes'], strlen($content));
/tmp $ for i in {1..100}; do php test.php; done
7917:41528
7917:41528
7917:41528
7917:41528
7917:41528
7917:41528
7917:41528
2461:41528
7917:41528
7917:41528
7917:41528
7917:41528
7917:41528
7917:41528
7917:41528
7917:41528
5197:41528
2461:41528
5197:41528
7917:41528
7917:41528
7917:41528
..snipped..
7917:41528
7917:41528
1093:41528
1093:41528
7917:41528
7917:41528
7917:41528
7917:41528
2461:41528
7917:41528
7917:41528
5197:41528
7917:41528
7917:41528
7917:41528

@frederikbosch I've used Wireshark before though it's been a while. I can see traffic to and from 87.98.253.214 (packagist.org) but what should I be looking for exactly?

@alcohol I confirm your findings. I think it is definitely not the PHP script, so it is not a Composer bug. It is either network related, a PHP bug, or a webserver problem.

@Ugoku You should inspect the specific packets and see whether a complete JSON file is coming in.

Right, thanks. Yes, the line with 200 OK gives an "Uncompressed entity body" of 698634 bytes which is identical in both size and content to when I download it in the browser.

Glad to see it's not a Composer bug. Thanks so much for all the help!
Should I close the issue or should we wait until @BloomPhilippe from #4176 shares his information?

commented

Also interesting to try out is https://github.com/simsong/tcpflow

@Ugoku We are not there yet. It is maybe not a Composer bug, but maybe we can work around it. Since there are multiple people affected, we might come to a fix or a hack.

But for this hack, I wonder the following: why does his computer receive the correct data, yet PHP is not able to read it correctly? What could be wrong? Is it a bug in PHP? Are there any other possible explanations?

@Ugoku Just to be sure: you see the correct JSON file in Wireshark, right?

commented

For what it is worth, I did notice that for me, the connection handler arbitrarily picks IPv4 or IPv6 (since my laptop is also set up with IPv6 connectivity). This doesn't seem to affect the final result (for me), but figured it might be worth mentioning.

@frederikbosch Yes, Wireshark gives the correct JSON file and the exact same contents as the JSON file downloaded through a browser.

For what it's worth, I just installed PHP 5.6.10 (TS x64) and ran composer update again, didn't help.

@Ugoku @alcohol I wonder how many people are affected by this. If all Windows users were affected, we would have had many more problems. Alright, there have been multiple reports of Windows users affected, but the group is small, I believe. But it must be related to Windows, right? And since we can rule out the network, there must be something really specific about his PHP version in combination with Windows that is causing this problem.

And because he receives a correct JSON file, I cannot figure out how we can overcome this problem by creating a hack. Changing the request to packagist will not affect the problem.

If nobody can come up with another possibility, I think we should report a bug at php.net.

@Seldaek You are the pro here. What do you think?

My specs:

  • Windows 8.1 Home Premium Edition, AMD64 (Windows NT 6.3 build 9200)
  • PHP 5.6.5 (Build Date: Jan 21 2015 16:25:33, Compiler: MSVC11 (Visual C++ 2012)) thread safe / PHP 5.6.10 (Build Date: Jun 10 2015 15:53:53, Compiler: MSVC11 (Visual C++ 2012)) thread safe
  • Yes, it only happens when gzip is enabled.
  • AFAIK, I am not using IPv6.

I reported this on June 8; it began happening, I think, about a week before that. It definitely worked on the same machine a while before that.

I'll try your last suggestion, will report on that. I will also post my home specs when I get the chance.

@Ugoku Did you update anything on your computer related to PHP between June 1 and June 8?

@Ugoku also did you try with composer alpha10 for example? Just to rule out any recent changes. Sorry if you did already I can't read the whole backlog right now :)

@Ugoku You might also try to modify some headers in the script. Maybe you will get different results.

@frederikbosch I don't remember changing anything PHP related... definitely not PHP itself, that's been running since February.

@Seldaek no need to apologise :) no, I've been running the latest master for as long as I remember. I'll try with alpha10.

I've uploaded the original JSON file to https://dashboard.shore2ship.com/json.json (gzip is enabled) and when I run the script on that URL, it works fine (output here). I don't have access to an nginx server though; this is on an IIS 8 server (Azure). I can try on an Apache server too, if that helps.

@Ugoku in order to replicate consistently, you should use 'http' not 'https'.

commented

You can find the file here behind an nginx server:

http://dump.robbast.nl/finder$a4bc31ef5d391952af0e7202b072f6d587ae89623277f89aefad0f9d49a9bc2c.json

$ nginx -V
nginx version: nginx/1.8.0
built with OpenSSL 1.0.2a 19 Mar 2015 (running with OpenSSL 1.0.2c 12 Jun 2015)
TLS SNI support enabled
configure arguments: --prefix=/etc/nginx --conf-path=/etc/nginx/nginx.conf --sbin-path=/usr/bin/nginx --pid-path=/run/nginx.pid --lock-path=/run/lock/nginx.lock --user=http --group=http --http-log-path=/var/log/nginx/access.log --error-log-path=stderr --http-client-body-temp-path=/var/lib/nginx/client-body --http-proxy-temp-path=/var/lib/nginx/proxy --http-fastcgi-temp-path=/var/lib/nginx/fastcgi --http-scgi-temp-path=/var/lib/nginx/scgi --http-uwsgi-temp-path=/var/lib/nginx/uwsgi --with-imap --with-imap_ssl_module --with-ipv6 --with-pcre-jit --with-file-aio --with-http_dav_module --with-http_gunzip_module --with-http_gzip_static_module --with-http_realip_module --with-http_spdy_module --with-http_ssl_module --with-http_stub_status_module --with-http_addition_module --with-http_degradation_module --with-http_flv_module --with-http_mp4_module --with-http_secure_link_module --with-http_sub_module
commented

Odd, the gist script returns a much smaller strlen($content) on the one from my server. I simply grabbed it from Packagist using wget.

@alcohol Then the contents cannot be equal.

commented

They are though. I just checked.

@alcohol I mean the contents of $content must be different, because it tells you a different strlen.

commented

It's a significant difference though; 41528 from packagist.org and 27875 from my domain. Isn't $content supposed to reflect the json? Or what?

commented

Oh, maybe I use a different gzip configuration than packagist does:

    gzip on;
    gzip_vary on;
    gzip_min_length 10240;
    gzip_comp_level 6;
    gzip_http_version 1.1;
    gzip_proxied expired no-cache no-store private auth;
    gzip_types text/plain text/css text/xml text/javascript
        application/json application/x-javascript application/javascript
        application/xml application/xml+rss application/xhtml+xml application/rss+xml;

My server block;

    server {
        server_name dump.robbast.nl;
        access_log /var/log/nginx/robbast.nl.log custom;
        error_log /var/log/nginx/robbast.nl.err;
        root /srv/http/robbast.nl/dump;
        autoindex on;
        location = /robots.txt  { access_log off; log_not_found off; }
        location = /favicon.ico { access_log off; log_not_found off; }
    }

BTW for everyone, I copied that file somewhere to avoid it disappearing in the future, just so we can all work with the same file for tests: it is now available at http://packagist.org/_p/symfony/finder%24a4bc31ef5d391952af0e7202b072f6d587ae89623277f89aefad0f9d49a9bc2c.json

@alcohol maybe your server always sends gzipped? In a browser both responses look the same to me (packagist vs your server). Here is the gzip config I have:

    gzip on;
    gzip_comp_level 3;
    gzip_disable msie6;
    gzip_http_version 1.0;
    gzip_min_length 1400;
    gzip_proxied any;
    gzip_types text/plain application/json text/css application/x-javascript application/javascript text/xml applicatio$
    gzip_vary on;

The comp_level probably is the difference, and it's interesting that the result is so different; maybe I should bump that a bit.

@alcohol Could you compare the value $content after grabbing the json from your own server to that of packagist? I mean the Content-Length could be different, but the resulting value in $content must be the same! It should be the full json contents.

commented

Not really comparable. $content is the gzip output. @Seldaek just changed his compression level on Packagist to the same as mine and now I get the exact same content length for both my file and his file.

@alcohol Of course, my bad, but the central question still is: how can @Ugoku end up with a corrupted file while the full file went over his network?

commented

The problem is though that @Ugoku is not even receiving the full gzip output. His $content length is different every time he makes a request. If he could submit several requests to my server, I can check the logs to see if it sends the full file (I have bytes sent in my logs). Then we can determine if the client aborts the connection prematurely (if my server cannot send the full file), or if something else cuts off data at his end (if my server does send the full file).

@alcohol He already watched with Wireshark. He inspected his TCP packets. Wireshark reports the full file going over his network. But the response is not interpreted correctly by PHP.

commented

Well then it must definitely be a PHP issue, or a related lib it relies on.

@alcohol But since it worked for him in the past, there must be a specific setting that triggers this bug. If we want to report a PHP issue, we at least should find out what that is.

commented

It still seems very weird though that it just "suddenly" started to happen. Something must have changed in the environment that causes things to now be broken.

@alcohol Very true, and it must be in the very tiny script that we extracted from Composer, as he claims that there were no changes to his system.

The content-type sort was there because Composer moves it to the end due to a bug in PHP. I suppose it doesn't apply here.

It would be interesting to confirm that this can be reproduced on IPv4 and IPv6. I'm still not convinced that this isn't a bad gateway somewhere between your connection to your ISP and packagist.

What are the ISPs of the individuals affected? I know comcast just enabled ipv6 in my area at the beginning of the month.

@slbmeh The ISP is not important in my opinion. @Ugoku reports that Wireshark shows that he receives the file correctly. So there is no ISP problem. The packagist server sends correctly, the ISP delivers the file accordingly, but PHP truncates the file somehow.

Nonetheless, it is striking that it worked before. So something is triggering this behaviour, but we are not able to find it (yet).

The file @alcohol serves from robbast.nl works. The static file from @Seldaek on packagist.org does not, but I can see the full file in Wireshark. So yes, it's served correctly by the server and received correctly by my computer, but between there and PHP processing the file, something goes wrong.

As I said, I changed from PHP 5.6.5 to 5.6.10 yesterday but that didn't change anything. Disabling antivirus does not help.

-edit-
Whoah... I changed from ethernet to wifi (same network!) and now everything works again, composer update too. This is very consistent: if I turn off wifi and use the ethernet connection, it fails again.
This is definitely weird, but I can live with this now that I know how to fix it.

If you think this is something that Composer could work around, or should be submitted as a PHP issue, I am more than happy to try anything that might help. Otherwise, I could close the issue.

Replying as I have been asked about it on twitter.

If it works using one connection but fails with another, then it is definitely an OS configuration problem. Maybe one uses a proxy, firewall or something related.

I do not see how it could be a PHP bug, on Windows or otherwise.

@pierrejoye First of all, thanks a lot for providing help, much appreciated. Am I correct that you think it is a server problem on packagist.org? I think it is not a client OS configuration issue. Multiple issues have been filed, but all have been closed as duplicates of this one, so it already affects multiple people. I am not ruling that out completely, I just think it is not likely.

Let us assume you are right, and it is a server issue. What nginx configuration could cause the behaviour reported by @Ugoku? He receives the full file (Wireshark proves that), but it is truncated in PHP, and only by PHP on Windows?

Maybe @BloomPhilippe can confirm that behaviour. @BloomPhilippe If you use the script and download the file from both servers (at the top of the file), what do you see?

@BloomPhilippe Use this file to test both at the same time.

commented

I think @Seldaek already lowered his compression rate due to the hit on CPU it incurred, so you will receive different gzip content.