Archiconda / build-tools

Necessary build tools for the Archiconda distribution

Package build order

hmaarrfk opened this issue · comments

  • make <- mine just seems broken all the time. I really don't know how to build make correctly.....
  • perl
  • m4 -- problems with gnumake
  • autoconf
  • automake
  • xz
  • bzip2
  • zlib
  • expat
  • ncurses -- strip command is missing??? fixed by creating a fake link to strip
  • libtool -- same issue as m4
  • rhash -- same issue as m4
  • libuv (needs libtool)
  • help2man
  • flex -- I probably did this one wrong, it is really weird to test this one.
  • pkg-config
  • bison (needs flex) -- same problem as m4 with gnumake
    • 3.1 wouldn't build. needs version 3.2.4 conda-forge/bison-feedstock#15
    • 533 tests need to run. Takes about 10 mins on circleci, but about 49 mins on shippable.
  • ca-certificates
  • openssl
  • libssh2 !!! circular dependency, depend on ubuntu's
  • [ ] python 2.7 (krb5) <- why does krb5 need such a high level program.
  • [ ] krb5 (curl) <- this seems to be optional for curl
  • curl
  • cmake (needs libuv):
  • texinfo -- seems to need a c compiler (send issue upstream too)
  • libffi (needs texinfo)
  • readline (needs ncurses)
  • libedit (built on docker with qemu)
  • sqlite -- removed readline dependency in favor of libedit, nobody needs readline.
  • tk needs too many CDTs, just skip. jjhelmus, what repo did you have to include???
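The ncurses workaround above (faking a missing strip) can be sketched in a few shell lines. This is a hypothetical sketch, not the exact fix used: the prefix path and the cross-toolchain name are assumptions.

```shell
# Hypothetical sketch of the ncurses workaround: the install step calls
# `strip`, so if the build environment doesn't provide one, drop a fake
# link on $PATH pointing at the cross strip (or /bin/true as a no-op).
PREFIX_BIN="${BUILD_PREFIX:-/tmp/fake-prefix}/bin"
mkdir -p "$PREFIX_BIN"
ln -sf "$(command -v aarch64-linux-gnu-strip || command -v strip || echo /bin/true)" \
    "$PREFIX_BIN/strip"
export PATH="$PREFIX_BIN:$PATH"
```

After this, configure/make find a `strip` on `$PATH` and the install step no longer dies; the /bin/true fallback just makes stripping a no-op.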

Now we get to build python

  • python 3.7
    • removed xorg cdt and tk
      • skip tk tests in run tests.
    • need to disable optimizations because compilation takes too long and I don't feel like waiting.
      • Compilation now takes about 20 mins
    • remember to remove readline from tests too
      • Potentially, we can include the readline module during compilation, ignore the run exports, and include it as a test dependency. People should really use ipython if they need readline-like functionality. Prompt-toolkit is much better....
    • rebuild with optimizations (build number: 1001 https://app.shippable.com/github/Archiconda/python-feedstock/runs/27/1/console)
  • python 3.6 (remove xorg cdt and tk)
  • python 2.7 (remove xorg cdt and tk)
    • make sure to disable test for the command idle which requires tk
    • doesn't include runtime optimizations
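The optimization toggle above comes down to CPython's configure flags. A hedged sketch follows: the flag names (`--enable-optimizations`, `--with-lto`) are CPython's real ones, but the surrounding logic is illustrative, not the feedstock's actual build.sh, and the configure line is printed as a dry run rather than executed.

```shell
# Illustrative only: the PGO/LTO pass is what blows the CI time budget
# under qemu, so the first build drops it and a later rebuild (build
# number 1001) restores it.
OPTFLAGS=""
if [ "${ENABLE_PGO:-no}" = "yes" ]; then
    OPTFLAGS="--enable-optimizations --with-lto"
fi
# dry run: print the configure invocation instead of running it
echo ./configure --prefix="${PREFIX:-/opt/python}" $OPTFLAGS
```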

Now we get to start building the python ecosystem.

Because python takes so long to build, I'm building a few packages on docker, but I only have python 3.7, so these will need to be rebuilt.
These next 4 packages need:

conda config --set add_pip_as_python_dependency false
  • certifi
  • setuptools
  • wheel (requires setuptools)
  • pip
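The reason for that config flag: conda normally injects pip as a dependency of python, which makes pip/setuptools/wheel unbuildable before the python ecosystem exists. A dry-run sketch of the bootstrap loop (the feedstock directory names are assumptions; drop the `echo`s to actually run the commands):

```shell
# Break the python <-> pip cycle, then build the four bootstrap packages
# in dependency order. Printed as a dry run; feedstock paths are
# illustrative, not the real repository layout.
BOOTSTRAP_ORDER="certifi setuptools wheel pip"

echo conda config --set add_pip_as_python_dependency false
for pkg in $BOOTSTRAP_ORDER; do
    echo conda build "${pkg}-feedstock/recipe"
done
```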

I think at this point the circular dependency is over.

These can be built together

  • six
  • enum34 # [py<34]
  • futures >=3.0.0 # [py<34]
  • asn1crypto
  • pycosat >=0.6.3
  • idna

Packages in bold had to be made arch-dependent..... yeah, it didn't last long until I wrote a script that just fixed the recipes I needed. Of course, the script doesn't keep track of what it patched, so who knows which packages are necessary for a minimal distribution.

  • pycparser <- had to upload 3.7 using qemu
  • cffi <- was getting built before pycparser, so I had to build it on qemu before other packages started to fail.
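The recipe-fixing script mentioned above isn't shown in the thread; a minimal sketch of the core idea (strip `noarch: python` from meta.yaml so the package builds per-arch) might look like this. The demo recipe fragment and the sed pattern are assumptions, not the author's actual script.

```shell
# Demo recipe fragment (stand-in for a real feedstock's meta.yaml)
cat > meta.yaml <<'EOF'
build:
  noarch: python
  number: 0
EOF

# Make the package arch-dependent by deleting the noarch line in place.
# The .bak suffix keeps a backup and makes the -i flag portable to BSD sed.
sed -i.bak '/^[[:space:]]*noarch:[[:space:]]*python[[:space:]]*$/d' meta.yaml
```

A real script would also bump the build number and push the branch for re-rendering; none of that bookkeeping is shown here.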

Cryptography is annoying since it runs pytest....

  • iso8601
  • pretend
  • pytz
  • cryptography-vectors
    • must be built with conda-build != 3.16.0
    • This one was failing for me on qemu ..... hopefully the builds can continue with jjhelmus' packages
    • make sure this is the same version as cryptography. on conda forge, cryptography is not bleeding edge.
  • pyopenssl >=16.2.0
  • setuptools_scm
  • pytest-runner -- can be noarch, need to disable tests because no pytest now.
  • chardet >=3.0.2,<3.1.0
  • pysocks
  • ipaddress
pytest dependencies are too much ...

Now for the pytest

  • atomicwrites <- circular dependency with pytest
  • attrs <---- I didn't go through the dependencies of this one....
  • funcsigs # py27
  • more-itertools >=4.0
  • scandir # [py27] <- for pathlib2
  • pathlib2 >=2.2.0 # [py<36]
  • pluggy >=0.7
    • [ ] py >=1.5.0
  • pytest
  • cryptography
    • disabled tests due to pytest

These need cryptography

  • urllib3 >=1.21.1,<1.25
  • requests >=2.12.4,<3
  • yaml (c package, requires libtool)
  • ruamel_yaml
    • requires pytest
  • conda
    • Skip the tests because jjhelmus says they require defaults + x86
    • conda-env >=2.6 <- just remove this dependency.

Other notes:
I mostly killed noarch packages because I just wanted the user to be able to install conda-build and anaconda-client, rather than to ship them in a single executable.

Other challenges I had

  • zstd: Needs CMAKE_AR Archiconda/zstd-feedstock@3b6d1bd

  • libarchive: is tricky, make sure to build its dependencies first. All of them, using the same version of conda-build. Specifically, if it finds any of the .la files it thinks it needs, it will try to statically link everything.
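Both fixes can be sketched in shell. The prefix path is a placeholder, the cmake line is printed as a dry run, and `$AR` is assumed to be the compiler-activation archiver variable:

```shell
# zstd: tell CMake which archiver to use explicitly, since cross setups
# often leave CMAKE_AR undetected. Printed here rather than executed.
echo cmake -DCMAKE_AR="$(command -v "${AR:-ar}" || echo ar)" .

# libarchive: delete stray libtool .la files from the prefix before
# building, otherwise libtool finds them and tries to link statically.
PREFIX="${PREFIX:-/tmp/demo-prefix}"
mkdir -p "$PREFIX/lib"
find "$PREFIX/lib" -name '*.la' -delete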

@jjhelmus do you have a good list of the order to follow?
I've already run into many issues with make, perl, and m4, so I'm not sure how much I can automate this process.

When make is found to be necessary, document it here:
conda-forge/docker-images#78

The order that I build the bootstrap packages can be found in the conda4aarch64 repository. I'll update the order as I rebuild the packages in a CentOS 7 container.

I'll be uploading the packages from this build out to the c4aarch64 channel. GCC 7.3.0 compilers are available there that run on CentOS 7. This should be preferred over the ones in the aarch64_bootstrap label.

Are you cross compiling? I had major issues with GNUMakefile

I'm building inside a Docker container running on a baremetal aarch64 machine. Recipes are in the conda4aarch64 repository if you want to compare.

Is there a way to sync our efforts?

My goal is to follow conda-forge really closely, eventually cleaning up my changes and submitting them back.

Will you be doing a graphical stack?

Yeah, I have the same m4 recipe, it is something weird about either my ubuntu image, or the specific version of crosstools I have :/

Do I need to rebuild all my packages with the CentOS 7 compiler in https://anaconda.org/c4aarch64/repo, or should they be compatible with the ones that worked only on ubuntu 18.04?

My plan was to build everything needed for conda, conda-build, constructor and anaconda-client. I want to submit the modification back to conda-forge but working in a single repo initially is easier.

> Do I need to rebuild all my packages with the CentOS 7 compiler in https://anaconda.org/c4aarch64/repo, or should they be compatible with the ones that worked only on ubuntu 18.04?

The existing packages should be alright, but switching to a CentOS 7 base image without compilers or other build tools would be good. I'll add the image I'm working from to conda4aarch64.

Ok, I'll switch, I just have to change my mods to conda-smithy. Maybe that will fix my m4 issues. If those m4 issues get fixed, then I think we have a good case to seriously start pushing some patches.

I've got the README and some basic rerendering down. I'm trying to see if I'm going to need to include the arch in the .ci_support files. I think it would be prudent to, but it will cause the names to explode.

https://github.com/jjhelmus/conda4aarch64/tree/master/docker/pkg_builder has the Dockerfile I'm using for builds, I installed an aarch64 miniconda clone that I have locally.

How did you build that aarch64 miniconda? Using the previous bootstrap tags you had? If that is the case, then it is the same as Archiconda3 I think.

Yes, the miniconda I used had packages from the bootstrap, Archiconda3 should be the same.

Did you figure out if shippable has an automated method for registering repositories? Or is that still a manual step?

I just screen scraped them because otherwise I would have to pay for their API

It takes about 10-20 seconds per repo. The code really isn’t robust. It assumes the user knows how to hack in python.

So I run that fork-repo script with the names of the packages I want, and it does everything: fork, creating the branch, re-rendering, and push.

Unfortunately, many times I’ve had to add make to the requirements.

It can take many repos in parallel, though I haven’t used that much since I’ve hit many packages that require some kind of mod.

Finally, as you know, this early in the process there is only one thing to build at a time

Well, I started a bunch of jobs that might keep shippable busy for a day or two, only 1 aarch64 job is allowed at a time.

https://app.shippable.com/subs/github/Archiconda/dashboard

Python 3.7 might be cheating since technically it can fall back on jjhelmus' bootstrap channel for a missing dependency, but that channel definitely doesn't have python 2.7 or 3.6. I don't expect python 2.7 to get very far due to the possibility of some things depending on python 2 stuff that Jonathan didn't include in his list.

There is also the possibility of depending on some CDT package, which would make things fail.

I wish I could include all of conda-forge, but I don't think I can add any channels, since I think this particular version of conda was bugged and wouldn't let you add more than 1 channel. Apparently I had built conda out of order before and uploaded a package. Seems like Archiconda3 0.1.2 can add multiple channels.... now I only wish I hadn't rendered so many recipes. Oh well, shippable will get some work done!

Optimized python builds for 3.6 and 3.7 are queued after Jonathan's list. Those are probably guaranteed to time out.

And we are going to be stuck anyway because

  1. conda-build bug (pretty sure this is what is happening to cryptography vectors???) and conda/conda-build#3195
  2. lief isn't packaged on conda-forge conda-forge/staged-recipes#7383 so conda build 3.17 can't be built.

I built conda-build 3.16.3 quickly to fix 1; I think I'll just try to build that one for all condas.