
python-wheels-manylinux-build's Introduction

⚠️ This action has been archived in favor of PyPA/cibuildwheel ⚠️


Python wheels manylinux build - GitHub Action


Build manylinux wheels for a (Cython) Python package.

This action uses the manylinux containers to build manylinux wheels for a (Cython) Python package. The wheels are placed in a new directory <package-path>/dist and can be uploaded to PyPI in the next step of your workflow.

This is a relatively simple and straightforward action. For more complicated use cases, check out PyPA/cibuildwheel.

Usage

Example

Minimal:

uses: RalfG/[email protected]
with:
  python-versions: 'cp310-cp310 cp311-cp311'

Using all arguments:

uses: RalfG/[email protected]_x86_64
with:
  python-versions: 'cp310-cp310 cp311-cp311'
  build-requirements: 'cython numpy'
  system-packages: 'lrzip-devel zlib-devel'
  pre-build-command: 'sh pre-build-script.sh'
  package-path: 'my_project'
  pip-wheel-args: '-w ./dist --no-deps'

See full_workflow_example.yml for a complete example that includes linting and uploading to PyPI.
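As a minimal sketch of such a workflow (job names, trigger, and secret name are illustrative assumptions; the upload step assumes twine and the default dist output directory):

```yaml
name: Build and publish wheels
on:
  release:
    types: [created]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build manylinux wheels
        uses: RalfG/[email protected]
        with:
          python-versions: 'cp310-cp310 cp311-cp311'
      - name: Publish to PyPI
        run: |
          pip install twine
          twine upload dist/*-manylinux*.whl
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
```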

Inputs

| name | description | required | default | example(s) |
| --- | --- | --- | --- | --- |
| python-versions | Python version tags (PEP 425 tags) for which to build wheels, as described in the manylinux image documentation; space-separated | required | 'cp37-cp37m cp38-cp38 cp39-cp39 cp310-cp310 cp311-cp311' | 'cp310-cp310 cp311-cp311' |
| build-requirements | Python (pip) packages required at build time; space-separated | optional | '' | 'cython' or 'cython==0.29.14' |
| system-packages | System (yum) packages required at build time; space-separated | optional | '' | 'lrzip-devel zlib-devel' |
| pre-build-command | Command to run before the build, e.g. a script that performs additional build-environment setup | optional | '' | 'sh pre-build-script.sh' |
| package-path | Path to the Python package to build (e.g. where setup.py is located), relative to the repository root | optional | '' | 'my_project' |
| pip-wheel-args | Extra arguments to pass to the pip wheel command (see the pip documentation); paths are relative to package-path | optional | '-w ./dist --no-deps' | '-w ./wheelhouse --no-deps --pre' |

Output

The action creates wheels, by default in the <package-path>/dist directory. The output path can be modified in the pip-wheel-args option with the -w argument. Be sure to upload only the *-manylinux*.whl wheels, as the non-audited (e.g. linux_x86_64) wheels are not accepted by PyPI.
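A follow-up step can therefore glob only the audited wheels before uploading; a self-contained sketch with hypothetical file names:

```shell
# Demonstration with dummy wheel files (hypothetical names); only the
# audited *-manylinux*.whl files should be uploaded to PyPI.
mkdir -p dist
touch dist/example-1.0-cp310-cp310-linux_x86_64.whl \
      dist/example-1.0-cp310-cp310-manylinux2014_x86_64.whl

# The glob skips the non-audited linux_x86_64 wheel:
for whl in dist/*-manylinux*.whl; do
    echo "uploadable: $whl"
done
# e.g. twine upload dist/*-manylinux*.whl
```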

Using a different manylinux container

The manylinux2014_x86_64 container is used by default. To use another manylinux container, append -<container-name> to the reference. For example: RalfG/[email protected]_aarch64 instead of RalfG/[email protected].
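For example, to build aarch64 wheels, the minimal example above becomes (a sketch; setting up emulation on an x86 runner is a separate concern):

```yaml
uses: RalfG/[email protected]_aarch64
with:
  python-versions: 'cp310-cp310 cp311-cp311'
```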

Contributing

Bugs, questions or suggestions? Feel free to post an issue in the issue tracker or to make a pull request!

python-wheels-manylinux-build's People

Contributors

agates, ahartikainen, brianhelba, cielavenir, dries007, ewouth, exarkun, legoktm, levitsky, minrk, mr-c, odidev, ralfg, skshetry, thecapypara


python-wheels-manylinux-build's Issues

cp38-cp38m error: pip not found

Hey, I am trying to build a Python package using your Marketplace action, but I find it can't find pip for Python 3.8.

here is the log

+ for PY_VER in '"${arrPY_VERSIONS[@]}"'
+ /opt/python/cp38-cp38m/bin/pip install --upgrade --no-cache-dir pip
/entrypoint.sh: line 30: /opt/python/cp38-cp38m/bin/pip: No such file or directory

How can I solve it?

      with:
        python-versions: 'cp36-cp36m cp37-cp37m cp38-cp38m cp39-cp39m'
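For reference, CPython 3.8 dropped the "m" (pymalloc) ABI suffix, so there is no cp38-cp38m or cp39-cp39m interpreter directory in the manylinux images; a corrected configuration would look like this:

```yaml
# cp38 and cp39 carry no "m" ABI flag (removed in CPython 3.8):
with:
  python-versions: 'cp36-cp36m cp37-cp37m cp38-cp38 cp39-cp39'
```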

Can't find Python Header Files

Hi,

Thanks for creating this github action and maintaining it - really appreciated!

I am trying to use it to build a C++ extension that builds with CMake. I've hit an issue that I'm not 100% sure how to fix - my package requires the Python header files, but CMake isn't finding them:

  CMake Error at /tmp/pip-build-env-6h7f921u/overlay/lib/python3.6/site-packages/cmake/data/share/cmake-3.24/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
    Could NOT find Python (missing: Python_LIBRARIES Development
    Development.Embed) (found suitable version "3.6.15", minimum required is
    "3.6.15")

It clearly finds the right version of python; are the Development components also installed in the container? Apologies if I'm just doing something wrong (which is most likely!).

Question: how to use these images to test the packages just build?

Hello,

I am using your project to build setproctitle packages (see e.g. this job)

I would be interested in testing the packages just built, so pretty much using the same images to run a pip install PACKAGE and a pytest or tox, with the tests available in the cloned repo and the package installed. The steps should run in the image, so we can properly test each architecture.

Any idea about how to do it?

Thank you very much!
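One possible approach (a sketch, not an official feature of this action) is to run the same manylinux image in a follow-up step and install the freshly built wheel before invoking pytest; the interpreter path, wheel glob, and tests directory below are illustrative assumptions:

```yaml
- name: Test built wheels in the manylinux container
  run: |
    docker run --rm -v "$PWD:/io" -w /io \
      quay.io/pypa/manylinux2014_x86_64 \
      /bin/bash -c '
        /opt/python/cp310-cp310/bin/pip install dist/*-manylinux*.whl pytest &&
        /opt/python/cp310-cp310/bin/python -m pytest tests/
      '
```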

Switch to package dir created with PRE_BUILD_COMMAND

Hi!

In my use case I want to switch to the build directory which is created by running PRE_BUILD_COMMAND. I cannot do that by specifying PACKAGE_PATH because cd is attempted before the PRE_BUILD_COMMAND.

I tried to cheat by supplying do stuff && cd new_dir or do stuff; cd new_dir as PRE_BUILD_COMMAND, but it is escaped, so that doesn't work. Doing a cd in a script would be of no use either.

Is there a way to achieve what I need?

P.S. I also looked at changing the PIP_WHEEL_ARGS to specify the actual location of my setup.py (because it is an arg for pip wheel) but . is added to the pip wheel command anyway, so it doesn't help me either. If PIP_WHEEL_ARGS would replace all of the standard args, including ., that would help me.

Do not build wheels for dependencies

Currently, when wheels are built, dependency wheels are built along with them, which (I think) is not necessary. Further, this means that auditwheel will try to fix these wheels as well, which can lead to unexpected problems.

Adding --no-deps to pip wheel should fix this.

@pedrocamargo, following up on your previous issue (#1), I noticed that in a later build the action tried to auditwheel PyQt5, which was missing a dependency, which made the action error and exit... Fixing this should fix your problem.

pypy images?

PyPy folks maintain a manylinux2010 image which adds pypy installations to the pypa 2010 image, which would be handy to include here. Since these only add installations to the pypa 2010 image, the pypy 2010 image could be used instead of the upstream pypa 2010 without anything lost, but I can see why that might not be exactly what you want to do.

EDIT:

I believe all that needs to be added is one more list entry to create_tags.py, but it's complicated slightly by the fact that the image would come from docker.io/pypywheels/manylinux2010-pypy_x86_64 instead of quay.io/pypa/manylinux2010_x86_64, unlike the rest

setup-python@v2 action overrides LD_LIBRARY_PATH env variable in manylinux container

My script had worked for a long time and suddenly broke recently. Not sure if it's a Docker issue or the library env setup.

The problem is similar to pypa/manylinux#357

/usr/bin/docker run --name e5c351c92e44619324a9fb1b8eb57bf888be8_4b2a21 --label 1e5c35 --workdir /github/workspace --rm -e pythonLocation -e LD_LIBRARY_PATH -e INPUT_PYTHON-VERSIONS -e INPUT_BUILD-REQUIREMENTS -e INPUT_SYSTEM-PACKAGES -e INPUT_PRE-BUILD-COMMAND -e INPUT_PACKAGE-PATH -e INPUT_PIP-WHEEL-ARGS -e HOME -e GITHUB_JOB -e GITHUB_REF -e GITHUB_SHA -e GITHUB_REPOSITORY -e GITHUB_REPOSITORY_OWNER -e GITHUB_RUN_ID -e GITHUB_RUN_NUMBER -e GITHUB_RETENTION_DAYS -e GITHUB_ACTOR -e GITHUB_WORKFLOW -e GITHUB_HEAD_REF -e GITHUB_BASE_REF -e GITHUB_EVENT_NAME -e GITHUB_SERVER_URL -e GITHUB_API_URL -e GITHUB_GRAPHQL_URL -e GITHUB_WORKSPACE -e GITHUB_ACTION -e GITHUB_EVENT_PATH -e GITHUB_PATH -e GITHUB_ENV -e RUNNER_OS -e RUNNER_TOOL_CACHE -e RUNNER_TEMP -e RUNNER_WORKSPACE -e ACTIONS_RUNTIME_URL -e ACTIONS_RUNTIME_TOKEN -e ACTIONS_CACHE_URL -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/_temp/_runner_file_commands":"/github/file_commands" -v "/home/runner/work/viztracer/viztracer":"/github/workspace" 1e5c35:1c92e44619324a9fb1b8eb57bf888be8  "cp36-cp36m cp37-cp37m cp38-cp38 cp39-cp39" "" "" "" "" "-w ./dist --no-deps"
+ PY_VERSIONS='cp36-cp36m cp37-cp37m cp38-cp38 cp39-cp39'
+ BUILD_REQUIREMENTS=
+ SYSTEM_PACKAGES=
+ PRE_BUILD_COMMAND=
+ PACKAGE_PATH=
+ PIP_WHEEL_ARGS='-w ./dist --no-deps'
+ cd /github/workspace/
+ '[' '!' -z '' ']'
+ '[' '!' -z '' ']'
+ arrPY_VERSIONS=(${PY_VERSIONS// / })
+ for PY_VER in '"${arrPY_VERSIONS[@]}"'
+ /opt/python/cp36-cp36m/bin/pip install --upgrade --no-cache-dir pip
Requirement already up-to-date: pip in /opt/_internal/cpython-3.6.12/lib/python3.6/site-packages (20.2.3)
+ '[' '!' -z '' ']'
+ /opt/python/cp36-cp36m/bin/pip wheel . -w ./dist --no-deps
WARNING: The directory '/github/home/.cache/pip' or its parent directory is not owned or is not writable by the current user. The cache has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Processing /github/workspace
Building wheels for collected packages: viztracer
  Building wheel for viztracer (setup.py): started
  Building wheel for viztracer (setup.py): finished with status 'done'
  Created wheel for viztracer: filename=viztracer-0.8.1-cp36-cp36m-linux_x86_64.whl size=714538 sha256=ec8a6197af611f0e464023df7f7d3dafbf290be9477f496ea0239a2e7d2b39c2
  Stored in directory: /tmp/pip-ephem-wheel-cache-f1611rw4/wheels/cb/ab/8f/fd29149f371a05d1292662735ad3569bffe7dbc23f93e70d91
Successfully built viztracer
+ for PY_VER in '"${arrPY_VERSIONS[@]}"'
+ /opt/python/cp37-cp37m/bin/pip install --upgrade --no-cache-dir pip
/opt/_internal/cpython-3.7.9/bin/python3.7: error while loading shared libraries: libcrypt.so.2: cannot open shared object file: No such file or directory

Allow specifying output directory

Currently the wheels are placed in the wheelhouse folder instead of the standard dist.

I think it would be nice to have an input to specify the destination folder. The default can remain wheelhouse.

How do you build packages for things such as i686 and aarch64?

I'm using your action to build x86 packages, but I'm not sure how to build them for alternate architectures. Your documentation mentions using alternate docker images, but I'm not sure how to create that architecture in Github actions. Do you have any examples of that?
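A common pattern (a sketch; action versions and the Python tags are illustrative) is to register QEMU emulation with docker/setup-qemu-action before invoking the aarch64 variant of this action:

```yaml
- uses: actions/checkout@v2
- name: Set up QEMU for non-x86 emulation
  uses: docker/setup-qemu-action@v1
  with:
    platforms: arm64
- name: Build aarch64 manylinux wheels
  uses: RalfG/[email protected]_aarch64
  with:
    python-versions: 'cp310-cp310 cp311-cp311'
```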

cp311-cp311/bin/pip: No such file or directory

I have tried versions 0.6.0, 0.7.0 and 0.7.1 and get the same error on all three: it can't find pip for 3.11.

+ for PY_VER in '"${arrPY_VERSIONS[@]}"'
+ /opt/python/cp311-cp311/bin/pip install --upgrade --no-cache-dir pip
/entrypoint.sh: line 44: /opt/python/cp311-cp311/bin/pip: No such file or directory

Add option to produce GitHub artifacts

As far as I understand, this GitHub Action produces fixed (audited) wheels and can push them automatically to the public registry. I wonder if we could take the output of this build process and upload it to private registries, for instance, or produce artifacts that we could use with actions/upload-artifact@v1. Any ideas?
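Since the wheels land in an ordinary directory (dist by default, given an empty package-path), a later step can pick them up; a sketch, with the artifact name as an assumption:

```yaml
- name: Build manylinux wheels
  uses: RalfG/[email protected]
  with:
    python-versions: 'cp310-cp310 cp311-cp311'
- name: Store wheels as workflow artifacts
  uses: actions/upload-artifact@v1
  with:
    name: wheels
    path: dist
```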

Add custom manylinux container

I need a very recent version of clang to build my package (the current manylinux image doesn't cut it). So I made a manylinux container at https://hub.docker.com/r/ibellbelli/manylinuxclang with the new clang stuff, and I tried to add that to my call of this action, but it didn't work. I'm not sure if it ever can work with a truly custom docker image, but I'd like it to.

My failing example:

name: pypi deployer
on:
  push:
    tags:
      - "v*" # Push events to matching v*, i.e. v1.0, v20.15.10
jobs:
  Linux-build:
    runs-on: ubuntu-latest
    env:
      TWINE_USERNAME: ${{ secrets.TWINE_USERNAME }}
      TWINE_PASSWORD: ${{ secrets.TWINE_PASSWORD }}
      CXX: clang++
    steps:
      - uses: actions/checkout@v2
      - name: checkout submodules
        run: git submodule update --init --recursive
      - name: build and upload manylinux wheels
        uses: RalfG/[email protected]/manylinuxclang
        with:
          python-versions: 'cp37-cp37m cp38-cp38m cp39-cp39m'
          build-requirements: 'cython numpy'
          pip-wheel-args: '-w ./dist --no-deps'

but no dice:

Getting action download info
Failed to resolve action download info. Error: Unable to resolve action `RalfG/[email protected]***%2Fmanylinuxclang`, unable to find version `v0.3.4-ibell***/manylinuxclang`
Retrying in 20.194 seconds
Failed to resolve action download info. Error: Unable to resolve action `RalfG/[email protected]***%2Fmanylinuxclang`, unable to find version `v0.3.4-ibell***/manylinuxclang`
Retrying in 17.82 seconds
Error: Unable to resolve action `RalfG/[email protected]***%2Fmanylinuxclang`, unable to find version `v0.3.4-ibell***/manylinuxclang`

Some requirements can't be installed as suggested

Thanks for creating this action! However, at least one requirement, PyQt5, can't be installed using pip on Linux.

Ideally, installation on Linux is done like this:
sudo apt-get install python3-pyqt5

Would it make sense to have a third parameter to the action with a list of commands to run prior to compilation?

Stack trace:

Collecting PyQt5
Downloading https://files.pythonhosted.org/packages/3a/fb/eb51731f2dc7c22d8e1a63ba88fb702727b324c635***83a32f27f73b8116/PyQt5-5.14.1.tar.gz (3.2MB)
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing wheel metadata: started
ERROR: Complete output from command /opt/_internal/cpython-3.7.6/bin/python /opt/_internal/cpython-3.7.6/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py prepare_metadata_for_build_wheel /tmp/tmprhikcrkk:
Preparing wheel metadata: finished with status 'error'
ERROR: Traceback (most recent call last):
File "/opt/_internal/cpython-3.7.6/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py", line 64, in prepare_metadata_for_build_wheel
hook = backend.prepare_metadata_for_build_wheel
AttributeError: module 'sipbuild.api' has no attribute 'prepare_metadata_for_build_wheel'

During handling of the above exception, another exception occurred:

cannot access *-linux*.whl

I tried full_workflow_example.yml on my actions.
https://github.com/morugu/publish-pypi-example/blob/master/.github/workflows/main.yml

However, the following error occurred.

Successfully built pypi-example
+ for whl in '/github/workspace/wheelhouse/*-linux*.whl'
+ auditwheel repair '/github/workspace/wheelhouse/*-linux*.whl' --plat manylinux2010_x86_64 -w /github/workspace/wheelhouse/
usage: auditwheel [-h] [-V] [-v] command ...
auditwheel: error: cannot access /github/workspace/wheelhouse/*-linux*.whl. No such file
+ echo 'Repairing wheels failed.'
Repairing wheels failed.
+ auditwheel show '/github/workspace/wheelhouse/*-linux*.whl'
usage: auditwheel [-h] [-V] [-v] command ...
auditwheel: error: cannot access /github/workspace/wheelhouse/*-linux*.whl. No such file

I have no idea how to solve it.

compatible with pybind11?

Hi, thanks for this great package.
I used pybind11 rather than Cython.
It seems that this action requires Cython. Is there a way to work around this?

ERROR: Directory '/github/workspace/' is not installable. Neither 'setup.py' nor 'pyproject.toml' found.
+ echo 'Building wheels failed.'
Building wheels failed.
+ exit 1
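The error above means pip could not find a setup.py or pyproject.toml at the repository root. If the package lives in a subdirectory, the package-path input documented earlier points the action at it; a sketch with a hypothetical directory name:

```yaml
- uses: RalfG/[email protected]
  with:
    python-versions: 'cp310-cp310 cp311-cp311'
    package-path: 'my_subdir'
```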

Facing error "standard_init_linux.go:211: exec user process caused "exec format error"" while using RalfG/[email protected]_aarch64 in github-actions with qemu

Hi,

I am trying to use the RalfG/[email protected]_aarch64 image in GitHub Actions, together with QEMU, to build manylinux wheels for aarch64, but I am facing the error standard_init_linux.go:211: exec user process caused "exec format error" while the image is being built, at the step RUN chmod +x /entrypoint.sh, because it builds before QEMU is set up.

Please suggest if I need to add anything to build this image. Thanks.

Build report: https://github.com/odidev/github-test/runs/930701151?check_suite_focus=true

Suggestion: As we are already setting entrypoint.sh to the executable mode, I think we can remove this step from the Dockerfile.

Please share your opinion on the same.

Seeking help for use

Hi everyone, I am trying to upload a project with C extensions to PyPI. This is my first time uploading this kind of platform-exclusive project, and I have encountered some problems and need help.

I want to use GitHub Actions to automate the process of publishing my project to PyPI. I managed to implement the Windows version of the upload, but when it came to Linux I tried to use this repo and got an error message I couldn't understand.

My repo is GoodManWEN/py-fnvhash-c, where a simplified version of the project tree structure is as follows:

.
├── fnvhash_c     #  pypi project name
│   ├── bfilt.py
│   └── __init__.py
├── LICENSE
├── README.md
├── requirements.txt
├── setup.cfg
├── setup.py
└── src
     ├── cityhash.pyx
     ├── cityhash_setup.py
     └──  include
          └── city.h

On windows system, in a full upload operation, I need to do the following commands:

cd src
cython cityhash_setup.py build_ext --inplace
move *.pyd ../fnvhash_c
cd ..
python setup.py bdist_wheel
twine upload dist/*

And then I tried to switch it into a many linux version, I used the following actions command:

    - name: Build manylinux Python wheels
      uses: RalfG/[email protected]_x86_64
      with:
        python-versions: 'cp36-cp36m cp37-cp37m cp38-cp38 cp39-cp39'
        build-requirements: 'cython'

Got the error message:

Building wheels for collected packages: fnvhash-c
  Building wheel for fnvhash-c (setup.py): started
  Building wheel for fnvhash-c (setup.py): finished with status 'done'
  Created wheel for fnvhash-c: filename=fnvhash_c-dev-cp36-cp36m-linux_x86_64.whl size=4647 sha256=fc10cd549f184c78159b4dca9326676111cae7dafc6dc69b60c6b70a1adfc819
  Stored in directory: /tmp/pip-ephem-wheel-cache-yh13j51a/wheels/cb/ab/8f/fd29149f371a05d1292662735ad3569bffe7dbc23f93e70d91
  WARNING: Built wheel for fnvhash-c is invalid: Metadata 1.2 mandates PEP 440 version, but 'dev' is not
Failed to build fnvhash-c
ERROR: Failed to build one or more wheels
+ echo 'Building wheels failed.'
+ exit 1
Building wheels failed.

One thing I didn't quite understand from the docs is that, theoretically, building a C++ extension and building a PyPI wheel are two different builds. When using this action, I'm not quite sure when the Cython build command is supposed to happen.

How should I adjust my project to make it work properly? Thanks.

Does it create wheels for Windows as well?

I'm sure the title itself is clear. To explain, I need a wheel of pyaudio built for Py3.7, 32-bit, Windows 10. I'm confused by the name 'manylinux'. Can I make it work for Windows, or is it Linux-only?

P.S. I don't really need to put it on PyPI. I just need to install it on my system.

Thank you.

`./auditwheel: No such file or directory` in manylinux1 container

I have the following GitHub action:

https://github.com/kliment/Printrun/pull/1082/files#diff-8efddb34c08d40af8b87450ad25f4cea

The find ... auditwheel call fails with: sh: ./auditwheel: No such file or directory

+ find . -type f -iname '*-linux*.whl' -execdir sh -c 'auditwheel repair '\''{}'\'' -w ./ --plat '\''manylinux1_x86_64'\'' || { echo '\''Repairing wheels failed.'\''; auditwheel show '\''{}'\''; exit 1; }' ';'
sh: ./auditwheel: No such file or directory
Repairing wheels failed.

Printrun-2.0.0rc6-cp37-cp37m-linux_x86_64.whl is consistent with the
following platform tag: "manylinux1_x86_64".

The wheel references external versioned symbols in these system-
provided shared libraries: libc.so.6 with versions {'GLIBC_2.2.5'}

The following external shared libraries are required by the wheel:
{
    "libc.so.6": "/lib64/libc-2.5.so",
    "libpthread.so.0": "/lib64/libpthread-2.5.so"
}

I cannot really see why auditwheel is searched for in . rather than on $PATH, but is it possible that something is wrong in the docker image?

There is no facility for installing additional unpackaged non-Python dependencies

If I want to build manylinux2010 wheel for a project that requires the Rust build toolchain then it seems I am somewhat out of luck. The manylinux2010 build environment has no system packages for the Rust build toolchain and I can't install it myself using the parameters exposed by this Github action.

It would be useful to have an escape hatch for executing an arbitrary executable (maybe a shell script) in the build environment as part of setup.
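For what it's worth, the pre-build-command input listed in the Inputs section can serve as such an escape hatch; a sketch installing the Rust toolchain via rustup (note that the build may additionally need ~/.cargo/bin on the PATH):

```yaml
- uses: RalfG/[email protected]
  with:
    python-versions: 'cp310-cp310 cp311-cp311'
    pre-build-command: 'curl https://sh.rustup.rs -sSf | sh -s -- -y'
```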

Action runs pip as root

Running everything as root can break the next actions in the pipeline.

In my case the issue was caused by the fact that for my package, say foobar, the .egg-info directory created in the src directory was created by this action and had rights only for root; see the output of ls -al:

drwxr-xr-x 4 runner docker 4096 Nov 10 10:13 .
drwxr-xr-x 5 runner docker 4096 Nov 10 10:13 ..
drwxr-xr-x 2 root   root   4096 Nov 10 10:13 foobar.egg-info

Later in the workflow, I also build an sdist using python -m build --sdist. When setuptools then tries to use the .egg-info directory, it does not have the rights to do so.

Importantly, I have several packages and this error only pops up for the one that has a C++ extension.

If there is hope to fix this issue, I am happy to deliver a small demo package, or a whole repo, which will demonstrate the issue.
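A common workaround (a sketch, not an official fix for this action) is to reset ownership right after the build step; the runner user can sudo without a password on GitHub-hosted runners:

```yaml
- name: Fix file ownership after the containerized build
  run: sudo chown -R "$(id -u):$(id -g)" .
```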
