
craft-parts's Introduction

Craft Parts


Craft-parts provides a mechanism to obtain data from different sources, process it in various ways, and prepare a filesystem subtree suitable for deployment. The components used in its project specification are called parts, which can be independently downloaded, built and installed. Parts can also depend on each other in order to assemble the subtree containing the final artifacts.

License

Free software: GNU Lesser General Public License v3

Documentation

https://canonical-craft-parts.readthedocs-hosted.com/en/latest/

Contributing

A Makefile is provided for easy interaction with the project. To see all available options run:

make help

Development Environment

To develop any apt-related functionality, the python-apt package is needed; installing the project with the apt extra requires this package.

Apt package prerequisites for this development environment on an Ubuntu system can be installed with:

sudo apt install libapt-pkg-dev intltool fuse-overlayfs

On a Debian or Ubuntu system, the appropriate package can be installed by running

apt source python-apt
pip install ./python-apt_*

or by downloading and installing the appropriate source tarball.

Running tests

To run all tests in the suite run:

make tests

Adding new requirements

If a new dependency is added to the project run:

make freeze-requirements

Verifying documentation changes

To locally verify documentation changes run:

make docs

After running, the newly generated documentation will be available at ./docs/_build/html/.

Committing code

Please follow these guidelines when committing code for this project:

  • Use a topic with a colon to start the subject
  • Separate subject from body with a blank line
  • Limit the subject line to 50 characters
  • Do not capitalize the subject line
  • Do not end the subject line with a period
  • Use the imperative mood in the subject line
  • Wrap the body at 72 characters
  • Use the body to explain what and why (instead of how)
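For example, a commit message following these guidelines might look like this (contents purely illustrative):

```
requirements: pin sphinx to the 7.x series

Newer sphinx releases break the autogenerated API docs. Pin the
major version until the affected plugins catch up.
```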

craft-parts's People

Contributors

aristochen, artivis, carlcsaposs-canonical, cmatsuoka, dariuszd21, dboddie, facundobatista, jgcarroll, kubiko, lengau, liushuyu, mr-cal, renovate[bot], scarlettgatelymoore, sergey-borovkov, sergiusens, syu-w, thp-canonical, tigarmo, toabctl, valentindavid


craft-parts's Issues

scons plugin integration test failing if scons is not installed.

This is basically the same as #380 but for scons.

test_scons_plugin
FAILED        [ 84%]
tests/integration/plugins/test_scons.py:9 (test_scons_plugin)
self = <craft_parts.plugins.scons_plugin.SConsPluginEnvironmentValidator object at 0x7f57e615f2b0>
dependency = 'scons', plugin_name = 'scons', part_dependencies = []
argument = '--version'

    def validate_dependency(
        self,
        dependency: str,
        plugin_name: str,
        part_dependencies: Optional[List[str]],
        argument: str = "--version",
    ) -> str:
        """Validate that the environment has a required dependency.
    
        `<dependency-name> --version` is executed to confirm the dependency is valid.
    
        :param dependency: name of the dependency to validate.
        :param plugin_name: used to generate the part name that would satisfy
                            the dependency.
        :param part_dependencies: A list of the parts this part depends on.
        :param argument: argument to call with the dependency. Default is `--version`.
    
        :raises PluginEnvironmentValidationError: If the environment is invalid.
    
        :return: output from executed dependency
        """
        try:
            command = f"{dependency} {argument}"
>           output = self._execute(command).strip()

/home/lengau/Work/Code/craft-parts/craft_parts/plugins/validator.py:100: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <craft_parts.plugins.scons_plugin.SConsPluginEnvironmentValidator object at 0x7f57e615f2b0>
cmd = 'scons --version'

    def _execute(self, cmd: str) -> str:
        """Execute a command in a build environment shell.
    
        :param cmd: The command to execute.
    
        :return: The command output or error message.
        """
        logger.debug("plugin validation environment: %s", self._env)
        logger.debug("plugin validation command: %r", cmd)
    
        with tempfile.NamedTemporaryFile(mode="w+") as env_file:
            print(self._env, file=env_file)
            print(cmd, file=env_file)
            env_file.flush()
    
>           proc = subprocess.run(
                ["/bin/bash", env_file.name],
                check=True,
                capture_output=True,
                universal_newlines=True,
            )

/home/lengau/Work/Code/craft-parts/craft_parts/plugins/validator.py:143: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = True, timeout = None, check = True
popenargs = (['/bin/bash', '/tmp/tmpk2u9crjk'],)
kwargs = {'stderr': -1, 'stdout': -1, 'universal_newlines': True}
process = <Popen: returncode: 127 args: ['/bin/bash', '/tmp/tmpk2u9crjk']>
stdout = '', stderr = '/tmp/tmpk2u9crjk: line 22: scons: command not found\n'
retcode = 127

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                        output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['/bin/bash', '/tmp/tmpk2u9crjk']' returned non-zero exit status 127.

/usr/lib/python3.10/subprocess.py:524: CalledProcessError

The above exception was the direct cause of the following exception:

new_dir = local('/tmp/pytest-of-lengau/pytest-33/test_scons_plugin0')

    def test_scons_plugin(new_dir):
        """Test builds with the scons plugin"""
        source_location = Path(__file__).parent / "test_scons"
    
        parts_yaml = textwrap.dedent(
            f"""
            parts:
              foo:
                plugin: scons
                source: {source_location}
                scons-parameters:
                  - greeting=Hello
                  - person-name=craft-parts
            """
        )
        parts = yaml.safe_load(parts_yaml)
        lf = LifecycleManager(
            parts, application_name="test_scons", cache_dir=new_dir, work_dir=new_dir
        )
        actions = lf.plan(Step.PRIME)
    
>       with lf.action_executor() as ctx:

/home/lengau/Work/Code/craft-parts/tests/integration/plugins/test_scons.py:31: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/home/lengau/Work/Code/craft-parts/craft_parts/executor/executor.py:284: in __enter__
    self._executor.prologue()
/home/lengau/Work/Code/craft-parts/craft_parts/executor/executor.py:92: in prologue
    self._verify_plugin_environment()
/home/lengau/Work/Code/craft-parts/craft_parts/executor/executor.py:270: in _verify_plugin_environment
    validator.validate_environment(part_dependencies=part.dependencies)
/home/lengau/Work/Code/craft-parts/craft_parts/plugins/scons_plugin.py:72: in validate_environment
    version = self.validate_dependency(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <craft_parts.plugins.scons_plugin.SConsPluginEnvironmentValidator object at 0x7f57e615f2b0>
dependency = 'scons', plugin_name = 'scons', part_dependencies = []
argument = '--version'

    def validate_dependency(
        self,
        dependency: str,
        plugin_name: str,
        part_dependencies: Optional[List[str]],
        argument: str = "--version",
    ) -> str:
        """Validate that the environment has a required dependency.
    
        `<dependency-name> --version` is executed to confirm the dependency is valid.
    
        :param dependency: name of the dependency to validate.
        :param plugin_name: used to generate the part name that would satisfy
                            the dependency.
        :param part_dependencies: A list of the parts this part depends on.
        :param argument: argument to call with the dependency. Default is `--version`.
    
        :raises PluginEnvironmentValidationError: If the environment is invalid.
    
        :return: output from executed dependency
        """
        try:
            command = f"{dependency} {argument}"
            output = self._execute(command).strip()
            logger.debug("executed %s with output %s", command, output)
            return output
        except subprocess.CalledProcessError as err:
            if err.returncode != COMMAND_NOT_FOUND:
                raise errors.PluginEnvironmentValidationError(
                    part_name=self._part_name,
                    reason=f"{dependency!r} failed with error code {err.returncode}",
                ) from err
    
            if part_dependencies is None:
                raise errors.PluginEnvironmentValidationError(
                    part_name=self._part_name,
                    reason=f"{dependency!r} not found",
                ) from err
    
            part_dependency = f"{plugin_name}-deps"
            if part_dependency not in part_dependencies:
>               raise errors.PluginEnvironmentValidationError(
                    part_name=self._part_name,
                    reason=(
                        f"{dependency!r} not found and part {self._part_name!r} "
                        f"does not depend on a part named {part_dependency!r} "
                        "that would satisfy the dependency"
                    ),
                ) from err
E               craft_parts.errors.PluginEnvironmentValidationError: Environment validation failed for part 'foo': 'scons' not found and part 'foo' does not depend on a part named 'scons-deps' that would satisfy the dependency.

/home/lengau/Work/Code/craft-parts/craft_parts/plugins/validator.py:118: PluginEnvironmentValidationError
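A common remedy for this class of failure is to skip the integration test when the tool is absent from PATH; a minimal sketch using a pytest skipif marker (the marker name is illustrative):

```python
import shutil

import pytest

# Skip the test entirely when the scons executable is not on PATH.
requires_scons = pytest.mark.skipif(
    shutil.which("scons") is None,
    reason="scons is not installed",
)


@requires_scons
def test_scons_plugin(new_dir):
    ...  # existing test body, now only run when scons is available
```

Whether craft-parts prefers skipping or provisioning scons in CI is a project decision; this only sketches the skip approach.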

Source type not correctly recognized

An explicit source-type declaration is not correctly recognized in some cases. For example, even with source-type: git, a Git repository URL is not recognized unless it ends in .git.

Permissions owner and group not applied

Bug Description

When used in Rockcraft, the directory listed in the yaml shared below is still owned by root in the resulting image

To Reproduce

  • install rockcraft
  • add the shared part as a part
  • run rockcraft

part yaml

zinc:
    plugin: go
    source: https://github.com/zincsearch/zincsearch
    source-type: git
    source-tag: v0.4.7
    build-snaps:
      - go/latest/stable
      - node/18/stable
    build-environment:
      - CGO_ENABLED: 0
      - GOOS: linux
    override-build: |
      COMMIT_HASH="$(git rev-parse HEAD)"
      BUILD_DATE="$(date -u '+%Y-%m-%d_%I:%M:%S%p-GMT')"

      # Build the web ui, which is embedded in the go binary later
      pushd web
      npm install
      npm run build
      popd

      go mod tidy
      go build \
        -ldflags="-s -w
        -X github.com/zincsearch/zincsearch/pkg/meta.Version=${CRAFT_PROJECT_VERSION} \
        -X github.com/zincsearch/zincsearch/pkg/meta.CommitHash=${COMMIT_HASH} \
        -X github.com/zincsearch/zincsearch/pkg/meta.BuildDate=${BUILD_DATE}" \
        -o zincsearch \
        cmd/zincsearch/main.go
    stage-packages:
      - libc6_libs
      - ca-certificates_data
    override-stage: |
      # Create some directories
      mkdir -p "${CRAFT_PART_INSTALL}/bin" "${CRAFT_PART_INSTALL}/var/lib/zincsearch"

      # Install the zincsearch binary
      install -m 0755 "${CRAFT_PART_BUILD}/zincsearch" "${CRAFT_PART_INSTALL}/bin/zincsearch"
      
      # Run the default stage hook
      craftctl default
    permissions:
      - path: /var/lib/zincsearch
        owner: 584792
        group: 584792
        mode: "755"

Relevant log output

n/a

Add INFO-level log messages in prologue

What needs to get done

A lot of things happen during the prologue, and some of them (like installing build packages and snaps) are very time consuming. We should add INFO-level (so, for user-consumption) log messages.

Why it needs to get done

So that the user has better feedback about what's happening.
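A minimal sketch of the idea — function and message names are illustrative, not the actual craft-parts code:

```python
import logging

logger = logging.getLogger(__name__)


def prologue() -> None:
    """Illustrative prologue that reports long-running steps at INFO level."""
    logger.info("Installing build packages...")
    # ... time-consuming package installation would happen here ...
    logger.info("Installing build snaps...")
    # ... time-consuming snap installation would happen here ...
    logger.debug("implementation details stay at DEBUG level")
```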

Allow disabling common-directory removal code in tar source

I have a snapcraft part using a tarball that has a specific top-level directory I want to preserve. Snapcraft strips this directory path during extraction, even before any customization hooks like override-build get to do anything.

I think it would be nice to have a way to control this, perhaps source-strip-path which could be explicitly set (e.g. to an empty string) to disable the current auto-detection mechanism.

execute more than one command at a time

Hey,

I'd like to know if there is a way to execute more than one command at once for e.g. via LXDInstance.execute_run.

I need to execute a number of commands. As command is a list of command parts, is this even possible? Would I need to pass in ; or && or would I need to create a command string which already combines the commands?

Or is there another way?

Thank you.

P.S.: I only care if all commands succeeded - I do not care which specifically failed.
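I haven't verified LXDInstance.execute_run's exact semantics, but the usual answer for argv-list APIs is to combine the commands into a single shell invocation yourself, so the list stays one bash -c call:

```python
import subprocess

# '&&' stops at the first failure, which matches "I only care if all
# commands succeeded - I do not care which specifically failed".
commands = ["echo one", "echo two"]
argv = ["bash", "-c", " && ".join(commands)]

result = subprocess.run(argv, check=True, capture_output=True, text=True)
```

If execute_run forwards its command list as an argv (as the type annotation suggests), the same ["bash", "-c", ...] shape should work there too.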

The `craft_parts.packages.deb` module is incorrectly tested

There are 10 calls to assert_has_calls, which is misused in the tests.

E.g., the test test_already_installed_no_specified_version is currently doing...

        fake_deb_run.assert_has_calls([])

Attempting to validate that there are no calls to apt ("already installed!")... however, that line is a no-op, as assert_has_calls with an empty list does not verify any call in the mock. assert_not_called should probably have been used here (which would fail, as there are calls, because the test is buggy).
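The no-op behaviour is easy to demonstrate with unittest.mock:

```python
from unittest import mock

fake_deb_run = mock.Mock()
fake_deb_run(["apt", "install", "hello"])  # a call the test never notices

# A no-op: every mock trivially "has" all calls in an empty sequence,
# so this assertion passes even though a call was made.
fake_deb_run.assert_has_calls([])

# The check that actually verifies zero calls; here it raises.
detected = False
try:
    fake_deb_run.assert_not_called()
except AssertionError:
    detected = True
```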

Plugins not verifying required part properties

Plugins unmarshal their own properties, but they're not checking for the presence of required part properties such as source. This can be done by pydantic itself if the required property is declared as a field.
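A sketch of that approach, assuming pydantic is in use as described (class and field names are illustrative):

```python
import pydantic


class ExamplePluginProperties(pydantic.BaseModel):
    """Illustrative plugin properties with source declared as required."""

    source: str  # no default, so pydantic rejects part data without it


try:
    ExamplePluginProperties()  # unmarshalling part data missing "source"
except pydantic.ValidationError:
    pass  # pydantic reports the missing required property itself
```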

Info leaked to stdout when running subprocess

There are several places that call a sub-process whose output is leaked to the terminal when it shouldn't be.

  • craft_parts/packages/snaps.py

    • two check_call calls
  • craft_parts/utils/os_utils.py

    • three check_call calls

Probably they should be converted to something like the following:

subprocess.run(..., stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

Note that calling check_output would also fix the problem, but if the command output is not needed, it's better to discard it directly (send it to devnull) than to store it in memory and then discard it (a check_output call without storing the result).
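A concrete, runnable version of that pattern (sketch):

```python
import subprocess

# Output is discarded at the file-descriptor level instead of being
# captured into memory and then thrown away.
result = subprocess.run(
    ["echo", "nobody needs to see this"],
    check=True,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)
```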

Please improve parameter documentation for `execute_run` -> `command`

The current docstring/param documentation looks like..

:param command: Command to execute.

This is, at the very least, not helpful. It is not a command but a list of command parts - and yep, I can see it is a list when looking at the type annotations.

There is still no description of what these parts should look like.

Anyhow, today I spent a bit too much time figuring out what I need to escape and what not to.

e.g. this command looks fine...

update_sources_list = [
    "sed", "-i", f"'1s=^={lines}='", "/etc/apt/sources.list"
]

... and even works when I join it and execute it, but it did not work when passed into execute_run, as some additional escaping happens there and the assembled sed command ends up broken.
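The underlying gotcha can be reproduced with subprocess directly (independent of execute_run, whose exact behaviour I haven't verified): an argv list is executed without a shell, so shell-style quoting inside an element reaches the program literally:

```python
import subprocess

# The single quotes are part of the argument; no shell strips them,
# so sed would receive a script wrapped in literal quote characters.
out = subprocess.run(
    ["echo", "'1s=^=replacement='"],
    capture_output=True,
    text=True,
).stdout
```

So when building argv lists, pass the sed expression unquoted (1s=^=replacement=); shell quoting belongs only in command lines that a shell actually parses.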

Recommended dependencies always installed

@valentindavid reports:

When adding packages to section "build-packages", it seems snapcraft tries to install the recommended dependencies. I have tried both 5.x/stable and latest/stable. It calls "apt install" with "--no-install-recommends". But it also passes as parameter all the recommended packages. Probably because it gets those when trying to resolve the versions using the AptCache API. Is that normal? Is there a way to not get recommended dependencies?

Craft-parts should be fixed to not install recommended dependencies.

Gradle plugin support in core22

What needs to get done

Similar to #236, bringing the gradle plugin up to date with core22.

Why it needs to get done

There is a lack of plugins that support Java builds in core22.
Also, it would be useful for this use case.

Support `--help` flag for craftctl

I wrote a reference doc to explain the current functionality of craftctl from the snapcraft perspective.

I had to look through the code to see which commands craftctl supports and which arguments they accept. It might be useful to have a --help flag that simply prints the currently supported functionality.
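A sketch of what that could look like with argparse; the subcommand set shown here (default, get, set) is an assumption for illustration, not taken from the craftctl source:

```python
import argparse

# Hypothetical wiring of craftctl's subcommands into argparse, so that
# `craftctl --help` lists them with a one-line description each.
parser = argparse.ArgumentParser(prog="craftctl")
sub = parser.add_subparsers(dest="command", metavar="command")
sub.add_parser("default", help="run the default step handler")
sub.add_parser("get", help="print the value of a project variable")
sub.add_parser("set", help="set the value of a project variable")

help_text = parser.format_help()
```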

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Awaiting Schedule

These updates are awaiting their schedule.

  • chore(deps): update bugfixes (pyright, tox-gh)
  • chore(deps): update dependency requests to <2.32.4
  • chore(deps): update dependency myst-parser to v3

Open

These updates have all been created already.

Ignored or Blocked

These are blocked by an existing closed PR and will not be recreated.

Detected dependencies

github-actions
.github/workflows/cla-check.yaml
  • canonical/has-signed-canonical-cla v1
  • ubuntu 22.04
.github/workflows/tests.yaml
  • actions/checkout v4
  • actions/setup-python v5
  • actions/checkout v4
  • actions/setup-python v5
  • actions/setup-node v4
  • actions/checkout v4
  • actions/setup-python v5
  • actions/setup-node v4
  • ubuntu 22.04
pip_requirements
docs/requirements.txt
  • furo ==2024.1.29
  • pyspelling ==2.10
  • myst-parser ==2.0.0
  • Sphinx ==7.2.6
  • sphinx-autobuild ==2024.2.4
  • sphinx-autodoc-typehints ==2.0.1
  • sphinx-design ==0.5.0
  • sphinx-jsonschema ==1.19.1
  • sphinx-pydantic ==0.1.1
  • sphinx-tabs ==3.4.5
  • sphinxcontrib-applehelp ==1.0.8
  • sphinx_design ==0.5.0
  • sphinxcontrib-details-directive ==0.1.0
  • sphinxcontrib-devhelp ==1.0.6
  • sphinxcontrib-htmlhelp ==2.0.5
  • sphinxcontrib-jsmath ==1.0.1
  • sphinxcontrib-qthelp ==1.0.7
  • sphinxext-opengraph ==0.9.1
  • canonical-sphinx ~=0.1
tox.ini
  • tox-ignore-env-name-mismatch >=0.2.0.post2
  • tox-gh ~=1.3
pip_setup
setup.py
  • overrides !=7.6.0
  • pydantic-yaml >=0.11.0,<1.0.0
  • requests <2.32.0
  • sphinxcontrib-details-directive ==0.1.0
  • pyright ==1.1.358
  • yamllint ==1.32.0
regex
tox.ini
  • tox-ignore-env-name-mismatch 0.2.0.post2
  • tox-gh 1.3


Add ruby plugin

Port ruby plugin from Snapcraft PR: canonical/snapcraft#3511

Plugin needs to be reworked to validate an existing ruby interpreter or to use an interpreter built in a different part.

architecture parsing fails on non-linux platforms

Overview

Craft-parts fails to resolve system architectures on non-linux platforms.

Details

On a Linux arm64 machine,

return platform.machine()

returns aarch64.

On Apple silicon, the same call returns arm64, and the subsequent lookup in the _ARCH_TRANSLATIONS dictionary fails.

There was an old TODO for Windows architectures not working either, but I haven't verified this.

Craft-parts should be able to determine the architecture for Linux, Mac, and Windows.


Source checksum mismatch is confusing

This error is not for "heif" but for the previous part, which makes it really hard to identify where the error comes from.

Launching instance...
Executed: pull heif
Expected digest 6f70a624d1321319d8269a911c4032f24950cde52e76f46e9ecbebfcb762f28c, obtained 93925805c1fc4f8162b35f0ae109c4a75344e6decae5a240afdfce25f8a433ec.
Failed to execute pack in instance.
Full execution log: '/home/sergiusens/.local/state/snapcraft/log/snapcraft-20230129-175344.550739.log'

part `build-snaps` channel change doesn't work without a `clean`

Bug Description

In my charmcraft.yaml file I updated the charm snap version via the build-snaps directive and then ran charmcraft pack, but the changes were not reflected, i.e. no new charm version was installed.

Only when I ran charmcraft clean and another charmcraft pack it worked as intended and used the new charm version from the charmcraft.yaml configuration.

To Reproduce

  • create a simple charmcraft.yaml with e.g. build-snaps: [charm/2.x/stable]
  • run charmcraft pack (which creates a container)
  • update charmcraft.yaml to build-snaps: [charm]
  • run charmcraft pack
  • the new charm version is not used

(I noticed that as I added an option which is only available with the new charm version).

Environment

  • Ubuntu 20.04
  • charmcraft -> version 2.3.0
  • using lxd containers

charmcraft.yaml

see above

Relevant log output

none

Permission changes in files are not detected

When permissions are changed on a file, that file is not detected as "updated" and is not reprocessed by the pull step.

Example using charmcraft:

$ charmcraft clean           
Cleaning project 'my-super-charm'.
Cleaned project 'my-super-charm'.
$ charmcraft pack
Packing the charm
Created 'my-super-charm_ubuntu-22.04-amd64.charm'.
Charms packed:
    my-super-charm_ubuntu-22.04-amd64.charm
$ zipinfo -l my-super-charm_ubuntu-22.04-amd64.charm to_be_included.bin
-rw-rw-r--  2.0 unx      163 b-      105 defN 21-Aug-03 12:20 to_be_included.bin
$ chmod -w to_be_included.bin
$ ll to_be_included.bin
-r--r--r-- 1 facundo facundo 163 ago  3  2021 to_be_included.bin
$ charmcraft pack
Packing the charm
Created 'my-super-charm_ubuntu-22.04-amd64.charm'.
Charms packed:
    my-super-charm_ubuntu-22.04-amd64.charm
$ zipinfo -l my-super-charm_ubuntu-22.04-amd64.charm to_be_included.bin
-rw-rw-r--  2.0 unx      163 b-      105 defN 21-Aug-03 12:20 to_be_included.bin
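The likely mechanics: if the pull state fingerprints files by size and mtime only, a chmod is invisible, because it changes neither (only the ctime). Including the mode in the fingerprint would catch it — a sketch, not the actual craft-parts state format:

```python
import os
import stat
import tempfile


def fingerprint(path, include_mode=False):
    """Illustrative file fingerprint; the real state format differs."""
    st = os.stat(path)
    fp = (st.st_size, st.st_mtime_ns)
    if include_mode:
        fp += (stat.S_IMODE(st.st_mode),)
    return fp


# chmod leaves size and mtime untouched, so a size+mtime fingerprint
# considers the file unchanged; adding the mode detects the change.
tf = tempfile.NamedTemporaryFile(delete=False)
tf.close()
before = fingerprint(tf.name), fingerprint(tf.name, include_mode=True)
os.chmod(tf.name, 0o444)
after = fingerprint(tf.name), fingerprint(tf.name, include_mode=True)
```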

Clean up warnings when building docs

The current sphinx-build output contains a lot of warnings; some of them can be disregarded, but some can be major, like disabling the autogenerated docs from the code altogether (see #346). Ideally, we should clean up the warnings and then build with -W so that new warnings fail the build. Cleaning up the warnings isn't as straightforward as we might like because some of them come from the various sphinx plugins interacting badly with Pydantic.

AptCache is marking extra packages for installation on Jammy

Bug Description

The unit test tests/unit/packages/test_apt_cache.py::TestAptStageCache::test_stage_packages is failing on the Ubuntu 22.04 Github runner. It marks two extra packages for installation - gcc-13-base and libgcc-s1.

It started failing between 2023-Jul-10 and 2023-Jul-12. There was an update to the Jammy runner in this timeframe.

The test was fixed with #494, but the root cause needs to be investigated.

To Reproduce

Run the Github CI workflow without the patch in #494.

part yaml

No response

Relevant log output

Run make test-units
pytest tests/unit
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.4.0, pluggy-1.2.0
rootdir: /home/runner/work/craft-parts/craft-parts
configfile: pytest.ini
plugins: requests-mock-1.11.0, mock-3.11.1, hypothesis-6.81.1, cov-4.1.0, check-2.1.5
collected 1682 items

tests/unit/test_actions.py .....                                         [  0%]
tests/unit/test_callbacks.py ..................                          [  1%]
tests/unit/test_ctl.py ......                                            [  1%]
tests/unit/test_dirs.py ...                                              [  1%]
tests/unit/test_errors.py ...................................            [  3%]
tests/unit/test_features.py ...                                          [  4%]
tests/unit/test_infos.py ............................................... [  6%]
......................................................                   [ 10%]
tests/unit/test_lifecycle_manager.py ...........................         [ 11%]
tests/unit/test_parts.py ............................................... [ 14%]
...............                                                          [ 15%]
tests/unit/test_permissions.py .......                                   [ 15%]
tests/unit/test_sequencer.py ...............                             [ 16%]
tests/unit/test_steps.py ...............                                 [ 17%]
tests/unit/test_xattrs.py ......                                         [ 18%]
tests/unit/executor/test_collisions.py ........                          [ 18%]
tests/unit/executor/test_environment.py ..............                   [ 19%]
tests/unit/executor/test_executor.py ...........                         [ 19%]
tests/unit/executor/test_filesets.py ............................        [ 21%]
tests/unit/executor/test_migration.py .................                  [ 22%]
tests/unit/executor/test_organize.py ..........                          [ 23%]
tests/unit/executor/test_part_handler.py ............................... [ 25%]
.............................                                            [ 26%]
tests/unit/executor/test_replace_attr.py ................                [ 27%]
tests/unit/executor/test_step_handler.py ........                        [ 28%]
tests/unit/features/test_parts.py ...                                    [ 28%]
tests/unit/features/overlay/test_executor_environment.py ..............  [ 29%]
tests/unit/features/overlay/test_executor_part_handler.py .............. [ 30%]
........................................................................ [ 34%]
.....                                                                    [ 34%]
tests/unit/features/overlay/test_feature.py .                            [ 34%]
tests/unit/features/overlay/test_lifecycle_manager.py .................. [ 35%]
..........                                                               [ 36%]
tests/unit/features/overlay/test_parts.py .............................. [ 38%]
........................................                                 [ 40%]
tests/unit/features/overlay/test_sequencer.py .......................... [ 42%]
...                                                                      [ 42%]
tests/unit/features/overlay/test_steps.py .................              [ 43%]
tests/unit/features/partitions/test_lifecycle_manager.py ............... [ 44%]
.....                                                                    [ 44%]
tests/unit/features/partitions/test_parts.py ........................... [ 46%]
.....................................                                    [ 48%]
tests/unit/overlays/test_chroot.py ......                                [ 48%]
tests/unit/overlays/test_errors.py ...                                   [ 48%]
tests/unit/overlays/test_layers.py ...........................           [ 50%]
tests/unit/overlays/test_overlay_fs.py ...................               [ 51%]
tests/unit/overlays/test_overlay_manager.py ...............              [ 52%]
tests/unit/overlays/test_overlays.py ........................            [ 53%]
tests/unit/packages/test_apt_cache.py F...........                       [ 54%]
tests/unit/packages/test_base.py .........                               [ 55%]
tests/unit/packages/test_chisel.py ....                                  [ 55%]
tests/unit/packages/test_deb.py .......................................  [ 57%]
tests/unit/packages/test_deb_package.py .....                            [ 57%]
tests/unit/packages/test_dnf.py ........................                 [ 59%]
tests/unit/packages/test_errors.py .................                     [ 60%]
tests/unit/packages/test_normalize.py .................................. [ 62%]
....                                                                     [ 62%]
tests/unit/packages/test_platform.py ................                    [ 63%]
tests/unit/packages/test_snaps.py ...................................... [ 65%]
...                                                                      [ 66%]
tests/unit/packages/test_yum.py ........................                 [ 67%]
tests/unit/plugins/test_ant_plugin.py ...........                        [ 68%]
tests/unit/plugins/test_autotools_plugin.py .......                      [ 68%]
tests/unit/plugins/test_base.py ...                                      [ 68%]
tests/unit/plugins/test_cmake_plugin.py ..........                       [ 69%]
tests/unit/plugins/test_dotnet_plugin.py .............                   [ 70%]
tests/unit/plugins/test_dump_plugin.py .....                             [ 70%]
tests/unit/plugins/test_go_plugin.py ...............                     [ 71%]
tests/unit/plugins/test_make_plugin.py .......                           [ 71%]
tests/unit/plugins/test_maven_plugin.py ...................              [ 72%]
tests/unit/plugins/test_meson_plugin.py ................                 [ 73%]
tests/unit/plugins/test_nil_plugin.py ....                               [ 74%]
tests/unit/plugins/test_npm_plugin.py .............................      [ 75%]
tests/unit/plugins/test_plugins.py ................                      [ 76%]
tests/unit/plugins/test_properties.py ...                                [ 76%]
tests/unit/plugins/test_python_plugin.py ...........                     [ 77%]
tests/unit/plugins/test_rust_plugin.py .......................           [ 78%]
tests/unit/plugins/test_scons_plugin.py ...........                      [ 79%]
tests/unit/plugins/test_validator.py .....                               [ 79%]
tests/unit/sources/test_base.py .............                            [ 80%]
tests/unit/sources/test_cache.py ....                                    [ 80%]
tests/unit/sources/test_checksum.py .............                        [ 81%]
tests/unit/sources/test_deb_source.py .                                  [ 81%]
tests/unit/sources/test_errors.py ..........                             [ 82%]
tests/unit/sources/test_file_source.py ..                                [ 82%]
tests/unit/sources/test_git_source.py .................................. [ 84%]
.......                                                                  [ 84%]
tests/unit/sources/test_local_source.py ..................               [ 85%]
tests/unit/sources/test_rpm_source.py ............                       [ 86%]
tests/unit/sources/test_snap_source.py ..........                        [ 87%]
tests/unit/sources/test_sources.py .....................                 [ 88%]
tests/unit/sources/test_tar_source.py .....                              [ 88%]
tests/unit/sources/test_zip_source.py ...                                [ 88%]
tests/unit/state_manager/test_build_state.py ........                    [ 89%]
tests/unit/state_manager/test_prime_state.py .......                     [ 89%]
tests/unit/state_manager/test_pull_state.py .......                      [ 90%]
tests/unit/state_manager/test_reports.py ...............                 [ 91%]
tests/unit/state_manager/test_stage_state.py ........                    [ 91%]
tests/unit/state_manager/test_state_manager.py ......................... [ 93%]
....                                                                     [ 93%]
tests/unit/state_manager/test_states.py .....................            [ 94%]
tests/unit/state_manager/test_step_state.py .....................        [ 95%]
tests/unit/utils/test_file_utils.py ..................                   [ 96%]
tests/unit/utils/test_formatting_utils.py .........                      [ 97%]
tests/unit/utils/test_os_utils.py ..................................     [ 99%]
tests/unit/utils/test_url_utils.py .........                             [100%]

=================================== FAILURES ===================================
____________________ TestAptStageCache.test_stage_packages _____________________

self = <tests.unit.packages.test_apt_cache.TestAptStageCache object at 0x7f37f84c47f0>
tmpdir = local('/tmp/pytest-of-runner/pytest-0/test_stage_packages0')

    def test_stage_packages(self, tmpdir):
        fetch_dir_path = Path(tmpdir, "debs")
        fetch_dir_path.mkdir(exist_ok=True, parents=True)
        stage_cache = Path(tmpdir, "cache")
        stage_cache.mkdir(exist_ok=True, parents=True)
    
        AptCache.configure_apt("test_stage_packages")
        with AptCache(stage_cache=stage_cache) as cache:
            package_names = {"pciutils"}
            filtered_names = {
                "base-files",
                "libc6",
                "libkmod2",
                "libudev1",
                "zlib1g",
                # dependencies in focal
                "dpkg",
                "libacl1",
                "libbz2-1.0",
                "libcrypt1",
                "liblzma5",
                "libpcre2-8-0",
                "libselinux1",
                "libzstd1",
                "pci.ids",
                "perl-base",
                "tar",
            }
    
            cache.mark_packages(package_names)
            cache.unmark_packages(unmark_names=filtered_names)
    
            marked_packages = cache.get_packages_marked_for_installation()
>           assert sorted([name for name, _ in marked_packages]) == [
                "libpci3",
                "pciutils",
            ]
E           AssertionError: assert ['gcc-13-base...', 'pciutils'] == ['libpci3', 'pciutils']
E             At index 0 diff: 'gcc-13-base' != 'libpci3'
E             Left contains 2 more items, first extra item: 'libpci3'
E             Full diff:
E             - ['libpci3', 'pciutils']
E             + ['gcc-13-base', 'libgcc-s1', 'libpci3', 'pciutils']

tests/unit/packages/test_apt_cache.py:72: AssertionError
=========================== short test summary info ============================
FAILED tests/unit/packages/test_apt_cache.py::TestAptStageCache::test_stage_packages - AssertionError: assert ['gcc-13-base...', 'pciutils'] == ['libpci3', 'pciutils']
  At index 0 diff: 'gcc-13-base' != 'libpci3'
  Left contains 2 more items, first extra item: 'libpci3'
  Full diff:
  - ['libpci3', 'pciutils']
  + ['gcc-13-base', 'libgcc-s1', 'libpci3', 'pciutils']
================== 1 failed, 1681 passed in 61.47s (0:01:01) ===================
make: *** [Makefile:109: test-units] Error 1
Error: Process completed with exit code 2.

Executor clean may not be 100% effective

In executor/executor.py, Executor.clean has this code:

            with contextlib.suppress(FileNotFoundError):
                shutil.rmtree(self._project_info.prime_dir)
                if initial_step <= Step.STAGE:
                    shutil.rmtree(self._project_info.stage_dir)
                if initial_step <= Step.PULL:
                    shutil.rmtree(self._project_info.parts_dir)

Three rmtree calls are wrapped in a single suppress context manager. If the first rmtree fails with FileNotFoundError, the exception will be suppressed, but the other two rmtree calls will NOT be run.

It's worth checking whether this pattern is repeated elsewhere.

Provide a way to return all project variables

Right now, it's difficult to figure out which project variables are set and/or which are supported. I had to dive into the snapcraft code to figure out what was supported.

Something like craftctl get --all might be useful as a quick way to see everything that's set. Although this is something that would mainly be used during debugging, I think it might still be useful.

ImportError: cannot import name 'YamlModel' from 'pydantic_yaml'

I believe this is because setup.py should pin pydantic-yaml==0.4.2, as requirements.txt and requirements-dev.txt already do. Otherwise we get pydantic-yaml 0.5.1, in which the import path has changed.

build run-test: commands[0] | charmcraft build
Traceback (most recent call last):
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/bin/charmcraft", line 5, in <module>
    from charmcraft.main import main
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/charmcraft/main.py", line 26, in <module>
    from charmcraft import config, __version__
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/charmcraft/config.py", line 81, in <module>
    from charmcraft.parts import validate_part
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/charmcraft/parts.py", line 26, in <module>
    from craft_parts import LifecycleManager, Step, plugins
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/__init__.py", line 25, in <module>
    from .lifecycle_manager import LifecycleManager  # noqa: F401
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/lifecycle_manager.py", line 27, in <module>
    from craft_parts import errors, executor, packages, plugins, sequencer
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/executor/__init__.py", line 19, in <module>
    from .executor import ExecutionContext, Executor  # noqa: F401
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/executor/executor.py", line 35, in <module>
    from .part_handler import PartHandler
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/executor/part_handler.py", line 34, in <module>
    from craft_parts.state_manager import MigrationState, StepState, states
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/state_manager/__init__.py", line 19, in <module>
    from .state_manager import StateManager  # noqa: F401
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/state_manager/state_manager.py", line 32, in <module>
    from .states import StepState, get_step_state_path, load_step_state
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/state_manager/states.py", line 29, in <module>
    from .build_state import BuildState
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/state_manager/build_state.py", line 23, in <module>
    from .step_state import StepState, validate_hex_string
  File "/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/craft_parts/state_manager/step_state.py", line 23, in <module>
    from pydantic_yaml import YamlModel  # type: ignore
ImportError: cannot import name 'YamlModel' from 'pydantic_yaml' (/home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/lib/python3.9/site-packages/pydantic_yaml/__init__.py)
ERROR: InvocationError for command /home/lourot/Documents/git/canonical/charm-nova-compute-nvidia-vgpu/.tox/build/bin/charmcraft build (exited with code 1)
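The suggested pin could look like the fragment below; the exact layout of craft-parts' setup.py may differ, so this is only a sketch of the dependency entry.

```python
# setup.py (fragment, hypothetical layout): pin pydantic-yaml to match
# requirements.txt, so pip cannot resolve to 0.5.x where YamlModel moved.
install_requires = [
    "pydantic-yaml==0.4.2",
    # ... other dependencies unchanged
]
```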

Refactor integration tests to split data and code

Many integration tests (in tests/integration) start out by creating a bunch of temporary files and directories for whatever is being tested (Go files, CMake, autotools, etc.). This is undesirable because it distracts from the actual thing we want to check: craft-parts' support for those technologies.
There are some alternatives for supporting "test data" in pytest. See the rejected PR #272 for an attempt using pytest-datadir. We should come to a decision and refactor/clean up the tests.

Explain how `stage` and `prime` work in the docs

My understanding of the stage and prime directives is that they let you specify:

  • stage: what files go from $CRAFT_PART_INSTALL to $CRAFT_STAGE
  • prime: what files go from $CRAFT_STAGE to $CRAFT_PRIME

and that, if omitted, the default behaviour is to copy everything to each of those locations.

It would be great to have this explained more fully in the docs. There are a few hints at it, but I don't see anything that explicitly describes the behaviour (especially the default behaviour). I found it awkward to learn how to use these when I'd see some rockcraft.yaml files using stage and prime, and others omitting them entirely.
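To illustrate this understanding, a hypothetical part (the file paths are invented for the example; per the description above, omitting either list behaves like copying everything):

```yaml
parts:
  my-part:
    plugin: dump
    source: .
    stage:        # filters $CRAFT_PART_INSTALL -> $CRAFT_STAGE
      - bin/*
      - lib/*
    prime:        # filters $CRAFT_STAGE -> $CRAFT_PRIME
      - bin/*
```

With both lists omitted, everything installed would be staged and everything staged would be primed.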

Updating repositories does not trigger necessary actions

This is a behavior seen while using Rockcraft.

Updating package-repositories entries does not remove PPA files from disk and does not re-trigger the pull step.

rockcraft.yaml

name: python-3.8
summary: Python 3.8
license: Apache-2.0
description: |
  Python 3.8
version: "3.8"

base: ubuntu:22.04
platforms:
  amd64:

package-repositories:
  - type: apt
    ppa: dqlite/dev
    priority: always
  - type: apt
    ppa: deadsnakes/ppa
    priority: always

parts:
  python:
    plugin: nil
    stage-packages:
      - python3.8

Performing pull step:

rockcraft pull --shell-after
Launching instance...                                                                                                                                                                                                                                  
Retrieved base ubuntu:22.04 for amd64                                                                                                                                                                                                                  
Extracted ubuntu:22.04                                                                                                                                                                                                                                 
Package repositories installed                                                                                                                                                                                                                         
Executed: pull pebble                                                                                                                                                                                                                                  
Executed: pull python                                                                                                                                                                                                                                  
Launching shell on build environment...                                                                                                                                                                                                                
root@rockcraft-keystone-53910271:~/project# ls -alh /etc/apt/sources.list.d/
total 4.0K
drwxr-xr-x 2 root root   4 Apr  3 13:09 .
drwxr-xr-x 8 root root   9 Apr 26  2022 ..
-rw-r--r-- 1 root root 164 Apr  3 13:09 snapcraft-ppa-deadsnakes_ppa.sources
-rw-r--r-- 1 root root 160 Apr  3 13:09 snapcraft-ppa-dqlite_dev.sources
root@rockcraft-keystone-53910271:~/project# 

Then, removing the deadsnakes/ppa PPA:

diff rockcraft.yaml/before rockcraft.yaml/after
16,18d15
<   - type: apt
<     ppa: deadsnakes/ppa
<     priority: always

This is the result:

rockcraft pull --shell-after
Launching instance...                                                                                                                                                                                                                                  
Retrieved base ubuntu:22.04 for amd64                                                                                                                                                                                                                  
Extracted ubuntu:22.04                                                                                                                                                                                                                                 
Package repositories installed                                                                                                                                                                                                                         
Executed: skip pull pebble (already ran)                                                                                                                                                                                                               
Executed: skip pull python (already ran)                                                                                                                                                                                                               
Launching shell on build environment...                                                                                                                                                                                                                
root@rockcraft-keystone-53910271:~/project# ls -alh /etc/apt/sources.list.d/
total 4.0K
drwxr-xr-x 2 root root   4 Apr  3 13:09 .
drwxr-xr-x 8 root root   9 Apr 26  2022 ..
-rw-r--r-- 1 root root 164 Apr  3 13:09 snapcraft-ppa-deadsnakes_ppa.sources
-rw-r--r-- 1 root root 160 Apr  3 13:09 snapcraft-ppa-dqlite_dev.sources

The sources file is not removed, and the pull step is not re-triggered (the pull step should have been invalidated and re-run).

Consider updating APT index in certain situations

Craft Providers will stop updating APT packages (purposefully, and will also disable automatic updates). Maybe Craft Parts should update the index manually in certain situations (e.g. before running any BUILD step).

Using `source-subdir` results in `No such file or directory`

I am using rockcraft with the python plugin (PR).

This works:

source: https://github.com/SeldonIO/seldon-core
# ...
override-build: |
  craftctl default
  install -D -m755 servers/xgboostserver/xgboostserver/XGBoostServer.py ${CRAFT_PART_INSTALL}/opt/XGBoostServer.py

But this doesn't:

source: https://github.com/SeldonIO/seldon-core
source-subdir: servers/xgboostserver
# ...
override-build: |
  craftctl default
  install -D -m755 xgboostserver/XGBoostServer.py ${CRAFT_PART_INSTALL}/opt/XGBoostServer.py
2023-04-27 05:56:12.564 :: 2023-04-27 09:56:08.894 :: + install -D -m755 xgboostserver/XGBoostServer.py /root/parts/xgboostserver/install/opt/XGBoostServer.py
2023-04-27 05:56:12.564 :: 2023-04-27 09:56:08.895 :: install: cannot stat 'xgboostserver/XGBoostServer.py': No such file or directory
2023-04-27 05:56:12.564 :: 2023-04-27 09:56:09.013 'override-build' in part 'xgboostserver' failed with code 1.

@tigarmo suggested this may be from a bad interaction between source-subdir and override-build (code).

Chisel error messages are being suppressed

If you run chisel and it fails, it usually gives you helpful error messages:

$ chisel cut --root root aisjaopsd
error: invalid slice reference: "aisjaopsd"

However, when using chisel slices in stage-packages this message appears to be suppressed, so the only outcome is something like Command '['chisel', 'cut', '--root', '/root/parts/lego/install', '--release', 'ubuntu-22.04', 'lib6_libs']' returned non-zero exit status 1.
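One way to surface the underlying message can be sketched as below (an illustration of the pattern, not craft-parts' actual code): capture the tool's stderr and include it in the raised error, instead of letting a bare CalledProcessError report only the exit status.

```python
import subprocess


def run_and_report(cmd: list[str]) -> str:
    """Run a command, raising an error that carries the tool's own
    stderr output instead of just the exit status."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode != 0:
        raise RuntimeError(
            f"{cmd[0]} failed (exit {proc.returncode}): {proc.stderr.strip()}"
        )
    return proc.stdout


# A failing command that writes a chisel-style message to stderr:
try:
    run_and_report(["sh", "-c", "echo 'error: invalid slice reference' >&2; exit 1"])
except RuntimeError as err:
    print(err)  # the underlying error message is preserved
```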

Improve logging of Apt issues in Overlays

What needs to get done

Currently we show little information when the installation of overlay-packages fails; it just says "failed to install requested packages".

Why it needs to get done

In this specific case, the lack of a priority setting on the PPA made a package uninstallable due to dependency conflicts, but I had to mount and chroot into the base and run the command myself to see the message.

Fails to install using PIP

Using

python3 -m pip install craft-parts==1.14.1

returns an error (it seems that a README.md file is missing). It breaks building the Gnome-42-SDK snap.

Version 1.14.0 installs fine.

(A screenshot capturing the error was attached to the original report.)

Updated directory missing in `charmcraft` pull step

Charmcraft logs in canonical/data-platform-workflows#32 show

2023-03-30 14:49:16.047 :: 2023-03-30 14:49:14.786 process charm:Step.PULL
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:14.786 check if Part('charm'):Step.PULL is dirty
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:14.786 check if Part('charm'):Step.PULL is outdated
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:14.786 ignore patterns: ['*.charm']
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:14.787 updated files: {'tests/unit/test_charm.py', 'metadata.yaml', 'tests/unit/test_tls_lib.py', 'tests/unit/__init__.py', '.gitignore', 'documentation/documentation_landing.md', 'src/machine_helpers.py', 'documentation/tutorial.md', 'tests/unit/test_mongodb_lib.py', 'tests/unit/test_mongodb_helpers.py', 'tests/unit/test_mongodb_backups.py', 'LICENSE', 'config.yaml', 'charmcraft.yaml', 'README.md', 'src/charm.py', '.jujuignore', 'actions.yaml', 'pyproject.toml', 'tests/unit/test_mongodb_provider.py', 'requirements.txt', 'tox.ini', 'tests/unit/helpers.py', 'CONTRIBUTING.md', 'src/grafana_dashboards/MongoDB_Cluster_Summary.json'}
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:14.787 updated directories: {'.github', 'tests/integration', 'tests/unit/data', 'src/alert_rules', 'tests/data', '.git', 'lib'}

Note that updated files contains (but should not contain) 'src/grafana_dashboards/MongoDB_Cluster_Summary.json' and updated directories does not contain (but should contain) 'src/grafana_dashboards'.

This causes this error later in the logs

2023-03-30 14:49:16.047 :: 2023-03-30 14:49:15.747 Parts processing error: Failed to copy '/root/parts/charm/src/src/grafana_dashboards/MongoDB_Cluster_Summary.json': no such file or directory.
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:15.752 Traceback (most recent call last):
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/utils/file_utils.py", line 172, in copy
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:15.752     shutil.copy2(source, destination, follow_symlinks=follow_symlinks)
2023-03-30 14:49:16.047 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/usr/lib/python3.8/shutil.py", line 435, in copy2
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     copyfile(src, dst, follow_symlinks=follow_symlinks)
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/usr/lib/python3.8/shutil.py", line 264, in copyfile
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     with open(src, 'rb') as fsrc, open(dst, 'wb') as fdst:
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752 FileNotFoundError: [Errno 2] No such file or directory: '/root/parts/charm/build/src/grafana_dashboards/MongoDB_Cluster_Summary.json'
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752 The above exception was the direct cause of the following exception:
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752 Traceback (most recent call last):
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/charmcraft/parts.py", line 397, in run
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     aex.execute([act], stdout=stream, stderr=stream)
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/executor/executor.py", line 301, in execute
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     self._executor.execute(actions, stdout=stdout, stderr=stderr)
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/executor/executor.py", line 126, in execute
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     self._run_action(act, stdout=stdout, stderr=stderr)
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/executor/executor.py", line 189, in _run_action
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     handler.run_action(action, stdout=stdout, stderr=stderr)
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/executor/part_handler.py", line 130, in run_action
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     self._update_action(
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/executor/part_handler.py", line 535, in _update_action
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     handler(step_info, stdout=stdout, stderr=stderr)
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/executor/part_handler.py", line 627, in _update_build
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     source.update()
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/sources/local_source.py", line 163, in update
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     self.copy_function(
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752   File "/snap/charmcraft/1171/lib/craft_parts/utils/file_utils.py", line 174, in copy
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752     raise errors.CopyFileNotFound(source) from err
2023-03-30 14:49:16.048 :: 2023-03-30 14:49:15.752 craft_parts.errors.CopyFileNotFound: Failed to copy '/root/parts/charm/src/src/grafana_dashboards/MongoDB_Cluster_Summary.json': no such file or directory.

Special attention to the first error in the traceback: FileNotFoundError: [Errno 2] No such file or directory: '/root/parts/charm/build/src/grafana_dashboards/MongoDB_Cluster_Summary.json'

I believe this is because the /src/grafana_dashboards/ directory does not exist in /root/parts/charm/build/. To confirm this, I ran sudo mkdir /var/snap/lxd/common/lxd/containers/charmcraft_charmcraft-mongodb-785608-0-0-amd64/rootfs/root/parts/charm/build/src/grafana_dashboards/ and ran charmcraft pack again and the build succeeded.

Source of bug

I believe this issue is caused by this block of code:

for directory in directories:
    path = os.path.join(root, directory)
    if os.lstat(path).st_mtime >= target_mtime:
        # Don't descend into this directory-- we'll just copy it
        # entirely.
        directories.remove(directory)

The directories list is modified in-place while iterating over it, which means that not all items in the list will be iterated over.

For example:

>>> directories = [1, 2, 3, 4, 5]
>>> for directory in directories:
...     print(directory)
...     directories.remove(directory)
... 
1
3
5
>>> print(directories)
[2, 4]
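The usual fix can be sketched as iterating over a snapshot of the list, so removals from the original don't skip elements (the prune helper and should_prune predicate here are made up for illustration; in the os.walk context above, mutating the original list in place is still what prevents descent into the pruned directories):

```python
def prune(directories, should_prune):
    """Remove entries matching should_prune without skipping any,
    by iterating over a copy while mutating the original list."""
    for directory in list(directories):  # list() takes a snapshot
        if should_prune(directory):
            directories.remove(directory)


directories = [1, 2, 3, 4, 5]
prune(directories, lambda d: True)
print(directories)  # → [] — all entries removed, none skipped
```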

Request to have the Maven Plugin ported to craft-parts

I would respectfully ask that the Maven plugin be ported to craft-parts so it can be used moving forward. Currently it does not work with core20, and I imagine it won't work with core22 when that is released. Today we are all forced to use core18, and I'm sure that will become unsupported in the future.

This is a request based on the forum discussion with cmatsuoka.

LINK TO FORUM POST

I have quite a few Maven applications in the store and would be happy to help test if that would be helpful.

Thank you so much.

Michael

auto-populate apt-cache if empty?

Hello

would it be possible to automatically populate (apt-get update) the apt-cache if it's empty?

I realise this may not be a major use case, but I've been running snapcraft within a Docker container in CI and it was falling over at

apt_cache.mark_packages(set(package_names))

because the apt cache was empty, failing this check:

if name_arch not in self.cache:

There is already a method for refreshing it:

def refresh_packages_list(cls) -> None:
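The requested guard could be sketched like this; AptCacheStandIn is a made-up toy class, and only the refresh_packages_list and mark_packages names echo the methods quoted above.

```python
class AptCacheStandIn:
    """Toy stand-in for the real apt cache, only to show the guard."""

    def __init__(self):
        self.packages = {}
        self.refreshed = False

    def refresh_packages_list(self):
        # In the real code this would run the equivalent of `apt-get update`.
        self.refreshed = True
        self.packages = {"sbsigntool": "0.9.4"}

    def mark_packages(self, names):
        if not self.packages:
            # Proposed behaviour: auto-populate an empty cache first.
            self.refresh_packages_list()
        missing = names - self.packages.keys()
        if missing:
            raise KeyError(f"Package not found: {sorted(missing)}")


cache = AptCacheStandIn()
cache.mark_packages({"sbsigntool"})  # no longer fails on an empty cache
```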

2022-10-07 11:11:00.413 Executing parts lifecycle...
2022-10-07 11:11:00.413 source build packages: {'git'}
2022-10-07 11:11:00.414 source build packages: {'git'}
2022-10-07 11:11:00.414 ignore patterns: ['*.snap', 'parts', 'stage', 'prime']
2022-10-07 11:11:00.414 part build packages: ['ubuntu-dev-tools', 'grub-pc-bin', 'grub-common', 'sbsigntool']
2022-10-07 11:11:00.414 plugin build packages: {'make', 'gcc'}
2022-10-07 11:11:00.415 ignore patterns: ['*.snap', 'parts', 'stage', 'prime']
2022-10-07 11:11:00.415 Requested build-packages: ['gcc', 'git', 'grub-common', 'grub-pc-bin', 'make', 'sbsigntool', 'ubuntu-dev-tools']
2022-10-07 11:11:00.448 Marking sbsigntool (and its dependencies) to be fetched
2022-10-07 11:11:00.449 Cannot find package listed in 'build-packages': sbsigntool
2022-10-07 11:11:00.450 Traceback (most recent call last):
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/packages/deb.py", line 448, in _get_packages_marked_for_installation
2022-10-07 11:11:00.450     apt_cache.mark_packages(set(package_names))
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/packages/apt_cache.py", line 302, in mark_packages
2022-10-07 11:11:00.450     raise errors.PackageNotFound(name_arch)
2022-10-07 11:11:00.450 craft_parts.packages.errors.PackageNotFound: Package not found: sbsigntool.
2022-10-07 11:11:00.450 
2022-10-07 11:11:00.450 During handling of the above exception, another exception occurred:
2022-10-07 11:11:00.450 Traceback (most recent call last):
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/snapcraft/parts/parts.py", line 165, in run
2022-10-07 11:11:00.450     with self._lcm.action_executor() as aex:
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/executor/executor.py", line 281, in __enter__
2022-10-07 11:11:00.450     self._executor.prologue()
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/executor/executor.py", line 87, in prologue
2022-10-07 11:11:00.450     self._install_build_packages()
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/executor/executor.py", line 222, in _install_build_packages
2022-10-07 11:11:00.450     packages.Repository.install_packages(sorted(build_packages))
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/packages/deb.py", line 505, in install_packages
2022-10-07 11:11:00.450     marked_packages = cls._get_packages_marked_for_installation(package_names)
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/packages/deb.py", line 261, in wrapped
2022-10-07 11:11:00.450     return method(*args, **kwargs)
2022-10-07 11:11:00.450   File "/snap/snapcraft/current/lib/python3.8/site-packages/craft_parts/packages/deb.py", line 450, in _get_packages_marked_for_installation
2022-10-07 11:11:00.450     raise errors.BuildPackageNotFound(error.package_name)
2022-10-07 11:11:00.450 craft_parts.packages.errors.BuildPackageNotFound: Cannot find package listed in 'build-packages': sbsigntool

many thanks

Adding source handler for deb packages

Even though deb packages are a supported source format for parts, there's currently no source handler for them; hence the error "Failed to pull source: unable to determine source type of ..."

Update to pydantic-yaml >=1.0.0

What needs to get done

Update dependencies to pydantic-yaml >= 1.0.0 and do the code changes that come with it.

Why it needs to get done

Version 1.0 has some API changes that rearrange things, so we'll need to do some slight refactoring.

dotnet integration test not fully working

The dotnet integration test is always succeeding on CI because the CI has dotnet installed by default. When it's not installed, it fails with:

craft_parts.errors.PluginEnvironmentValidationError: Environment validation failed for part 'foo': 'dotnet' not found and part 'foo' does not depend on a part named 'dotnet-deps' that would satisfy the dependency.

We should ensure that dotnet is not installed on the CI and that the test still succeeds without it.

Go plugin integration tests failing on my machine

Not sure why; it could be an issue with my setup, so I'm assigning it to myself.

test_go_plugin
FAILED              [ 54%]+ go mod download all
+ go install -p 1 -tags=my_tag ./...

tests/integration/plugins/test_go.py:25 (test_go_plugin)
self = <craft_parts.executor.step_handler.StepHandler object at 0x7f57e629f160>

    def _builtin_build(self) -> StepContents:
        # Plugin commands.
        plugin_build_commands = self._plugin.get_build_commands()
    
        # save script to set the build environment
        build_environment_script_path = (
            self._part.part_run_dir.absolute() / "environment.sh"
        )
        build_environment_script_path.write_text(self._env)
        build_environment_script_path.chmod(0o644)
    
        # save script to execute the build commands
        build_script_path = self._part.part_run_dir.absolute() / "build.sh"
        with build_script_path.open("w") as run_file:
            print("#!/bin/bash", file=run_file)
            print("set -euo pipefail", file=run_file)
            print(f"source {build_environment_script_path}", file=run_file)
            print("set -x", file=run_file)
    
            for build_command in plugin_build_commands:
                print(build_command, file=run_file)
    
        build_script_path.chmod(0o755)
        logger.debug("Executing %r", build_script_path)
    
        try:
>           subprocess.run(
                [str(build_script_path)],
                cwd=self._part.part_build_subdir,
                check=True,
                stdout=self._stdout,
                stderr=self._stderr,
            )

/home/lengau/Work/Code/craft-parts/craft_parts/executor/step_handler.py:143: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['/tmp/pytest-of-lengau/pytest-33/test_go_plugin0/parts/foo/run/build.sh'],)
kwargs = {'cwd': PosixPath('/tmp/pytest-of-lengau/pytest-33/test_go_plugin0/parts/foo/build'), 'stderr': None, 'stdout': None}
process = <Popen: returncode: 1 args: ['/tmp/pytest-of-lengau/pytest-33/test_go_plugin...>
stdout = None, stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                        output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['/tmp/pytest-of-lengau/pytest-33/test_go_plugin0/parts/foo/run/build.sh']' returned non-zero exit status 1.

/usr/lib/python3.10/subprocess.py:524: CalledProcessError

The above exception was the direct cause of the following exception:

new_dir = local('/tmp/pytest-of-lengau/pytest-33/test_go_plugin0')
mocker = <pytest_mock.plugin.MockerFixture object at 0x7f57e629ff70>

    def test_go_plugin(new_dir, mocker):
        parts_yaml = textwrap.dedent(
            """
            parts:
              foo:
                plugin: go
                source: .
                go-buildtags: [my_tag]
            """
        )
        parts = yaml.safe_load(parts_yaml)
    
        Path("go.mod").write_text(
            textwrap.dedent(
                """
                module example.com/hello
                go 1.13
                require rsc.io/quote v1.5.2
                """
            )
        )
    
        Path("hello.go").write_text(
            textwrap.dedent(
                """
                // +build my_tag
                package main
    
                import "fmt"
                import "rsc.io/quote"
    
                func main() {
                    fmt.Printf("%s", quote.Glass())
                }
                """
            )
        )
    
        # the go compiler is installed in the ci test setup
        lf = LifecycleManager(parts, application_name="test_go", cache_dir=new_dir)
        actions = lf.plan(Step.PRIME)
    
        with lf.action_executor() as ctx:
>           ctx.execute(actions)

/home/lengau/Work/Code/craft-parts/tests/integration/plugins/test_go.py:69: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/home/lengau/Work/Code/craft-parts/craft_parts/executor/executor.py:304: in execute
    self._executor.execute(actions, stdout=stdout, stderr=stderr)
/home/lengau/Work/Code/craft-parts/craft_parts/executor/executor.py:128: in execute
    self._run_action(act, stdout=stdout, stderr=stderr)
/home/lengau/Work/Code/craft-parts/craft_parts/executor/executor.py:193: in _run_action
    handler.run_action(action, stdout=stdout, stderr=stderr)
/home/lengau/Work/Code/craft-parts/craft_parts/executor/part_handler.py:172: in run_action
    state = handler(step_info, stdout=stdout, stderr=stderr)
/home/lengau/Work/Code/craft-parts/craft_parts/executor/part_handler.py:309: in _run_build
    self._run_step(
/home/lengau/Work/Code/craft-parts/craft_parts/executor/part_handler.py:493: in _run_step
    return step_handler.run_builtin()
/home/lengau/Work/Code/craft-parts/craft_parts/executor/step_handler.py:106: in run_builtin
    return handler()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <craft_parts.executor.step_handler.StepHandler object at 0x7f57e629f160>

    def _builtin_build(self) -> StepContents:
        # Plugin commands.
        plugin_build_commands = self._plugin.get_build_commands()
    
        # save script to set the build environment
        build_environment_script_path = (
            self._part.part_run_dir.absolute() / "environment.sh"
        )
        build_environment_script_path.write_text(self._env)
        build_environment_script_path.chmod(0o644)
    
        # save script to execute the build commands
        build_script_path = self._part.part_run_dir.absolute() / "build.sh"
        with build_script_path.open("w") as run_file:
            print("#!/bin/bash", file=run_file)
            print("set -euo pipefail", file=run_file)
            print(f"source {build_environment_script_path}", file=run_file)
            print("set -x", file=run_file)
    
            for build_command in plugin_build_commands:
                print(build_command, file=run_file)
    
        build_script_path.chmod(0o755)
        logger.debug("Executing %r", build_script_path)
    
        try:
            subprocess.run(
                [str(build_script_path)],
                cwd=self._part.part_build_subdir,
                check=True,
                stdout=self._stdout,
                stderr=self._stderr,
            )
        except subprocess.CalledProcessError as process_error:
>           raise errors.PluginBuildError(part_name=self._part.name) from process_error
E           craft_parts.errors.PluginBuildError: Failed to run the build script for part 'foo'.

/home/lengau/Work/Code/craft-parts/craft_parts/executor/step_handler.py:151: PluginBuildError
test_go_generate FAILED [ 57%]
+ go mod download all
+ go generate gen/generator.go

tests/integration/plugins/test_go.py:76 (test_go_generate)
self = <craft_parts.executor.step_handler.StepHandler object at 0x7f57e6943880>

    [... _builtin_build and subprocess.run frames identical to the test_go_plugin traceback above ...]
E               subprocess.CalledProcessError: Command '['/tmp/pytest-of-lengau/pytest-33/test_go_generate0/parts/foo/run/build.sh']' returned non-zero exit status 1.

/usr/lib/python3.10/subprocess.py:524: CalledProcessError

The above exception was the direct cause of the following exception:

new_dir = local('/tmp/pytest-of-lengau/pytest-33/test_go_generate0')

    def test_go_generate(new_dir):
        """Test code generation via "go generate" in parts using the go plugin
    
        The go code in the "test_go" dir uses "gen/generator.go" to create, at build time,
        the "main.go" file that produces the final binary.
        """
        source_location = Path(__file__).parent / "test_go"
    
        parts_yaml = textwrap.dedent(
            f"""
            parts:
              foo:
                plugin: go
                source: {source_location}
                go-generate:
                  - gen/generator.go
                build-environment:
                  - GO111MODULE: "on"
            """
        )
        parts = yaml.safe_load(parts_yaml)
        lf = LifecycleManager(
            parts, application_name="test_go", cache_dir=new_dir, work_dir=new_dir
        )
        actions = lf.plan(Step.PRIME)
    
        with lf.action_executor() as ctx:
>           ctx.execute(actions)

/home/lengau/Work/Code/craft-parts/tests/integration/plugins/test_go.py:104: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[... executor and _builtin_build frames identical to the test_go_plugin traceback above ...]
        except subprocess.CalledProcessError as process_error:
>           raise errors.PluginBuildError(part_name=self._part.name) from process_error
E           craft_parts.errors.PluginBuildError: Failed to run the build script for part 'foo'.

/home/lengau/Work/Code/craft-parts/craft_parts/executor/step_handler.py:151: PluginBuildError
