
Comments (7)

Francommit avatar Francommit commented on July 17, 2024

OK, it's the two verify statements; I think they're blowing up for some reason. Without the notify statements it partially runs:

 if $image == '' { fail('A source image is required') }

      # Detect changes to the defined podman flags and re-deploy if needed
      exec { "verify_container_flags_${handle}":
        command  => 'true',
        provider => 'shell',
        unless   => @("END"/$L),
                   if podman container exists ${container_name}
                     then
                     saved_resource_flags="\$(podman container inspect ${container_name} \
                       --format '{{.Config.Labels.puppet_resource_flags}}' | tr -d '\n')"
                     current_resource_flags="\$(echo '${flags_base64}' | tr -d '\n')"
                     test "\${saved_resource_flags}" = "\${current_resource_flags}"
                   fi
                   |END
        # notify   => Exec["podman_remove_container_${handle}"],
        require  => $requires,
        *        => $exec_defaults,
      }

      # Re-deploy when $update is true and the container image has been updated
      if $update {
        exec { "verify_container_image_${handle}":
          command  => 'true',
          provider => 'shell',
          unless   => @("END"/$L),
            if podman container exists ${container_name}
              then
              image_name=\$(podman container inspect ${container_name} --format '{{.ImageName}}')
              running_digest=\$(podman image inspect \${image_name} --format '{{.Digest}}')
              latest_digest=\$(skopeo inspect docker://\${image_name} | \
                /opt/puppetlabs/puppet/bin/ruby -rjson -e 'puts (JSON.parse(STDIN.read))["Digest"]')
              [[ $? -ne 0 ]] && latest_digest=\$(skopeo inspect --no-creds docker://\${image_name} | \
                /opt/puppetlabs/puppet/bin/ruby -rjson -e 'puts (JSON.parse(STDIN.read))["Digest"]')
              test -z "\${latest_digest}" && exit 0     # Do not update if unable to get latest digest
              test "\${running_digest}" = "\${latest_digest}"
            fi
            |END
          # notify   => [
          #   Exec["podman_remove_image_${handle}"],
          #   Exec["podman_remove_container_${handle}"],
          # ],
          require  => $requires,
          *        => $exec_defaults,
        }
      }
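For reference, the digest comparison that second "unless" performs can be sketched with canned values standing in for the podman and skopeo calls (both digest strings below are invented for illustration):

```shell
# Stand-ins for 'podman image inspect --format {{.Digest}}' and the
# 'skopeo inspect' result; both values are made up.
running_digest='sha256:1111111111111111'
latest_digest='sha256:2222222222222222'

# Mirror of the heredoc's logic: skip the update when no digest could be
# fetched, otherwise compare the running digest against the registry's.
if test -z "${latest_digest}"; then
  echo "no latest digest - skipping update check"
elif test "${running_digest}" = "${latest_digest}"; then
  echo "image up to date"
else
  echo "image changed - re-deploy"
fi
```

When the final `test` fails, the `unless` fails, so the exec runs and (via the commented-out notifies) would trigger the remove/re-create chain.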

from podman.

Francommit avatar Francommit commented on July 17, 2024

OK, removing the image works:

Exec["podman_remove_image_${handle}"],

I think it's

Exec["podman_remove_container_${handle}"],

that's blowing up.


Francommit avatar Francommit commented on July 17, 2024

Going through the debug logs I see the following:

Debug: /Stage[main]/Podman/Podman::Container[primary-solace]/Exec[verify_container_flags_primary-solace]/unless: /bin/sh: -c: line 24: syntax error: unexpected end of file

Which is from:

Debug: /Stage[main]/Podman/Podman::Container[primary-solace]/Exec[verify_container_flags_-primary-solace]/unless: /bin/sh: -c: line 24: syntax error: unexpected end of file
Debug: Exec[verify_container_flags_primary-solace](provider=shell): Executing '["/bin/sh", "-c", "true"]'
Debug: Executing: '/bin/sh -c true'

I'm just not super sure where it's coming from.
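For what it's worth, that "syntax error: unexpected end of file" message is what the shell prints when a compound statement in a `-c` script is left unterminated, e.g. an `if` whose `fi` never arrives, which would be consistent with the heredoc losing its final lines somewhere. A minimal reproduction:

```shell
# An 'if' with no closing 'fi' makes /bin/sh report a syntax error
# ("unexpected end of file" in bash, "end of file unexpected" in dash)
# for the whole -c script. '|| true' keeps the failure from aborting.
out=$(/bin/sh -c 'if true; then echo ok' 2>&1) || true
echo "$out"
```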


Francommit avatar Francommit commented on July 17, 2024

@southalc, if I use 0.24 I have no problems at all, so the bug's been introduced between then and now.


southalc avatar southalc commented on July 17, 2024

The error looks like it's failing on the "unless" execution statement from here:

unless => @("END"/$L),

What it's doing is converting the container flags from the defined puppet resource to a base64 encoded string, then checking that string against the running container's tag named "puppet_resource_flags" that was set when the container was created. I tested with a simple change of the container resource flags and observed my test container get re-deployed successfully.
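As a standalone illustration of that comparison (no podman required; the base64 string below is an arbitrary stand-in for the encoded flags):

```shell
# Arbitrary example value standing in for the catalog's base64-encoded flags.
flags_base64='LS1wdWJsaXNoIDgwODA6ODA4MAo='

# Stand-in for what 'podman container inspect --format
# {{.Config.Labels.puppet_resource_flags}}' would return for a container
# that was created from the same flags.
saved_resource_flags="$(echo "${flags_base64}" | tr -d '\n')"
current_resource_flags="$(echo "${flags_base64}" | tr -d '\n')"

# The 'unless' succeeds (so nothing is notified) when the two match.
test "${saved_resource_flags}" = "${current_resource_flags}" && echo "flags unchanged"
```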

At this point I spun up a new RHEL8 VM to test the configuration you submitted, but I am unable to reproduce the issue, as my container deploys correctly and re-deploys if I change the resource flags. The "unexpected end of file" error makes me wonder whether the heredoc is being parsed correctly. Unfortunately, even the debug output does not return the actual command executed by the "unless". What version of the Puppet agent are you using?

BTW, you should be able to clean up a container deployment with something like this from hiera:

podman::containers:
  primary-solace:
    ensure: absent


Francommit avatar Francommit commented on July 17, 2024

Hey, thanks for the reply. I've been doing other work, but I revisited this as I found I needed to run podman as a user rather than as root.
I've gotten the process a little further, but it's still falling over at a different spot now when I'm specifying a user.

So, the following logs are from a fresh Puppet run on Red Hat.

Notice: /Stage[main]/Types/Types::Type[group]/Group[solace]/ensure: created
Notice: /Stage[main]/Types/Types::Type[user]/User[solace]/ensure: created
Notice: /Stage[main]/Role::Solace_monitor/File[solace_home_directory]/ensure: created
Notice: /Stage[main]/Podman::Install/Concat[/etc/subuid]/File[/etc/subuid]/content: content changed '{md5}2075cf9f804d83c3ad908c95202455d7' to '{md5}6d789a6665985785c5e045a2ad91ed59'
Notice: /Stage[main]/Podman::Install/Concat[/etc/subgid]/File[/etc/subgid]/content: content changed '{md5}2075cf9f804d83c3ad908c95202455d7' to '{md5}6d789a6665985785c5e045a2ad91ed59'
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Podman::Rootless[solace]/Exec[loginctl_linger_solace]/returns: executed successfully
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Podman::Rootless[solace]/File[/home/solace/.config]/ensure: created
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Podman::Rootless[solace]/File[/home/solace/.config/systemd]/ensure: created
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Podman::Rootless[solace]/File[/home/solace/.config/systemd/user]/ensure: created
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[verify_container_flags_solace-st-monitor-solace]/returns: executed successfully
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[verify_container_image_solace-st-monitor-solace]/returns: executed successfully
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_remove_container_solace-st-monitor-solace]/returns: Failed to stop podman-st-monitor-solace.service: Unit podman-st-monitor-solace.service not loaded.
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_remove_container_solace-st-monitor-solace]/returns: Error: no container with name or ID st-monitor-solace found: no such container
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_remove_container_solace-st-monitor-solace]/returns: Error: failed to evict container: "": failed to find container "st-monitor-solace": no container with name or ID st-monitor-solace found: no such container
Error: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_remove_container_solace-st-monitor-solace]: Failed to call refresh: 'systemctl --user  stop podman-st-monitor-solace || podman container stop --time 60 st-monitor-solace
podman container rm --force st-monitor-solace
' returned 1 instead of one of [0]
Error: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_remove_container_solace-st-monitor-solace]: 'systemctl --user  stop podman-st-monitor-solace || podman container stop --time 60 st-monitor-solace
podman container rm --force st-monitor-solace
' returned 1 instead of one of [0]
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_remove_image_solace-st-monitor-solace]: Dependency Exec[podman_remove_container_solace-st-monitor-solace] has failures: true
Warning: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_remove_image_solace-st-monitor-solace]: Skipping because of failed dependencies
Warning: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_create_solace-st-monitor-solace]: Skipping because of failed dependencies
Warning: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_generate_service_solace-st-monitor-solace]: Skipping because of failed dependencies
Warning: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]: Skipping because of failed dependencies
Notice: Applied catalog in 289.99 seconds

My config is as follows:

I've got this in a profile, as I couldn't get the types module to create the home directory:

  file { 'solace_home_directory':
    ensure => directory,
    path   => '/home/solace',
    owner  => 'solace',
    group  => 'solace',
    mode   => '0644',
    purge  => false,
  }
  
  include types
  include podman

I assume for the types module you're using your own, which is listed on the Forge.
For my YAML, I've set up the following:

---
types::user:
  solace:
    ensure: present
    forcelocal: true
    uid:  222001
    gid:  222001
    password: 'WIK@LH$#I#$#IUH'
    home: /home/solace
    
types::group:
  solace:
    ensure: present
    forcelocal: true
    gid:  222001


podman::manage_subuid: true
podman::subid:
  '222001':
    subuid: 12300000
    count: 65535

podman::containers:
  solace-host:
    user: solace
    image: 'solace/solace-pubsub-standard'
    flags:
      publish:
        - '8080:8080'
        - '50000:50000'
      env:
       - 'username_admin_globalaccesslevel="admin"'
       - 'username_admin_password="admin"'
      shm-size:
       - '1g'
    service_flags:
      timeout: '960'
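For context, my understanding is that a flags hash like the one above ends up as repeated command-line options on the podman invocation, roughly like this (a hand-expanded sketch for illustration, not the module's actual templating; the real command the module builds may differ):

```shell
# Hand-expanded approximation of the 'flags' hash from the hiera data above.
cmd="podman run -d --name solace-host \
--publish 8080:8080 --publish 50000:50000 \
--env username_admin_globalaccesslevel=\"admin\" \
--env username_admin_password=\"admin\" \
--shm-size 1g \
solace/solace-pubsub-standard"
echo "$cmd"
```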

I cannot get it to work at all on the latest version; it doesn't even generate the files in /home/user/.config/systemd/user/.

With the previous version of the module (https://forge.puppet.com/modules/southalc/podman/changelog#release-023) I can get further: I can see it's downloading the Docker image (it's 1 GB, so it takes a while, and I can see it in the user's podman image store).

This is where I get to when I use the previous version, 0.24:

Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_create_solace-st-monitor-solace]/returns: executed successfully
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[podman_generate_service_solace-st-monitor-solace]: Triggered 'refresh' from 1 event
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: Created symlink /home/solace/.config/systemd/user/multi-user.target.wants/podman-st-monitor-solace.service → /home/solace/.config/systemd/user/podman-st-monitor-solace.service.
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: Created symlink /home/solace/.config/systemd/user/default.target.wants/podman-st-monitor-solace.service → /home/solace/.config/systemd/user/podman-st-monitor-solace.service.
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: Job for podman-st-monitor-solace.service failed because the control process exited with error code.
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: See "systemctl --user status podman-st-monitor-solace.service" and "journalctl --user -xe" for details.
Error: 'systemctl --user  enable podman-st-monitor-solace.service
systemctl --user  start podman-st-monitor-solace.service
' returned 1 instead of one of [0]
Error: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: change from 'notrun' to ['0'] failed: 'systemctl --user  enable podman-st-monitor-solace.service
systemctl --user  start podman-st-monitor-solace.service
' returned 1 instead of one of [0]
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: Job for podman-st-monitor-solace.service failed because the control process exited with error code.
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: See "systemctl --user status podman-st-monitor-solace.service" and "journalctl --user -xe" for details.
Error: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]: Failed to call refresh: 'systemctl --user  enable podman-st-monitor-solace.service
systemctl --user  start podman-st-monitor-solace.service
' returned 1 instead of one of [0]
Error: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]: 'systemctl --user  enable podman-st-monitor-solace.service
systemctl --user  start podman-st-monitor-solace.service
' returned 1 instead of one of [0]
Notice: Applied catalog in 5.59 seconds

I just don't know what's causing this:

Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: Job for podman-st-monitor-solace.service failed because the control process exited with error code.
Notice: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: See "systemctl --user status podman-st-monitor-solace.service" and "journalctl --user -xe" for details.
Error: 'systemctl --user  enable podman-st-monitor-solace.service
systemctl --user  start podman-st-monitor-solace.service
' returned 1 instead of one of [0]
Error: /Stage[main]/Podman/Podman::Container[st-monitor-solace]/Exec[service_podman_solace-st-monitor-solace]/returns: change from 'notrun' to ['0'] failed: 'systemctl --user  enable podman-st-monitor-solace.service
systemctl --user  start podman-st-monitor-solace.service
' returned 1 instead of one of [0]


Francommit avatar Francommit commented on July 17, 2024

After deploying on our actual Red Hat servers with PE running (not my hacked-together localhost version), it's all working as intended. Thanks again for the great work.

