
Packer Plugin for VMware vSphere

The Packer Plugin for VMware vSphere is a multi-component plugin that can be used with HashiCorp Packer to create virtual machine images for VMware vSphere®.

The plugin includes three builders and two post-processors; choose the components that match your image-creation strategy:

Builders

  • vsphere-iso - This builder starts from an ISO file and uses the vSphere API to build a virtual machine image on an ESXi host.

  • vsphere-clone - This builder clones a virtual machine from an existing template using the vSphere API, modifies it, and saves it as a new template.

  • vsphere-supervisor - This builder deploys and publishes a new virtual machine to a vSphere Supervisor cluster using VM Service.

Post-Processors

  • vsphere - This post-processor uploads an artifact to a vSphere endpoint. The artifact must be a VMX, OVA, or OVF file.

  • vsphere-template - This post-processor uses an artifact from the vmware-iso builder with an ESXi host or an artifact from the vSphere post-processor. It then marks the virtual machine as a template and moves it to your specified path.

Differences from the Packer Plugin for VMware

While both this plugin and the packer-plugin-vmware are designed to create virtual machine images, there are some key differences:

  • Platforms: This plugin is specifically developed to utilize the VMware vSphere API, facilitating the creation of virtual machine images by integrating with VMware vCenter Server and the VMware vSphere Hypervisor. On the other hand, packer-plugin-vmware supports a variety of platforms including VMware vSphere Hypervisor and desktop virtualization products such as VMware Fusion, VMware Workstation, and VMware Player, though it does not utilize the vSphere API for its operations.

  • Focus: This plugin is purpose-built with a focus on VMware vSphere, offering capabilities such as creating virtual machine images, cloning and modifying base virtual machine images, and exporting artifacts in specified locations and formats. In contrast, packer-plugin-vmware includes builders that operate on both VMware vSphere Hypervisor and the aforementioned desktop virtualization products, providing a different set of functionalities, including support for Vagrant.

Please refer to the documentation for each plugin to understand the specific capabilities and configuration options.

Requirements

Installation

Using Pre-built Releases

Automatic Installation

Packer v1.7.0 and later supports the packer init command which enables the automatic installation of Packer plugins. For more information, see the Packer documentation.

To install this plugin, copy and paste this code (HCL2) into your Packer configuration and run packer init.

packer {
  required_version = ">= 1.7.0"
  required_plugins {
    vsphere = {
      version = ">= 1.3.0"
      source  = "github.com/hashicorp/vsphere"
    }
  }
}

Manual Installation

You can download pre-built binary releases of the plugin on GitHub. Once you have downloaded the latest release archive for your target operating system and architecture, extract the release archive to retrieve the plugin binary file for your platform.

To install the downloaded plugin, please follow the Packer documentation on installing a plugin.

Using the Source

If you prefer to build the plugin from source, clone the GitHub repository locally and run the command go build from the repository root directory. Upon successful compilation, a packer-plugin-vsphere plugin binary file can be found in the root directory.

To install the compiled plugin, please follow the Packer documentation on installing a plugin.

Configuration

For more information on how to configure the plugin, please see the plugin documentation.

Contributing

  • If you think you've found a bug in the code or you have a question regarding the usage of this software, please reach out to us by opening an issue in this GitHub repository.

  • Contributions to this project are welcome: if you want to add a feature or a fix a bug, please do so by opening a pull request in this GitHub repository. In case of feature contribution, we kindly ask you to open an issue to discuss it beforehand.

Contributors

azr, cbednarski, chrismarget, dependabot[bot], dilyar85, hashicorp-copywrite[bot], hi-angel, jamespgriffith, jescalan, jhawk28, lbajolet-hashicorp, lizatretyakova, markpeek, mheidenr, mitchellh, mkuzmin, mmckeen, mwhooker, nywilken, remijouannet, rickard-von-essen, saikirandusari, seanmalloy, stephen-fox, swampdragons, sylviamoss, tenthirtyam, tjm, vladrassokhin, xosmig


packer-plugin-vsphere's Issues

Add support to reattach CD-ROM device

This issue was originally opened by @erikgraa as hashicorp/packer#9117. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Feature Description

Including more than one entry in either iso_urls or iso_paths for a build seems to permanently add an extra CD-ROM drive to the virtual machine.


Use Case(s)

It would be nice if it were possible to revert to having only a single CD-ROM drive just before a build finishes. One rarely wants more than one CD-ROM drive outside of a Packer build.

Post-processor failed: can't find base mac address in OVF

Overview of the Issue

Hello, I'm trying to build a Vagrant box with the vsphere-iso builder. I build the VM and export it locally so that I can use the Vagrant post-processor, registering the exported OVF file as an artifact. The moment the Vagrant post-processor runs, it errors with the following log: Post-processor failed: can't find base mac address in OVF. Although I use the "options": ["mac"] key in my Packer template (see below), it doesn't seem to work.

One side note: I'm building a Vagrant box (for the VirtualBox provider) from vSphere. I also install the VirtualBox Guest Additions into it, so I assume it should be possible to use this box with VirtualBox, or not?

Reproduction Steps

You can try the code yourself with this template; make sure to fill in the variables.

Packer version

From 1.6.0

Simplified Packer Buildfile

Gist

Operating system and Environment details

  • CentOS7 Packer build
  • Running on VMware vSphere

Log Fragments and crash.log files

==> vsphere-iso: Eject CD-ROM drives...
    vsphere-iso: Starting export...
    vsphere-iso: Downloading: centos7_devbox_feature-abi-fdi-devbox-disk-0.vmdk
    vsphere-iso: Exporting file: centos7_devbox_feature-abi-fdi-devbox-disk-0.vmdk
    vsphere-iso: Writing ovf...
    vsphere-iso: Creating manifest...
    vsphere-iso: Finished exporting...
==> vsphere-iso: Running post-processor: artifice
==> vsphere-iso (artifice): Discarding files from artifact: output_vsphere/centos7_devbox_feature-abi-fdi-devbox-disk-0.vmdk, output_vsphere/centos7_devbox_feature-abi-fdi-devbox.mf, output_vsphere/centos7_devbox_feature-abi-fdi-devbox.ovf
==> vsphere-iso (artifice): Using these artifact files: ./output_vsphere/centos7_devbox_feature-abi-fdi-devbox.ovf
==> vsphere-iso: Running post-processor: vagrant
==> vsphere-iso (vagrant): Creating a dummy Vagrant box to ensure the host system can create one correctly
==> vsphere-iso (vagrant): Creating Vagrant box for 'virtualbox' provider
    vsphere-iso (vagrant): Copying from artifact: ./output_vsphere/centos7_devbox_feature-abi-fdi-devbox.ovf
    vsphere-iso (vagrant): Renaming the OVF to box.ovf...
Build 'vsphere-iso' errored: 1 error(s) occurred:
* Post-processor failed: can't find base mac address in OVF
==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: 1 error(s) occurred:
* Post-processor failed: can't find base mac address in OVF
==> Builds finished but no artifacts were created.
ERROR: Job failed: command terminated with exit code 1

Use `vmware/govmomi` types to avoid violating API specifications

This issue was originally opened by @Borkason as hashicorp/packer#10271. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


The vsphere builder uses a custom HardwareConfig type (among others) instead of the govmomi-provided VirtualMachineConfigSpec.

The custom HardwareConfig type conflicts with the govmomi-provided VirtualMachineConfigSpec. For example, VirtualMachineConfigSpec.NestedHV is nil when not explicitly set. HardwareConfig.NestedHV, however, defaults to false when unset, and that false overwrites any configuration_parameters that would otherwise enable the setting, such as vhv.enable (the corresponding key in the VMX file).
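The distinction the report hinges on is pointer vs. value semantics in Go: govmomi's VirtualMachineConfigSpec uses *bool for NestedHV, so "unset" (nil) is distinct from an explicit false, while a plain bool cannot express "unset" at all. A minimal sketch, using hypothetical hardwareConfig/configSpec types and apply functions (not the plugin's actual code):

```go
package main

import "fmt"

// hardwareConfig mimics the plugin's custom type: a value field whose
// zero value (false) is indistinguishable from "not configured".
type hardwareConfig struct {
	NestedHV bool
}

// configSpec mimics the govmomi style: nil means "leave the existing
// setting untouched", a non-nil pointer means "set explicitly".
type configSpec struct {
	NestedHV *bool
}

// applyValue always sends the field, so an unset false clobbers
// whatever vhv.enable (or similar) configured elsewhere.
func applyValue(existing bool, cfg hardwareConfig) bool {
	return cfg.NestedHV
}

// applyPointer only overrides the existing setting when the field
// was explicitly set.
func applyPointer(existing bool, spec configSpec) bool {
	if spec.NestedHV == nil {
		return existing
	}
	return *spec.NestedHV
}

func main() {
	// The VM already has nested HV enabled (e.g. via vhv.enable = "TRUE").
	fmt.Println(applyValue(true, hardwareConfig{}))  // false: setting clobbered
	fmt.Println(applyPointer(true, configSpec{}))    // true: setting preserved
}
```

This is why adopting the govmomi types (or at least pointer fields) avoids the overwrite described above.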

`http_ip` vs `http_bind_address`

This issue was originally opened by @karma0 as hashicorp/packer#10636. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Documentation needs clarification

The differences between http_ip and http_bind_address aren't clear in the documentation, and clarification is needed around the default bind address of 0.0.0.0, given that Packer actually binds only to all IPv6 addresses by default.

Reproduction Steps

I am running Arch Linux and have disabled IPV6 to the best of my ability, but was running into an issue wherein packer was binding the preseed http server to all IPv6 addresses on a random port and no IPv4 addresses. With this, HTTPIP was resolving to the host's IPv4 external interface and it was causing a failure to connect.

When I searched for similar issues, issue 6689 matched my problem almost exactly.

To resolve this, initially, I was trying to set the bind address using http_ip as described in the documentation, but somehow overlooked http_bind_address.

This part appears to be a bug: I used the bind address 0.0.0.0 (which appears to be the default) and packer would still bind to all IPv6 addresses, like so:

tcp6       0      0 :::8783                 :::*                    LISTEN      42229/packer 

The only workaround was to set http_bind_address in the vsphere-iso configuration to the exact IP address that HTTPIP resolves to.

Packer version

This same thing happened while using vsphere-iso on packer versions 1.6.3, 1.6.5, and 1.7.0.

Operating system and Environment details

  • Laptop running Arch Linux
  • IPv6 disabled using NetworkManager and sysctl
  • VSphere 7 cluster

Add support for vSphere DRS affinity/anti-affinity groups

This issue was originally opened by @JamesPGriffith as hashicorp/packer#10404. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Description

Add the ability to add the VM to an affinity rule (VM/Host Rule).

Use Case(s)

We generate VMs in an isolated network behind a gateway. By design, these groups of VMs (vApps) do not communicate across nodes in the cluster. DRS is running to distribute the vApps, but the entire vApp must be on the same ESX node. In order for the in-flight template to communicate externally it must be on the same ESX node as the gateway which requires it be added to the affinity rule.

Potential References

Using VM-Host Affinity Rules
govmomi cluster.rule.create

`vsphere-iso`: build fails when the `resource_pool` is a vApp

This issue was originally opened by @netapp-jgriffit as hashicorp/packer#10365. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

Attempting to create a VM in a vApp fails.

Reproduction Steps

Packer version

$ packer version
Packer v1.6.5
$

Simplified Packer Buildfile

Packer Buildfile
{
  "variables": {
    "vsphere_server": "",
    "vsphere_user": "",
    "vsphere_password": "",
    "vsphere_dc_name": "",
    "vsphere_cluster": "",
    "vsphere_host": "",
    "vsphere_datastore": "",
    "vsphere_resource_pool": "Standalone OS",
    "local_password": "",
    "cpu_num": "4",
    "mem_size": "8192",
    "disk_size": "32768",
    "vmtools_iso_path":"",
    "os_iso_path": "[iso] WINDOWS/en_windows_server_2019_updated_aug_2020_x64_dvd_f4bab427.iso",
    "vsphere_template_name": "CP00027-W2019DCE",
    "vsphere_portgroup_name": "CP00027-1",
    "vm_notes": "{\n  'Timestamp': '{{isotime \"2006-01-02T03:04:05\"}}',\n  'Packed from': '{{user `os_iso_path`}}'\n}"
  },
  
  "sensitive-variables": ["vsphere_password", "local_password"],
  
  "builders": [
    {
      "type": "vsphere-iso",
      
      "vcenter_server":      "{{user `vsphere_server`}}",
      "username":            "{{user `vsphere_user`}}",
      "password":            "{{user `vsphere_password`}}",
      "insecure_connection": "true",
      
      "datacenter": "{{user `vsphere_dc_name`}}",
      "cluster":     "{{user `vsphere_cluster`}}",
      "resource_pool": "{{user `vsphere_resource_pool`}}",
      "datastore": "{{user `vsphere_datastore`}}",
      
      "convert_to_template": false,
      
      "vm_name": "{{user `vsphere_template_name`}}",
      
      "CPUs": "{{user `cpu_num`}}",
      "RAM": "{{user `mem_size`}}",
      "RAM_hot_plug": true,
      "NestedHV": false,
      "firmware": "bios",
      
      "guest_os_type": "windows9Server64Guest",
      "disk_controller_type":  "lsilogic-sas",
      "storage": [
        {
          "disk_size": "{{user `disk_size`}}",
          "disk_thin_provisioned": true
        }
      ],
      "network_adapters": [
        {
          "network_card": "vmxnet3",
          "network": "{{user `vsphere_portgroup_name`}}"
        }
      ],
      
      "floppy_files": [
        "./01-packer/windows/scripts/",
        "./01-packer/windows/2019/SERVERDATACENTER/"
      ],
      "floppy_dirs": [
        "./01-packer/windows/drivers"
      ],
      
      "iso_paths": [
        "{{user `os_iso_path`}}",
        "{{user `vmtools_iso_path`}}"
      ],
      
      "notes": "{{user `vm_notes`}}",
      
      "communicator": "none",
      "shutdown_command": "shutdown",
      "shutdown_timeout": "15m"
    }
  ]
}

Operating system and Environment details

Ubuntu 18.04.5 LTS x86_64
vSphere Client version 6.7.0.44000

Log Fragments and crash.log files

When building to a vApp with *NO* spaces
$ PACKER_LOG=1 time packer build -var-file=./01-packer/private.json -var-file=./01-packer/dev.json ./01-packer/windows/CP00027-W2019DCE.json
2020/12/09 16:18:13 [INFO] Packer version: 1.6.5 [go1.15.3 linux amd64]
2020/12/09 16:18:13 Checking 'PACKER_CONFIG' for a config file path
2020/12/09 16:18:13 'PACKER_CONFIG' not set; checking the default config file path
2020/12/09 16:18:13 Attempting to open config file: /home/user/.packerconfig
2020/12/09 16:18:13 [WARN] Config file doesn't exist: /home/user/.packerconfig
2020/12/09 16:18:13 Setting cache directory: /home/user/projects/ContentDev-Templates/packer_cache
[[REDACTED]]
2020/12/09 16:18:13 Creating plugin client for path: /usr/bin/packer
2020/12/09 16:18:13 Starting plugin: /usr/bin/packer []string{"/usr/bin/packer", "plugin", "packer-builder-vsphere-iso"}
2020/12/09 16:18:13 Waiting for RPC address for: /usr/bin/packer
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: [INFO] Packer version: 1.6.5 [go1.15.3 linux amd64]
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: Checking 'PACKER_CONFIG' for a config file path
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: 'PACKER_CONFIG' not set; checking the default config file path
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: Attempting to open config file: /home/user/.packerconfig
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: [WARN] Config file doesn't exist: /home/user/.packerconfig
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: Setting cache directory: /home/user/projects/ContentDev-Templates/packer_cache
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: args: []string{"packer-builder-vsphere-iso"}
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: Plugin address: unix /tmp/packer-plugin969226034
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: Waiting for connection...
2020/12/09 16:18:13 Received unix RPC address for /usr/bin/packer: addr is /tmp/packer-plugin969226034
2020/12/09 16:18:13 packer-builder-vsphere-iso plugin: Serving a plugin connection...
2020/12/09 16:18:13 Preparing build: vsphere-iso

The parameter `shutdown_command` is ignored as it requires a `communicator`.

2020/12/09 16:18:14 Build debug mode: false
2020/12/09 16:18:14 Force build: false
2020/12/09 16:18:14 On error: 
2020/12/09 16:18:14 Waiting on builds to complete...
2020/12/09 16:18:14 Starting build run: vsphere-iso
2020/12/09 16:18:14 Running builder: vsphere-iso
2020/12/09 16:18:14 [INFO] (telemetry) Starting builder vsphere-iso
Warning: Warning when preparing build: "vsphere-iso"

The parameter `shutdown_command` is ignored as it requires a `communicator`.


vsphere-iso: output will be in this color.

2020/12/09 16:18:14 packer-builder-vsphere-iso plugin: No URLs were provided to Step Download. Continuing...
2020/12/09 16:18:14 packer-builder-vsphere-iso plugin: No CD files specified. CD disk will not be made.
==> vsphere-iso: Creating VM...
2020/12/09 16:18:17 [INFO] (telemetry) ending vsphere-iso
==> Wait completed after 3 seconds 871 milliseconds
2020/12/09 16:18:17 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2020/12/09 16:18:17 machine readable: vsphere-iso,error []string{"error creating vm: A specified parameter was not correct: pool"}
==> Builds finished but no artifacts were created.
Build 'vsphere-iso' errored after 3 seconds 871 milliseconds: error creating vm: A specified parameter was not correct: pool
2020/12/09 16:18:17 [INFO] (telemetry) Finalizing.
2020/12/09 16:18:17 Cancelling builder after context cancellation context canceled

==> Wait completed after 3 seconds 871 milliseconds

==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: error creating vm: A specified parameter was not correct: pool

==> Builds finished but no artifacts were created.
2020/12/09 16:18:18 waiting for all plugin processes to complete...
2020/12/09 16:18:18 /usr/bin/packer: plugin process exited
Command exited with non-zero status 1
0.26user 0.12system 0:04.40elapsed 8%CPU (0avgtext+0avgdata 70796maxresident)k
1944inputs+16outputs (9major+17079minor)pagefaults 0swaps
$

However, the vApp exists and this comment indicates that it should work:
https://github.com/hashicorp/packer/blob/58a0bdd780cc18720f7964daae905256b8f00078/builder/vsphere/driver/resource_pool.go#L39


When building to a vApp with spaces
$ PACKER_LOG=1 time packer build -var-file=./01-packer/private.json -var-file=./01-packer/dev.json ./01-packer/windows/CP00027-W2019DCE.json
2020/12/09 16:30:18 [INFO] Packer version: 1.6.5 [go1.15.3 linux amd64]
2020/12/09 16:30:18 Checking 'PACKER_CONFIG' for a config file path
2020/12/09 16:30:18 'PACKER_CONFIG' not set; checking the default config file path
2020/12/09 16:30:18 Attempting to open config file: /home/user/.packerconfig
2020/12/09 16:30:18 [WARN] Config file doesn't exist: /home/user/.packerconfig
2020/12/09 16:30:18 Setting cache directory: /home/user/projects/ContentDev-Templates/packer_cache
[[REDACTED]]
2020/12/09 16:30:18 Creating plugin client for path: /usr/bin/packer
2020/12/09 16:30:18 Starting plugin: /usr/bin/packer []string{"/usr/bin/packer", "plugin", "packer-builder-vsphere-iso"}
2020/12/09 16:30:18 Waiting for RPC address for: /usr/bin/packer
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: [INFO] Packer version: 1.6.5 [go1.15.3 linux amd64]
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: Checking 'PACKER_CONFIG' for a config file path
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: 'PACKER_CONFIG' not set; checking the default config file path
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: Attempting to open config file: /home/user/.packerconfig
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: [WARN] Config file doesn't exist: /home/user/.packerconfig
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: Setting cache directory: /home/user/projects/ContentDev-Templates/packer_cache
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: args: []string{"packer-builder-vsphere-iso"}
2020/12/09 16:30:18 Received unix RPC address for /usr/bin/packer: addr is /tmp/packer-plugin881024669
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: Plugin address: unix /tmp/packer-plugin881024669
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: Waiting for connection...
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: Serving a plugin connection...
2020/12/09 16:30:18 Preparing build: vsphere-iso

Warning: Warning when preparing build: "vsphere-iso"

The parameter `shutdown_command` is ignored as it requires a `communicator`.


vsphere-iso: output will be in this color.

The parameter `shutdown_command` is ignored as it requires a `communicator`.

2020/12/09 16:30:18 Build debug mode: false
2020/12/09 16:30:18 Force build: false
2020/12/09 16:30:18 On error: 
2020/12/09 16:30:18 Waiting on builds to complete...
2020/12/09 16:30:18 Starting build run: vsphere-iso
2020/12/09 16:30:18 Running builder: vsphere-iso
2020/12/09 16:30:18 [INFO] (telemetry) Starting builder vsphere-iso
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: No URLs were provided to Step Download. Continuing...
2020/12/09 16:30:18 packer-builder-vsphere-iso plugin: No CD files specified. CD disk will not be made.
==> vsphere-iso: Creating VM...
2020/12/09 16:30:22 [INFO] (telemetry) ending vsphere-iso
==> Wait completed after 3 seconds 894 milliseconds
2020/12/09 16:30:22 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2020/12/09 16:30:22 machine readable: vsphere-iso,error []string{"error creating vm: A specified parameter was not correct: pool"}
==> Builds finished but no artifacts were created.
Build 'vsphere-iso' errored after 3 seconds 894 milliseconds: error creating vm: A specified parameter was not correct: pool

==> Wait completed after 3 seconds 894 milliseconds

==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: error creating vm: A specified parameter was not correct: pool

==> Builds finished but no artifacts were created.
2020/12/09 16:30:22 [INFO] (telemetry) Finalizing.
2020/12/09 16:30:22 waiting for all plugin processes to complete...
2020/12/09 16:30:22 /usr/bin/packer: plugin process exited
Command exited with non-zero status 1
0.24user 0.14system 0:04.40elapsed 8%CPU (0avgtext+0avgdata 71416maxresident)k
4160inputs+16outputs (20major+17852minor)pagefaults 0swaps
$

This seems to indicate that the vApp is found, but the error then indicates that a folder is defined. As you can see from the build file, folder is not defined.

I pulled the vCenter vpxd*.log to see what was inbound.

vpxd*.log snippet
2020-12-09T21:30:21.969Z info vpxd[04336] [Originator@6876 sub=vpxLro opID=2e8672f2] [VpxLRO] -- BEGIN task-48942 -- group-v22 -- vim.Folder.createVm -- 52171202-a47c-7fd3-0fc1-8c2e2952d5e4(52a81328-a3e3-9e33-ae5c-bb4d058bd38d)
2020-12-09T21:30:21.969Z warning vpxd[04336] [Originator@6876 sub=VmProv opID=2e8672f2] Unable to lookup ds for disk 
2020-12-09T21:30:21.970Z info vpxd[04336] [Originator@6876 sub=vpxLro opID=2e8672f2-01] [VpxLRO] -- BEGIN lro-90342054 --  -- VmprovWorkflow -- 
2020-12-09T21:30:21.970Z info vpxd[04336] [Originator@6876 sub=pbm opID=2e8672f2-01] No datastore for disk -202 in config, filePath ds:///vmfs/volumes/f9492891-680f9214/
2020-12-09T21:30:21.981Z error vpxd[04336] [Originator@6876 sub=VAppUtil opID=2e8672f2-01] [VpxdVAppUtil] The folder argument must not be set on a VM in a vApp
2020-12-09T21:30:21.981Z error vpxd[04336] [Originator@6876 sub=VmProv opID=2e8672f2-01] Get exception while executing action vpx.vmprov.InvokePrechecks: N5Vmomi5Fault15InvalidArgument9ExceptionE(Fault cause: vmodl.fault.InvalidArgument
--> )
2020-12-09T21:30:21.984Z info vpxd[04336] [Originator@6876 sub=VmProv opID=2e8672f2-01] Workflow context:
--> (vpx.vmprov.CreateContext) {
-->    cbData = (vmodl.KeyAnyValue) [
-->       (vmodl.KeyAnyValue) {
-->          key = "workflow.startTime", 
-->          value = 18235084146319
-->       }, 
-->       (vmodl.KeyAnyValue) {
-->          key = "pbmPreCheckSkipped", 
-->          value = false
-->       }, 
-->       (vmodl.KeyAnyValue) {
-->          key = "defaultProfileData", 
-->          value = (vmodl.KeyAnyValue) [
-->             (vmodl.KeyAnyValue) {
-->                key = "0x7fa9c816b4a0", 
-->                value = true
-->             }
-->          ]
-->       }
-->    ], 
-->    prevOutput = <unset>, 
-->    spec = (vim.vm.ConfigSpec) {
-->       changeVersion = <unset>, 
-->       name = "CP00027-W2019DCE", 
-->       version = <unset>, 
-->       createDate = "2020-12-09T21:30:21.969251Z", 
-->       uuid = <unset>, 
-->       instanceUuid = <unset>, 
-->       npivNodeWorldWideName = <unset>, 
-->       npivPortWorldWideName = <unset>, 
-->       npivWorldWideNameType = <unset>, 
-->       npivDesiredNodeWwns = <unset>, 
-->       npivDesiredPortWwns = <unset>, 
-->       npivTemporaryDisabled = <unset>, 
-->       npivOnNonRdmDisks = <unset>, 
-->       npivWorldWideNameOp = <unset>, 
-->       locationId = <unset>, 
-->       guestId = "windows9Server64Guest", 
-->       alternateGuestName = <unset>, 
-->       annotation = "{
-->   'Timestamp': '2020-12-09T09:30:18',
-->   'Packed from': '[iso] WINDOWS/en_windows_server_2019_updated_aug_2020_x64_dvd_f4bab427.iso'
--> }", 
-->       files = (vim.vm.FileInfo) {
-->          vmPathName = "ds:///vmfs/volumes/f9492891-680f9214/", 
-->          snapshotDirectory = <unset>, 
-->          suspendDirectory = <unset>, 
-->          logDirectory = <unset>, 
-->          ftMetadataDirectory = <unset>
-->       }, 
-->       tools = (vim.vm.ToolsConfigInfo) null, 
-->       flags = (vim.vm.FlagInfo) null, 
-->       consolePreferences = (vim.vm.ConsolePreferences) null, 
-->       powerOpInfo = (vim.vm.DefaultPowerOpInfo) null, 
-->       numCPUs = <unset>, 
-->       numCoresPerSocket = <unset>, 
-->       memoryMB = <unset>, 
-->       memoryHotAddEnabled = <unset>, 
-->       cpuHotAddEnabled = <unset>, 
-->       cpuHotRemoveEnabled = <unset>, 
-->       virtualICH7MPresent = <unset>, 
-->       virtualSMCPresent = <unset>, 
-->       deviceChange = (vim.vm.device.VirtualDeviceSpec) [
-->          (vim.vm.device.VirtualDeviceSpec) {
-->             operation = "add", 
-->             fileOperation = <unset>, 
-->             device = (vim.vm.device.VirtualLsiLogicSASController) {
-->                key = -201, 
-->                deviceInfo = (vim.Description) null, 
-->                backing = (vim.vm.device.VirtualDevice.BackingInfo) null, 
-->                connectable = (vim.vm.device.VirtualDevice.ConnectInfo) null, 
-->                slotInfo = (vim.vm.device.VirtualDevice.BusSlotInfo) null, 
-->                controllerKey = <unset>, 
-->                unitNumber = <unset>, 
-->                busNumber = 0, 
-->                device = <unset>, 
-->                hotAddRemove = <unset>, 
-->                sharedBus = "noSharing", 
-->                scsiCtlrUnitNumber = 7
-->             }, 
-->             profile = <unset>, 
-->             backing = (vim.vm.device.VirtualDeviceSpec.BackingSpec) null
-->          }, 
-->          (vim.vm.device.VirtualDeviceSpec) {
-->             operation = "add", 
-->             fileOperation = "create", 
-->             device = (vim.vm.device.VirtualDisk) {
-->                key = -202, 
-->                deviceInfo = (vim.Description) null, 
-->                backing = (vim.vm.device.VirtualDisk.FlatVer2BackingInfo) {
-->                   fileName = "ds:///vmfs/volumes/f9492891-680f9214/", 
-->                   datastore = <unset>, 
-->                   backingObjectId = <unset>, 
-->                   diskMode = "persistent", 
-->                   split = <unset>, 
-->                   writeThrough = <unset>, 
-->                   thinProvisioned = true, 
-->                   eagerlyScrub = false, 
-->                   uuid = <unset>, 
-->                   contentId = <unset>, 
-->                   changeId = <unset>, 
-->                   parent = (vim.vm.device.VirtualDisk.FlatVer2BackingInfo) null, 
-->                   deltaDiskFormat = <unset>, 
-->                   digestEnabled = <unset>, 
-->                   deltaGrainSize = <unset>, 
-->                   deltaDiskFormatVariant = <unset>, 
-->                   sharing = <unset>, 
-->                   keyId = (vim.encryption.CryptoKeyId) null
-->                }, 
-->                connectable = (vim.vm.device.VirtualDevice.ConnectInfo) null, 
-->                slotInfo = (vim.vm.device.VirtualDevice.BusSlotInfo) null, 
-->                controllerKey = -201, 
-->                unitNumber = 0, 
-->                capacityInKB = 33554432, 
-->                capacityInBytes = <unset>, 
-->                shares = (vim.SharesInfo) null, 
-->                storageIOAllocation = (vim.StorageResourceManager.IOAllocationInfo) null, 
-->                diskObjectId = <unset>, 
-->                vFlashCacheConfigInfo = (vim.vm.device.VirtualDisk.VFlashCacheConfigInfo) null, 
-->                iofilter = <unset>, 
-->                vDiskId = (vim.vslm.ID) null, 
-->                virtualDiskFormat = <unset>, 
-->                nativeUnmanagedLinkedClone = <unset>
-->             }, 
-->             profile = (vim.vm.ProfileSpec) [
-->                (vim.vm.EmptyProfileSpec) {
-->                }
-->             ], 
-->             backing = (vim.vm.device.VirtualDeviceSpec.BackingSpec) null
-->          }, 
-->          (vim.vm.device.VirtualDeviceSpec) {
-->             operation = "add", 
-->             fileOperation = <unset>, 
-->             device = (vim.vm.device.VirtualVmxnet3) {
-->                dynamicProperty = <unset>, 
-->                key = -1, 
-->                deviceInfo = (vim.Description) null, 
-->                backing = (vim.vm.device.VirtualEthernetCard.DistributedVirtualPortBackingInfo) {
-->                   port = (vim.dvs.PortConnection) {
-->                      switchUuid = "50 3c 68 92 37 54 1f 69-d5 71 9b 3c 6a b3 eb b6", 
-->                      portgroupKey = "dvportgroup-5171", 
-->                      portKey = <unset>, 
-->                      connectionCookie = <unset>
-->                   }
-->                }, 
-->                connectable = (vim.vm.device.VirtualDevice.ConnectInfo) null, 
-->                slotInfo = (vim.vm.device.VirtualDevice.BusSlotInfo) null, 
-->                controllerKey = <unset>, 
-->                unitNumber = <unset>, 
-->                addressType = <unset>, 
-->                macAddress = <unset>, 
-->                wakeOnLanEnabled = <unset>, 
-->                resourceAllocation = (vim.vm.device.VirtualEthernetCard.ResourceAllocation) null, 
-->                externalId = <unset>, 
-->                uptCompatibilityEnabled = <unset>
-->             }, 
-->             profile = <unset>, 
-->             backing = (vim.vm.device.VirtualDeviceSpec.BackingSpec) null
-->          }
-->       ], 
-->       cpuAllocation = (vim.ResourceAllocationInfo) null, 
-->       memoryAllocation = (vim.ResourceAllocationInfo) null, 
-->       latencySensitivity = (vim.LatencySensitivity) null, 
-->       cpuAffinity = (vim.vm.AffinityInfo) null, 
-->       memoryAffinity = (vim.vm.AffinityInfo) null, 
-->       networkShaper = (vim.vm.NetworkShaperInfo) null, 
-->       cpuFeatureMask = <unset>, 
-->       extraConfig = <unset>, 
-->       swapPlacement = <unset>, 
-->       bootOptions = (vim.vm.BootOptions) null, 
-->       vAppConfig = (vim.vApp.VmConfigSpec) null, 
-->       ftInfo = (vim.vm.FaultToleranceConfigInfo) null, 
-->       repConfig = (vim.vm.ReplicationConfigSpec) null, 
-->       vAppConfigRemoved = <unset>, 
-->       vAssertsEnabled = <unset>, 
-->       changeTrackingEnabled = <unset>, 
-->       firmware = <unset>, 
-->       maxMksConnections = <unset>, 
-->       guestAutoLockEnabled = <unset>, 
-->       managedBy = (vim.ext.ManagedByInfo) null, 
-->       memoryReservationLockedToMax = <unset>, 
-->       nestedHVEnabled = <unset>, 
-->       vPMCEnabled = <unset>, 
-->       scheduledHardwareUpgradeInfo = (vim.vm.ScheduledHardwareUpgradeInfo) null, 
-->       vmProfile = (vim.vm.ProfileSpec) [
-->          (vim.vm.EmptyProfileSpec) {
-->          }
-->       ], 
-->       messageBusTunnelEnabled = <unset>, 
-->       crypto = (vim.encryption.CryptoSpec) null, 
-->       migrateEncryption = <unset>
-->    }, 
-->    dstLocation = (vpx.vmprov.VmLocation) {
-->       service = (vpx.vmprov.ServiceEndpointState) {
-->          instanceName = "devcd1-vcs-01.lod.netapp.com", 
-->          url = "https://devcd1-vcs-01.lod.netapp.com:443/sdk", 
-->          about = (vim.AboutInfo) {
-->             name = "VMware VirtualCenter", 
-->             fullName = "VMware VirtualCenter Server", 
-->             vendor = "", 
-->             version = "6.7.0", 
-->             build = "16046713", 
-->             localeVersion = <unset>, 
-->             localeBuild = <unset>, 
-->             osType = "linux-x64", 
-->             productLineId = "vpx", 
-->             apiType = "VirtualCenter", 
-->             apiVersion = "6.7.3", 
-->             instanceUuid = "ed25bc67-bf6b-422f-9f9b-e9fba1fa9f0a", 
-->             licenseProductName = "VMware VirtualCenter Server", 
-->             licenseProductVersion = "6.0"
-->          }
-->       }, 
-->       datacenter = 'vim.Datacenter:ed25bc67-bf6b-422f-9f9b-e9fba1fa9f0a:datacenter-21', 
-->       folder = 'vim.Folder:ed25bc67-bf6b-422f-9f9b-e9fba1fa9f0a:group-v22', 
-->       computeResource = 'vim.ClusterComputeResource:ed25bc67-bf6b-422f-9f9b-e9fba1fa9f0a:domain-c26', 
-->       pool = 'vim.ResourcePool:resgroup-5353', 
-->       host = (vpx.vmprov.HostState) {
-->          host = 'vim.HostSystem:ed25bc67-bf6b-422f-9f9b-e9fba1fa9f0a:host-389', 
-->          product = (vim.AboutInfo) {
-->             name = "VMware ESXi", 
-->             fullName = "VMware ESXi 6.7.0 build-14320388", 
-->             vendor = "VMware, Inc.", 
-->             version = "6.7.0", 
-->             build = "14320388", 
-->             localeVersion = "INTL", 
-->             localeBuild = "000", 
-->             osType = "vmnix-x86", 
-->             productLineId = "embeddedEsx", 
-->             apiType = "HostAgent", 
-->             apiVersion = "6.7.3", 
-->             instanceUuid = <unset>, 
-->             licenseProductName = "VMware ESX Server", 
-->             licenseProductVersion = "6.0"
-->          }, 
[[TRUNCATED]]

Pulled the two lines that indicate that a folder is still getting defined in the inbound call:

2020-12-09T21:30:21.981Z error vpxd[04336] [Originator@6876 sub=VAppUtil opID=2e8672f2-01] [VpxdVAppUtil] The folder argument must not be set on a VM in a vApp
-->       folder = 'vim.Folder:ed25bc67-bf6b-422f-9f9b-e9fba1fa9f0a:group-v22', 
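Given the log above, a minimal sketch of the workaround is to leave `folder` unset whenever the destination resource pool belongs to a vApp; vCenter rejects the call otherwise. All values below are illustrative, not taken from the original report:

```hcl
source "vsphere-clone" "into-vapp" {
  vcenter_server = "devcd1-vcs-01.lod.netapp.com"
  template       = "base-template"   # illustrative value
  vm_name        = "cloned-vm"       # illustrative value

  # Deploying into a resource pool that is part of a vApp:
  resource_pool  = "my-vapp"         # illustrative value

  # Do NOT set `folder` here -- vCenter rejects the call with
  # "The folder argument must not be set on a VM in a vApp".
}
```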

EFI firmware switching to BIOS when destination is vSphere content library

This issue was originally opened by @cmbits as hashicorp/packer#9898. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

VM boot firmware issue when using Packer 1.6.2 and specifying the destination to be an OVF in a vSphere content library. The buildfile specifies firmware type efi. The VM is created using EFI, but something happens when it is cloned to the content library as an OVF: any new VMs deployed from that content library template switch to BIOS firmware and can't boot into the Windows OS.

Reproduction Steps

  • Using Packer 1.6.2, specifying EFI firmware, and setting the destination as an OVF in a vSphere content library, I get VMs deployed from the content library using BIOS firmware instead of EFI.

  • If I take the same buildfile without the content library destination and Packer 1.6.1, I get a VM using EFI firmware. I can then manually clone to the library as an OVF, and that content library template can be deployed multiple times without the boot firmware switching to BIOS.

  • I've made sure to specify cdrom_type: sata and convert_to_template: false.
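In HCL form, the failing combination reported above looks roughly like this; the library name and other values are illustrative, not from the original buildfile:

```hcl
source "vsphere-iso" "windows-efi" {
  firmware            = "efi"    # set on the build VM, reportedly lost on deploy
  cdrom_type          = "sata"
  convert_to_template = false

  # Exporting to a content library as an OVF is the step after which
  # deployed VMs reportedly fall back to BIOS firmware:
  content_library_destination {
    library = "template-library"   # illustrative value
    ovf     = true
  }
}
```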

Packer version

Packer 1.6.2

Simplified Packer Buildfile

https://gist.github.com/cmbits/ae9ba8ef31032bc85597a6e5a3485d23

Operating system and Environment details

Windows 2016/2019 Core/GUI
vSphere 7
vSAN storage

`vsphere-iso`: Add support for SATA `disk_controller_type`

This issue was originally opened by @telefax as hashicorp/packer#10274. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Community Guidelines

  • Vote on this issue by adding a 👍 reaction to the original issue's initial description to help the maintainers prioritize.
  • Do not leave "+1" or other comments that do not add relevant information or questions.
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

While trying to build aarch64 VMs on my RPi 4 ESXi cluster, I was struggling to boot my VMs after install.
It turns out that CentOS 8 only supports "sata" as the controller type, but this is not supported by Packer.

Packer only supports lsilogic, pvscsi, nvme, and scsi.

The workaround was to run Packer in debug mode, change the SCSI controller to SATA, and then continue.

Similar issue, but closed: hashicorp/packer#9732

I'm not sure if this problem only occurs when building EFI VMs. I'm unable to select anything other than EFI anyway (it is the only option in vCenter).

At first it didn't even boot the install ISO, but it worked when I set "cdrom_type": "sata". The default IDE didn't work at all.
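A sketch of the configuration this request implies; note that `disk_controller_type = ["sata"]` is the requested option and is only valid on plugin versions that have added SATA support:

```hcl
source "vsphere-iso" "centos8-aarch64" {
  firmware   = "efi"     # EFI is the only option on this hardware
  cdrom_type = "sata"    # the default IDE CD-ROM reportedly fails to boot the ISO

  # Requested feature: SATA as a disk controller type, alongside the
  # existing lsilogic, pvscsi, nvme, and scsi values.
  disk_controller_type = ["sata"]
}
```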

vsphere-iso: dial tcp [ip_address]: connect: connection refused

This issue was originally opened by @raduad as hashicorp/packer#10222. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Getting connection refused during a vsphere-iso build.

Using

packer 1.6.5 / 1.6.3
provisioner used: ansible
Red Hat Enterprise Linux Server release 7.4 (Maipo)
Linux 3.10.0-862.11.6.el7.x86_64 hashicorp/packer#1 SMP Fri Aug 10 16:55:11 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux

We see this during the provisioner phase of the build. It happens with both RHEL and SLES builds, but it happens sporadically.

When the problem happens, we can log in to the in-progress VM using SSH without any issues. There is no firewall setting or other condition blocking the connection.

`vsphere-iso/clone`: error creating vm: Invalid configuration for device '2'

This issue was originally opened by @sburgess123 as hashicorp/packer#10669. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

I'm trying to fix the MAC address/IP on a Windows VM, which will ultimately allow me to install some required additional software before converting to a template. I have tried setting the MAC address using both the vsphere-iso and vsphere-clone builders and get similar errors.

vsphere-iso error:
"error creating vm: Invalid configuration for device '2'"

vsphere-clone error:
"error creating vm: Invalid configuration for device '0'"

I am able to set the MAC address on VMs generally through the vSphere client with no issue using the same credentials.

Reproduction Steps

Steps to reproduce this issue

Packer version 1.6.6

Simplified Packer Buildfile

"guest_os_type": "windows9Server64Guest",
"insecure_connection": "true",
"iso_paths": [
  "{{user os_iso_path}}",
  "{{user vmtools_iso_path}}/windows.iso"
],
"remove_cdrom": true,
"network_adapters": [
  {
    "network": "{{user vsphere-network}}",
    "network_card": "vmxnet3",
    "mac_address": "00-50-56-12-34-56"
  }
],
"notes": "Built via Packer",
"password": "{{user vsphere-password}}",
"disk_controller_type": "lsilogic-sas",
"storage": [
  {
    "disk_size": "{{user vm-disk-size}}",
    "disk_thin_provisioned": true
  }
],
"type": "vsphere-iso",
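One detail worth checking in the buildfile above: the vSphere API expects MAC addresses in colon-separated form, so the dash-separated value may itself be what triggers the invalid-device error. A hedged correction, shown here in HCL with illustrative values:

```hcl
network_adapters {
  network      = "VM Network"          # illustrative value
  network_card = "vmxnet3"
  mac_address  = "00:50:56:12:34:56"   # colon-separated, not dash-separated
}
```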

Operating system and Environment details

Windows 2016 server, VCenter Server 7

Log Fragments and crash.log files

2021/02/20 10:44:21 [INFO] Packer version: 1.6.6 [go1.15.6 windows amd64]
2021/02/20 10:44:21 Checking 'PACKER_CONFIG' for a config file path
2021/02/20 10:44:21 'PACKER_CONFIG' not set; checking the default config file path
2021/02/20 10:44:21 Attempting to open config file: C:\Users\burgess\AppData\Roaming\packer.config
2021/02/20 10:44:21 [WARN] Config file doesn't exist: C:\Users\burgess\AppData\Roaming\packer.config
2021/02/20 10:44:21 Setting cache directory: C:\git\windows-base-templates\provision\packer\packer_cache
cannot determine if process is in background: Process background check error: not implemented yet
2021/02/20 10:44:21 Creating plugin client for path: C:\Apps\packer\packer_1.6.6\packer.exe
2021/02/20 10:44:21 Starting plugin: C:\Apps\packer\packer_1.6.6\packer.exe []string{"C:\Apps\packer\packer_1.6.6\packer.exe", "plugin", "packer-builder-vsphere-iso"}
2021/02/20 10:44:21 Waiting for RPC address for: C:\Apps\packer\packer_1.6.6\packer.exe
2021/02/20 10:44:21 packer.exe plugin: [INFO] Packer version: 1.6.6 [go1.15.6 windows amd64]
2021/02/20 10:44:21 packer.exe plugin: Checking 'PACKER_CONFIG' for a config file path
2021/02/20 10:44:21 packer.exe plugin: 'PACKER_CONFIG' not set; checking the default config file path
2021/02/20 10:44:21 packer.exe plugin: Attempting to open config file: C:\Users\burgess\AppData\Roaming\packer.config
2021/02/20 10:44:21 packer.exe plugin: [WARN] Config file doesn't exist: C:\Users\burgess\AppData\Roaming\packer.config
2021/02/20 10:44:21 packer.exe plugin: Setting cache directory: C:\git\windows-base-templates\provision\packer\packer_cache
2021/02/20 10:44:21 packer.exe plugin: args: []string{"packer-builder-vsphere-iso"}
2021/02/20 10:44:21 packer.exe plugin: Plugin port range: [10000,25000]
2021/02/20 10:44:21 packer.exe plugin: Plugin address: tcp 127.0.0.1:10000
2021/02/20 10:44:21 packer.exe plugin: Waiting for connection...
2021/02/20 10:44:21 Received tcp RPC address for C:\Apps\packer\packer_1.6.6\packer.exe: addr is 127.0.0.1:10000
2021/02/20 10:44:21 packer.exe plugin: Serving a plugin connection...
2021/02/20 10:44:21 Creating plugin client for path: C:\Apps\packer\packer_1.6.6\packer.exe
2021/02/20 10:44:21 Starting plugin: C:\Apps\packer\packer_1.6.6\packer.exe []string{"C:\Apps\packer\packer_1.6.6\packer.exe", "plugin", "packer-provisioner-powershell"}
2021/02/20 10:44:21 Waiting for RPC address for: C:\Apps\packer\packer_1.6.6\packer.exe
2021/02/20 10:44:22 packer.exe plugin: [INFO] Packer version: 1.6.6 [go1.15.6 windows amd64]
2021/02/20 10:44:22 packer.exe plugin: Checking 'PACKER_CONFIG' for a config file path
2021/02/20 10:44:22 packer.exe plugin: 'PACKER_CONFIG' not set; checking the default config file path
2021/02/20 10:44:22 packer.exe plugin: Attempting to open config file: C:\Users\burgess\AppData\Roaming\packer.config
2021/02/20 10:44:22 packer.exe plugin: [WARN] Config file doesn't exist: C:\Users\burgess\AppData\Roaming\packer.config
2021/02/20 10:44:22 packer.exe plugin: Setting cache directory: C:\git\windows-base-templates\provision\packer\packer_cache
2021/02/20 10:44:22 packer.exe plugin: args: []string{"packer-provisioner-powershell"}
2021/02/20 10:44:22 packer.exe plugin: Plugin port range: [10000,25000]
2021/02/20 10:44:22 packer.exe plugin: Plugin address: tcp 127.0.0.1:10000
2021/02/20 10:44:22 packer.exe plugin: Waiting for connection...
2021/02/20 10:44:22 Received tcp RPC address for C:\Apps\packer\packer_1.6.6\packer.exe: addr is 127.0.0.1:10000
2021/02/20 10:44:22 packer.exe plugin: Serving a plugin connection...
2021/02/20 10:44:22 Preparing build: vsphere-iso
vsphere-iso: output will be in this color.
2021/02/20 10:44:22 Build debug mode: false

2021/02/20 10:44:22 Force build: true
2021/02/20 10:44:22 On error:
2021/02/20 10:44:22 Waiting on builds to complete...
2021/02/20 10:44:22 Starting build run: vsphere-iso
2021/02/20 10:44:22 Running builder: vsphere-iso
2021/02/20 10:44:22 [INFO] (telemetry) Starting builder vsphere-iso
2021/02/20 10:44:22 packer.exe plugin: No URLs were provided to Step Download. Continuing...
2021/02/20 10:44:22 packer.exe plugin: No CD files specified. CD disk will not be made.
==> vsphere-iso: Creating VM...
2021/02/20 10:44:24 [INFO] (telemetry) ending vsphere-iso
Build 'vsphere-iso' errored after 2 seconds 385 milliseconds: error creating vm: Invalid configuration for device '2'.
==> Wait completed after 2 seconds 386 milliseconds
2021/02/20 10:44:24 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2021/02/20 10:44:24 machine readable: vsphere-iso,error []string{"error creating vm: Invalid configuration for device '2'."}
==> Builds finished but no artifacts were created.
2021/02/20 10:44:24 [INFO] (telemetry) Finalizing.

==> Wait completed after 2 seconds 386 milliseconds

==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: error creating vm: Invalid configuration for device '2'.

==> Builds finished but no artifacts were created.
2021/02/20 10:44:25 waiting for all plugin processes to complete...
2021/02/20 10:44:25 C:\Apps\packer\packer_1.6.6\packer.exe: plugin process exited

Build 'vsphere-iso' errored: error creating vm: host '' not found

This issue was originally opened by @mashiutz as hashicorp/packer#9623. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Hello there.
Following up on the previous issue that was automatically closed (hashicorp/packer#8817):
It is impossible for me to provision a template using Packer 1.6 on vSphere 6.7.
I'm getting the following error:
Build 'vsphere-iso' errored: error creating vm: host '' not found

This is the template I'm using:

{
"builders": [
{
"type": "vsphere-iso",

    "vcenter_server":       "vsphere.domain.net",
    "insecure_connection":  "true",
    "username":             "[email protected]",
    "password":             "adminpass",
    "datacenter":           "Datacenter-IT",
    "Cluster":              "IT-256GB",
    "resource_pool":        "IT-256GB-DS",
   
    "communicator":         "winrm",
    "winrm_username":       "Superman",
    "winrm_password":       "A123456a",
    "vm_name":              "TEMPLATE-TERM",
    "folder":               "WinTemplate-IT",
    "convert_to_template":  "true",
    "cpus":                 "2",
    "ram":                  "4096",
    "disk_controller_type": "lsilogic-sas",
      "guest_os_type":        "windows9Server64Guest",
      "iso_paths": [
      "[VMFS_ISO_G400PTK] Windows-ISO/SW_DVD9_Win_Server_STD_CORE_2019_1809.1_64Bit_English_DC_STD_MLF_X22-02970.ISO",
      "[VMFS_ISO_G400PTK] VM Tools 10.1.0/VMWare-Tools-10.1.0-core-4449150/vmtools/VMTools 10.1.0-windows.iso"
    ],

    "floppy_files": [
      "floppy/autounattend.xml",
      "floppy/setup.ps1",
      "floppy/vmtools.cmd"
    ],
    
    "network_adapters" : [
      {
      "network_card": "vmxnet3"
      }
    ],

    "storage": [
      {
        "disk_size": "71680",
        "disk_thin_provisioned": false
      }
    ]
  }
]

}

According to @sylviamoss, who helped me with this issue, the problem has to do with the missing network config: if it is missing, Packer will look for the host config, and if both are missing, Packer will throw that error.
She compiled some binaries with an error-handling mechanism that should report more details when this error occurs, but those details never come up (maybe the error in my case is not the network config issue, and I am hitting some other problem).
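If that diagnosis is right, explicitly naming either a host or a network for each adapter should avoid the empty-host lookup. A minimal HCL sketch; the hostname and network name are illustrative, not from the template above:

```hcl
source "vsphere-iso" "win2019" {
  cluster = "IT-256GB"
  host    = "esxi-01.domain.net"   # illustrative: naming a host directly

  # ...or give each adapter an explicit network so Packer does not
  # fall back to looking up a host:
  network_adapters {
    network      = "VM Network"    # illustrative value
    network_card = "vmxnet3"
  }
}
```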

this is the log output for those binaries (the same output comes up for the main release as well):

2020/06/25 00:33:34 [WARN] Config file doesn't exist: C:\Users\user\AppData\Roaming\packer.config
2020/06/25 00:33:34 Setting cache directory: packer_cache
cannot determine if process is in background: Process background check error: not implemented yet
2020/06/25 00:33:34 Creating plugin client for path: packer.exe
2020/06/25 00:33:34 Starting plugin: packer.exe []string{"packer.exe", "plugin", "packer-builder-vsphere-iso"}
2020/06/25 00:33:34 Waiting for RPC address for: packer.exe
2020/06/25 00:33:34 packer.exe plugin: [INFO] Packer version: 1.6.0-dev (caf0d09) [go1.13.12 windows amd64]
2020/06/25 00:33:34 packer.exe plugin: Checking 'PACKER_CONFIG' for a config file path
2020/06/25 00:33:34 packer.exe plugin: 'PACKER_CONFIG' not set; checking the default config file path
2020/06/25 00:33:34 packer.exe plugin: Attempting to open config file: C:\Users\user\AppData\Roaming\packer.config
2020/06/25 00:33:34 packer.exe plugin: [WARN] Config file doesn't exist: C:\Users\user\AppData\Roaming\packer.config
2020/06/25 00:33:34 packer.exe plugin: Setting cache directory: packer_cache
2020/06/25 00:33:34 packer.exe plugin: args: []string{"packer-builder-vsphere-iso"}
2020/06/25 00:33:34 packer.exe plugin: Plugin port range: [10000,25000]
2020/06/25 00:33:34 packer.exe plugin: Plugin address: tcp 127.0.0.1:10000
2020/06/25 00:33:34 packer.exe plugin: Waiting for connection...
2020/06/25 00:33:34 Received tcp RPC address for packer.exe: addr is 127.0.0.1:10000
2020/06/25 00:33:34 packer.exe plugin: Serving a plugin connection...
2020/06/25 00:33:34 Preparing build: vsphere-iso
2020/06/25 00:33:34 ui: vsphere-iso: output will be in this color.
2020/06/25 00:33:34 ui:
2020/06/25 00:33:34 Build debug mode: false
2020/06/25 00:33:34 Force build: false
2020/06/25 00:33:34 On error:
2020/06/25 00:33:34 Waiting on builds to complete...
2020/06/25 00:33:34 Starting build run: vsphere-iso
2020/06/25 00:33:34 Running builder: vsphere-iso
2020/06/25 00:33:34 [INFO] (telemetry) Starting builder vsphere-iso
2020/06/25 00:33:35 ui: ==> vsphere-iso: Creating VM...
2020/06/25 00:33:35 [INFO] (telemetry) ending vsphere-iso
2020/06/25 00:33:35 ui error: Build 'vsphere-iso' errored: error creating vm: host '' not found
2020/06/25 00:33:35 machine readable: error-count []string{"1"}
2020/06/25 00:33:35 ui error:
==> Some builds didn't complete successfully and had errors:
2020/06/25 00:33:35 machine readable: vsphere-iso,error []string{"error creating vm: host '' not found"}
2020/06/25 00:33:35 ui error: --> vsphere-iso: error creating vm: host '' not found
2020/06/25 00:33:35 ui:
==> Builds finished but no artifacts were created.
2020/06/25 00:33:35 [INFO] (telemetry) Finalizing.
2020/06/25 00:33:36 waiting for all plugin processes to complete...
2020/06/25 00:33:36 packer.exe: plugin process exited

I really want to start using Packer to automatically build templates for our environment; I hope someone can help me with this.
Thanks in advance :)

vsphere-iso unable to set vhv.enable configuration_parameter

This issue was originally opened by @corrigat as hashicorp/packer#10215. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

Using HCL against vSphere 6.7.0 build 16046713 and ESXi 6.7.0 build 16075168 as an administrative user, I'm unable to set vhv.enable = "TRUE" in the vmx config using configuration_parameters, while other configuration options do work.

This is super weird, and I'm struggling to determine any difference, other than the vhv.enable setting, between the vmx files when the setting is applied manually. My thought was to check whether there were other dependent settings and to validate the name of the setting, but from what I found, this should be correct.

Here is the vmx after StepConfigParams
https://gist.github.com/corrigat/649f41f64f567c11d2714366d813f5df

Here is the vmx after StepRun
https://gist.github.com/corrigat/c2613ac51bd2aa6c1e1da0e56eebba53

I also have a quiet vsphere 7.0 setup at home where this occurs and can provide logs if it would be any help. I checked hostd.log but the reconfigure task section shows nothing added/changed/deleted: https://gist.github.com/corrigat/8a3ee0099999059ae43dc2cc42e09638

Also, I've found that before vmx_data was moved to configuration_parameters, this seems to have worked: hashicorp/packer#8998

Reproduction Steps

With a basic template (I'm using CentOS), include configuration_parameters

configuration_parameters = {
      "featMask.vm.hv.capable" = "Min:1"
      "vhv.enable" = "TRUE"
}

and build the machine.

  • featMask.vm.hv.capable in this case will be present in the vmx configuration file, however, vhv.enable will not be present during or after the build.
  • When featMask.vm.hv.capable is removed, leaving only vhv.enable, the vhv.enable setting will still be absent during and after the build.

Packer version

1.6.5

Simplified Packer Buildfile

source "vsphere-iso" "centos7-base" {
    # vCenter settings
    vcenter_server           = var.vcenter_server
    username                 = var.vcenter_username
    password                 = var.vcenter_password
    insecure_connection      = true
    cluster                  = var.vcenter_cluster
    datacenter               = var.vcenter_datacenter
    #host                     = var.vcenter_host
    datastore                = var.vcenter_datastore
    convert_to_template      = false
    folder                   = var.vcenter_folder
    configuration_parameters = {
      "featMask.vm.hv.capable" = "Min:1"
      "vhv.enable" = "TRUE"
    }

    # VM Settings
    ip_wait_timeout          = "45m"
    ssh_username             = var.connection_username
    ssh_password             = var.connection_password
    ssh_timeout              = "12h"
    ssh_port                 = "22"
    ssh_handshake_attempts   = "20"
    shutdown_timeout         = "15m"
    vm_version               = var.vm_hardware_version
    iso_paths                = [var.os_iso_path]
    iso_checksum             = var.iso_checksum
    vm_name                  = var.vm_name
    guest_os_type            = var.guest_os_type
    disk_controller_type     = ["pvscsi"]
    floppy_files             = ["boot_config/ks.cfg"]
    network_adapters {
      network                = var.vm_network
      network_card           = var.nic_type
    }
    storage {
      disk_size              = var.root_disk_size
      disk_thin_provisioned  = true
    }
    CPUs                     = var.num_cpu
    cpu_cores                = var.num_cores
    RAM                      = var.vm_ram
    boot_wait                = "5s"
    boot_command             = var.boot_command
  }

Operating system and Environment details

packer executed from MacOS 10.15.7

Log Fragments and crash.log files

Logs do not appear to emit messages regarding the success or failure of the configuration_parameters step; there are no other errors in the output.

vSphere-iso Builder issue with Packer 1.7.0

This issue was originally opened by @yiwryos as hashicorp/packer#10834. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


When attempting to build a VM template using the vsphere-iso builder and Packer 1.7.0 with an account using RBAC, based on the documentation here (https://www.packer.io/docs/builders/vmware/vsphere-iso), I'm getting the following error:

Build 'vsphere-iso' errored after 3 seconds 462 milliseconds: 404 Not Found

==> Wait completed after 3 seconds 462 milliseconds

==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: 404 Not Found

==> Builds finished but no artifacts were created.

I know the code is good because I can build the template via Packer when I log in to ESX as admin, so my code is valid.
Using the same non-privileged account I can create the VM and template using the vSphere UI, so the custom roles for the account are valid.
The documentation under Required vSphere Permissions states the following permission is needed; however, that is not a valid privilege under ESX 6.7:

Datacenter (this object):
Datastore -> Low level file operations

There is no Low level file operations privilege under Datacenter; it is a valid privilege on the Datastore object.

`vsphere-clone`: Add support to clone a VM from a template in a content library

This issue was originally opened by @graham1228 as hashicorp/packer#10318. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Please search the existing issues for relevant feature requests, and use the
reaction feature
(https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/)
to add upvotes to pre-existing requests.

Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

As a user of the vsphere-clone builder I would like the ability to create a vm from an existing vm template in a content library.

Use Case(s)

We use the content library to store our base OS OVF images and need to be able to utilize these templates to layer on solution specific software leveraging the vsphere-clone builder.

Potential configuration

Utilizing the same configuration item, additionally search all content libraries for the template name:

template (string) - Name of source VM. Path is optional.
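A purely hypothetical sketch of the requested behavior, reusing the existing `template` option; the `<library>/<item>` path syntax is an assumption, not an implemented feature:

```hcl
source "vsphere-clone" "from-library" {
  # Hypothetical: resolve the name against content libraries as well
  # as the inventory; "<library>/<item>" syntax is assumed here.
  template = "base-os-library/ubuntu-20.04-base"
  vm_name  = "app-server"   # illustrative value
}
```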

Potential References

PACKER_HTTP_ADDR variable not being set correctly

This issue was originally opened by @DarrenF-G as hashicorp/packer#9973. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Issue;
While trying to use the http_directory directive, Packer binds to the correct local interface/address and can be accessed by both host and guest as expected; however, the $env:PACKER_HTTP_ADDR variable does not get set to the same bind address. It will either get set to an APIPA address or pick up an address from a random interface.

Packer: 1.6.2 (tried 1.6.0 with the same result)
Builder: vsphere-iso
Host: Windows 10 2004 (19041.508)
vCenter: 6.7.0.10000
esxi: 6.7.0, 15018017

Interfaces present - https://gist.github.com/DarrenF-G/8b5c6d010b066bbd79d8460410adb071
Packer log for windows 10 build - https://gist.github.com/DarrenF-G/959bfd8fb7dcb0192d5092d609a4dbba
Packer log for Ubuntu build - https://gist.github.com/DarrenF-G/0b931613860185e43662db2dcaec8f5c
Example json - https://gist.github.com/DarrenF-G/17753bd304235539877c1c423d9cbaad

If I step through the Windows post-deployment and run
cat C:\Windows\Temp\packer-ps-env-vars-5f6341e7-d0d7-b303-d15a-531e95ab11a3.ps1
I get:
$env:PACKER_BUILDER_TYPE="vsphere-iso"; $env:PACKER_BUILD_NAME="vsphere-iso"; $env:PACKER_HTTP_ADDR="172.22.32.1:8067"; $env:PACKER_HTTP_IP="172.22.32.1"; $env:PACKER_HTTP_PORT="8067";

As no HTTP server is binding on 172.22.32.1, I am unable to pull down files using this variable.
I tried to replicate this on another Windows host with only one interface and got the same result; it sets the address to a 169.* address instead of my local IP.
I also tried an Ubuntu image, which uses http.ip and http.port, but it also sets the variables to a 169.* address when it should set them to a 10.* address (see the above logs).
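A common mitigation on multi-homed hosts is to avoid interface autodetection entirely by hardcoding a known host IP in place of {{ .HTTPIP }}. In the sketch below, `http_bind_address` is only available if your Packer/plugin version supports it, so treat it as an assumption; the IP and boot_command are illustrative:

```hcl
source "vsphere-iso" "ubuntu" {
  http_directory = "http"

  # If supported by your version, pin the listener to a known address:
  http_bind_address = "10.0.0.5"   # illustrative host IP

  # Otherwise, hardcode the known host IP instead of {{ .HTTPIP }}:
  boot_command = [
    "<esc><wait>",
    "linux ks=http://10.0.0.5:{{ .HTTPPort }}/ks.cfg<enter>"
  ]
}
```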

Let me know if you need any further information

Add support for `cleanup_remote_cache_max_age` for cache

This issue was originally opened by @jpbuecken as hashicorp/packer#10842. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.



Description

This issue should be a follow up feature request to hashicorp/packer#9843

After cleanup_remote_cache has been implemented as described in hashicorp/packer#9843 (which would be sufficient in the first step),
is it possible to do the cleanup based on a

cleanup_remote_cache_max_age

option?

e.g.
cleanup_remote_cache=true
cleanup_remote_cache_max_age = 5 : Delete all files older than 5 days after successful build (not only the one just touched)

Maybe special cases should be allowed:
cleanup_remote_cache_max_age = -1 : Delete file after successful build (behaviour as described in hashicorp/packer#9843). Default value.
cleanup_remote_cache_max_age = 0 : Delete all files after successful build (not only the one just touched)

Maybe a warning should be added, or additional logic is needed, to avoid the following:
Warning: if you use cleanup_remote_cache_max_age >= 0, you should not run parallel builds, since one Packer process may delete files in use by other running Packer processes.
Possible workaround: use subfolders in packer_cache based on the vm_name value; you cannot build the same vm_name in parallel anyway.

Use Case(s)

Keep the uploaded ISO for some time so builds can reuse it, but still clean up once newer ISO files for the build have been uploaded.

Potential configuration

   "builders": [
      {
         "type": "vsphere-iso",
         "cleanup_remote_cache": true,
         "cleanup_remote_cache_max_age": 5,
         [...]

Potential References

hashicorp/packer#9843

Vsphere-iso tag-assignment

This issue was originally opened by @ctorkington-craven as hashicorp/packer#9685. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Could you add the ability to apply a vCenter tag to a VM?
Most of the vSphere hardening standards can be applied through configuration_parameters and force_bios_setup; the ability to add a tag would allow me to know that the VM has already been hardened.
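The builder does not document a tagging option, so the following is only a hypothetical shape for the request; `vcenter_tags` and its values are invented for illustration:

```hcl
source "vsphere-iso" "hardened-base" {
  # Hypothetical option -- not implemented at the time of this request.
  # Attaching a pre-existing vCenter tag would mark the VM as hardened.
  vcenter_tags = {
    "SecurityBaseline" = "hardened"
  }
}
```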

Make vSphere vim.vm.FlagInfo API available to configuration file.

This issue was originally opened by @Borkason as hashicorp/packer#10239. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.



Description

Please make the vSphere vim.vm.FlagInfo API available to the configuration file:
https://code.vmware.com/apis/358/vsphere/doc/vim.vm.FlagInfo.html

The vim.vm.ConfigSpec API is already available through configuration_parameters:
https://code.vmware.com/apis/358/vsphere/doc/vim.vm.ConfigSpec.html

Use Case(s)

Allows us to build VMs with Virtualization-Based Security (vbsEnabled) and disk UUIDs (diskUuidEnabled) enabled, and allows use of the API in general.

Potential configuration

{
    "builders": [
        {
            "vim_vm_FlagInfo": {
                "diskUuidEnabled": true,
                "vbsEnabled": true,
                "vvtdEnabled": true
            }
        }
    ]
}
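A plugin implementing this would presumably validate the block against the known FlagInfo fields before folding it into the ConfigSpec it sends to vSphere. A minimal sketch, assuming only the three fields from the example above (the real vim.vm.FlagInfo has many more optional properties):

```python
# Hypothetical allow-list: field names taken from the example configuration.
KNOWN_FLAGS = {"diskUuidEnabled", "vbsEnabled", "vvtdEnabled"}

def merge_flag_info(config_spec, flag_info):
    """Validate a vim_vm_FlagInfo block from the template and fold it
    into a ConfigSpec-like dict under the "flags" key."""
    unknown = set(flag_info) - KNOWN_FLAGS
    if unknown:
        raise ValueError(f"unknown FlagInfo fields: {sorted(unknown)}")
    config_spec = dict(config_spec)  # do not mutate the caller's spec
    config_spec["flags"] = {k: bool(v) for k, v in flag_info.items()}
    return config_spec
```

In the Go plugin the equivalent would populate the FlagInfo struct on the govmomi ConfigSpec rather than a dict, but the validation step is the same idea.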

`vsphere-iso`: Add support for exporting to OVA

This issue was originally opened by @nilic as hashicorp/packer#9645. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Currently, the vsphere-iso builder can export the created template only in OVF format, while vmware-iso also supports OVA. OVA is often preferred for template portability (a single file, versus multiple files for OVF) and is sometimes required, since some automation tooling (such as the Terraform provider for vCloud Director) accepts only OVA as input for catalog uploads.
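Since an OVA is simply a tar archive of the OVF directory, with the .ovf descriptor required to be the first entry, a post-build workaround can be sketched as follows (a minimal illustration, not the builder's own export path):

```python
import os
import tarfile

def ovf_to_ova(ovf_dir, ova_path):
    """Pack an exported OVF directory into a single OVA archive.

    The OVF standard requires the .ovf descriptor to be the first tar
    entry, followed by the manifest and the referenced disk files.
    """
    names = sorted(os.listdir(ovf_dir))
    ordered = ([n for n in names if n.endswith(".ovf")] +
               [n for n in names if n.endswith(".mf")] +
               [n for n in names if not n.endswith((".ovf", ".mf"))])
    with tarfile.open(ova_path, "w") as tar:  # plain uncompressed tar
        for name in ordered:
            tar.add(os.path.join(ovf_dir, name), arcname=name)
```

The same result can be had with `ovftool` or a plain `tar` invocation that lists the .ovf first; the ordering is the only subtlety.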

`vsphere-iso`: Packer can't acquire IP from Cisco VM

This issue was originally opened by @bruteForceAttack as hashicorp/packer#10363. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

The problem is with acquiring the IP address from VMware vSphere (gathered via VMware Tools), so Packer does not run the shutdown command via VMware Tools. Information such as the IP address and hostname is, however, present in vSphere, and the VM is reachable with an SSH client from localhost.
So far this happens only with the Cisco CSR 1000v image (csr1000v-universalk9.16.09.01.iso); other OS types such as Linux or Windows are fine.
The problem seems to be related to the change (GH-9450) listed in the release notes for version 1.6.1.
Everything works with previous versions, e.g. 1.5.6 or 1.6.0. I have tested 1.6.1+ with no success at all.

Reproduction Steps

packer build -force -var-file=variables.json template-cisco-csr.json

Packer version

1.6.5 (1.6.1+)

Simplified Packer Buildfile

Packer JSON template file:

{
    "builders": [
      {
        "type": "vsphere-iso",
        "vcenter_server": "{{user `vcenter_server`}}",
        "username": "{{user `username`}}",
        "password": "{{user `password_vsphere`}}",
        "insecure_connection": "true",
        "vm_name": "{{user `template_prefix`}}_cisco_csr_16.9.1",
        "datacenter": "{{user `datacenter`}}",
        "cluster": "{{user `cluster`}}",
        "resource_pool": "{{user `resource_pool`}}",
        "datastore": "{{user `datastore`}}",
        "host": "{{user `host`}}",
        "folder": "{{user `folder`}}",
        "guest_os_type": "other3xLinux64Guest",
        "CPUs": 2,
        "RAM": 4096,
        "RAM_reserve_all": true,
        "disk_controller_type": "pvscsi",
        "storage": [
          {
            "disk_size": 8192,
            "disk_thin_provisioned": true
          }
        ],
        "network_adapters": [
          {
            "network": "2000",
            "network_card": "vmxnet3"
          }
        ],
        "iso_paths": [
          "[Install] Cisco/CSR1000v/csr1000v-universalk9.16.09.01.iso"
        ],
        "iso_checksum": "sha512:c578a59272f1f80bc044bc63d8bd7dd106cda7ccc419457cdfcc5f3984c323bcb84da9e4928f3dc0a1d61093efcf546ab51305918d435ad01720332550c2db7b",
        "convert_to_template": "false",
        "communicator": "ssh",
        "ssh_username": "admin",
        "ssh_password": "{{user `password`}}",
        "ip_settle_timeout": "60s",
        "boot_wait": "5s",
        "boot_command": [
          "<wait10><enter><wait10><wait10><wait10><wait10><wait10>",
          "<wait10><wait10><wait10><wait10><wait10><wait10>",
          "<wait10><wait10><wait10><wait10><wait10><wait10>",
          "<wait10><wait10><wait10><wait10><wait10><wait10>",
          "<wait10><wait10><wait10><wait10><wait10><wait10>",
          "<wait10>",
          "<enter>enable<enter>conf t<enter>int giga1<enter>ip addr dhcp<enter><wait5>exit<enter>",
          "enable secret password 0 {{user `password`}}<enter>no service config<enter>hostname router.cisco<enter>ip domain-name lan.net<enter>crypto key gen rsa mod 4096<enter><wait5>",
          "username admin password 0 {{user `password`}}<enter>line vty 0 15<enter>login local<enter>transport input ssh<enter>exit<enter>",
          "do wr<enter>end<enter>exit<enter>"
        ]
      }
    ]
  }
  

Operating system and Environment details

uname -a
Darwin 20.1.0 Darwin Kernel Version 20.1.0: Sat Oct 31 00:07:11 PDT 2020; root:xnu-7195.50.7~2/RELEASE_X86_64 x86_64 i386 MacBookPro15,1 Darwin

Log Fragments and crash.log files

No crash logs are generated. The build times out because no SSH connection to the VM can be established.

Packer's standard output:

vsphere-iso: output will be in this color.
==> vsphere-iso: Creating VM...
==> vsphere-iso: Customizing hardware...
==> vsphere-iso: Mounting ISO images...
==> vsphere-iso: Adding configuration parameters...
==> vsphere-iso: Set boot order temporary...
==> vsphere-iso: Power on VM...
==> vsphere-iso: Waiting 5s for boot...
==> vsphere-iso: Typing boot command...
==> vsphere-iso: Waiting for IP...
==> vsphere-iso: Timeout waiting for IP.
==> vsphere-iso: Clear boot order...
==> vsphere-iso: Power off VM...
==> vsphere-iso: Destroying VM...
Build 'vsphere-iso' errored after 36 minutes 13 seconds: Timeout waiting for IP.

==> Wait completed after 36 minutes 13 seconds

==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: Timeout waiting for IP.

==> Builds finished but no artifacts were created.

Log file generated by packer (PACKER_LOG=1):

2020/12/09 15:18:14 [INFO] Packer version: 1.6.5 [go1.15.3 darwin amd64]
2020/12/09 15:18:14 Checking 'PACKER_CONFIG' for a config file path
2020/12/09 15:18:14 'PACKER_CONFIG' not set; checking the default config file path
2020/12/09 15:18:14 Attempting to open config file: .packerconfig
2020/12/09 15:18:14 [WARN] Config file doesn't exist: .packerconfig
2020/12/09 15:18:14 Setting cache directory: templates/packer/packer_cache
2020/12/09 15:18:14 Creating plugin client for path: templates/packer/packer
2020/12/09 15:18:14 Starting plugin: templates/packer/packer []string{"templates/packer/packer", "plugin", "packer-builder-vsphere-iso"}
2020/12/09 15:18:14 Waiting for RPC address for: templates/packer/packer
2020/12/09 15:18:14 packer plugin: [INFO] Packer version: 1.6.5 [go1.15.3 darwin amd64]
2020/12/09 15:18:14 packer plugin: Checking 'PACKER_CONFIG' for a config file path
2020/12/09 15:18:14 packer plugin: 'PACKER_CONFIG' not set; checking the default config file path
2020/12/09 15:18:14 packer plugin: Attempting to open config file: .packerconfig
2020/12/09 15:18:14 packer plugin: [WARN] Config file doesn't exist: .packerconfig
2020/12/09 15:18:14 packer plugin: Setting cache directory: templates/packer/packer_cache
2020/12/09 15:18:14 packer plugin: args: []string{"packer-builder-vsphere-iso"}
2020/12/09 15:18:15 packer plugin: Plugin address: unix /var/folders/3l/jj8w1c854ml0hfzcjgczq1w00000gn/T/packer-plugin283264697
2020/12/09 15:18:15 packer plugin: Waiting for connection...
2020/12/09 15:18:15 Received unix RPC address for templates/packer/packer: addr is /var/folders/3l/jj8w1c854ml0hfzcjgczq1w00000gn/T/packer-plugin283264697
2020/12/09 15:18:15 packer plugin: Serving a plugin connection...
2020/12/09 15:18:15 Preparing build: vsphere-iso
2020/12/09 15:18:15 ui: vsphere-iso: output will be in this color.
2020/12/09 15:18:15 ui: 
2020/12/09 15:18:15 Build debug mode: false
2020/12/09 15:18:15 Force build: true
2020/12/09 15:18:15 On error: 
2020/12/09 15:18:15 Waiting on builds to complete...
2020/12/09 15:18:15 Starting build run: vsphere-iso
2020/12/09 15:18:15 Running builder: vsphere-iso
2020/12/09 15:18:15 [INFO] (telemetry) Starting builder vsphere-iso
2020/12/09 15:18:15 packer plugin: No URLs were provided to Step Download. Continuing...
2020/12/09 15:18:15 packer plugin: No CD files specified. CD disk will not be made.
2020/12/09 15:18:15 ui: ==> vsphere-iso: Creating VM...
2020/12/09 15:18:17 ui: ==> vsphere-iso: Customizing hardware...
2020/12/09 15:18:17 ui: ==> vsphere-iso: Mounting ISO images...
2020/12/09 15:18:17 packer plugin: Check if ISO path is a Content Library path
2020/12/09 15:18:18 packer plugin: ISO path not identified as a Content Library path
2020/12/09 15:18:18 packer plugin: Using [Install] Cisco/CSR1000v/csr1000v-universalk9.16.09.01.iso as the datastore path
2020/12/09 15:18:18 packer plugin: Creating CD-ROM on controller '&{{{} 200 0xc000cab720 <nil> <nil> <nil> 0 <nil>} 0 []}' with iso '[Install] Cisco/CSR1000v/csr1000v-universalk9.16.09.01.iso'
2020/12/09 15:18:21 ui: ==> vsphere-iso: Adding configuration parameters...
2020/12/09 15:18:21 packer plugin: No floppy files specified. Floppy disk will not be made.
2020/12/09 15:18:21 ui: ==> vsphere-iso: Set boot order temporary...
2020/12/09 15:18:21 ui: ==> vsphere-iso: Power on VM...
2020/12/09 15:18:23 ui: ==> vsphere-iso: Waiting 5s for boot...
2020/12/09 15:18:28 ui: ==> vsphere-iso: Typing boot command...
2020/12/09 15:18:28 packer plugin: [INFO] Waiting 10s
2020/12/09 15:18:38 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:18:38 packer plugin: [INFO] Waiting 10s
2020/12/09 15:18:48 packer plugin: [INFO] Waiting 10s
2020/12/09 15:18:58 packer plugin: [INFO] Waiting 10s
2020/12/09 15:19:08 packer plugin: [INFO] Waiting 10s
2020/12/09 15:19:18 packer plugin: [INFO] Waiting 10s
2020/12/09 15:19:28 packer plugin: [INFO] Waiting 10s
2020/12/09 15:19:38 packer plugin: [INFO] Waiting 10s
2020/12/09 15:19:48 packer plugin: [INFO] Waiting 10s
2020/12/09 15:19:58 packer plugin: [INFO] Waiting 10s
2020/12/09 15:20:08 packer plugin: [INFO] Waiting 10s
2020/12/09 15:20:18 packer plugin: [INFO] Waiting 10s
2020/12/09 15:20:28 packer plugin: [INFO] Waiting 10s
2020/12/09 15:20:38 packer plugin: [INFO] Waiting 10s
2020/12/09 15:20:48 packer plugin: [INFO] Waiting 10s
2020/12/09 15:20:58 packer plugin: [INFO] Waiting 10s
2020/12/09 15:21:08 packer plugin: [INFO] Waiting 10s
2020/12/09 15:21:18 packer plugin: [INFO] Waiting 10s
2020/12/09 15:21:28 packer plugin: [INFO] Waiting 10s
2020/12/09 15:21:38 packer plugin: [INFO] Waiting 10s
2020/12/09 15:21:48 packer plugin: [INFO] Waiting 10s
2020/12/09 15:21:58 packer plugin: [INFO] Waiting 10s
2020/12/09 15:22:08 packer plugin: [INFO] Waiting 10s
2020/12/09 15:22:18 packer plugin: [INFO] Waiting 10s
2020/12/09 15:22:28 packer plugin: [INFO] Waiting 10s
2020/12/09 15:22:38 packer plugin: [INFO] Waiting 10s
2020/12/09 15:22:48 packer plugin: [INFO] Waiting 10s
2020/12/09 15:22:58 packer plugin: [INFO] Waiting 10s
2020/12/09 15:23:08 packer plugin: [INFO] Waiting 10s
2020/12/09 15:23:18 packer plugin: [INFO] Waiting 10s
2020/12/09 15:23:28 packer plugin: [INFO] Waiting 10s
2020/12/09 15:23:38 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:38 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:38 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:23:38 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:23:38 packer plugin: Sending char 'b', code CodeB, shift false
2020/12/09 15:23:38 packer plugin: Sending char 'l', code CodeL, shift false
2020/12/09 15:23:39 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:39 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:39 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:23:39 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:39 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:23:39 packer plugin: Sending char 'f', code CodeF, shift false
2020/12/09 15:23:39 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:39 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:23:40 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:40 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:40 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:23:40 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:23:40 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:40 packer plugin: Sending char 'g', code CodeG, shift false
2020/12/09 15:23:40 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:40 packer plugin: Sending char 'g', code CodeG, shift false
2020/12/09 15:23:41 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:23:41 packer plugin: Sending char '1', code Code1, shift false
2020/12/09 15:23:41 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:41 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:41 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:23:41 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:41 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:23:41 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:23:42 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:23:42 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:23:42 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:42 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:23:42 packer plugin: Sending char 'h', code CodeH, shift false
2020/12/09 15:23:42 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:23:42 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:23:43 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:43 packer plugin: [INFO] Waiting 5s
2020/12/09 15:23:48 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:48 packer plugin: Sending char 'x', code CodeX, shift false
2020/12/09 15:23:48 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:48 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:23:48 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:48 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:48 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:23:49 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:23:49 packer plugin: Sending char 'b', code CodeB, shift false
2020/12/09 15:23:49 packer plugin: Sending char 'l', code CodeL, shift false
2020/12/09 15:23:49 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:49 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:49 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:23:49 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:50 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:23:50 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:23:50 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:50 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:23:50 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:50 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:23:50 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:23:50 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:23:51 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:23:51 packer plugin: Sending char 'w', code CodeW, shift false
2020/12/09 15:23:51 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:51 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:23:51 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:23:51 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:51 packer plugin: Sending char '0', code Code0, shift false
2020/12/09 15:23:51 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:52 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:23:52 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:23:52 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:23:52 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:23:52 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:23:52 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:23:52 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:23:53 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:23:53 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:23:53 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:23:53 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:23:53 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:23:53 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:53 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:23:53 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:54 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:54 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:23:54 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:54 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:23:54 packer plugin: Sending char 'v', code CodeV, shift false
2020/12/09 15:23:54 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:54 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:23:54 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:55 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:55 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:23:55 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:55 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:23:55 packer plugin: Sending char 'f', code CodeF, shift false
2020/12/09 15:23:55 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:55 packer plugin: Sending char 'g', code CodeG, shift false
2020/12/09 15:23:56 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:56 packer plugin: Sending char 'h', code CodeH, shift false
2020/12/09 15:23:56 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:56 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:23:56 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:23:56 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:23:56 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:23:56 packer plugin: Sending char 'm', code CodeM, shift false
2020/12/09 15:23:57 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:57 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:57 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:23:57 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:57 packer plugin: Sending char 'u', code CodeU, shift false
2020/12/09 15:23:57 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:23:57 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:23:57 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:23:58 packer plugin: Sending char '.', code CodeFullStop, shift false
2020/12/09 15:23:58 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:23:58 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:58 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:23:58 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:23:58 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:58 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:23:59 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:23:59 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:23:59 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:23:59 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:23:59 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:23:59 packer plugin: Sending char 'm', code CodeM, shift false
2020/12/09 15:23:59 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:23:59 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:24:00 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:00 packer plugin: Sending char '-', code CodeHyphenMinus, shift false
2020/12/09 15:24:00 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:00 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:00 packer plugin: Sending char 'm', code CodeM, shift false
2020/12/09 15:24:00 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:00 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:01 packer plugin: Sending char 'l', code CodeL, shift false
2020/12/09 15:24:01 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:01 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:01 packer plugin: Sending char '.', code CodeFullStop, shift false
2020/12/09 15:24:01 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:01 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:01 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:24:01 packer plugin: Sending char 'h', code CodeH, shift false
2020/12/09 15:24:02 packer plugin: Sending char '.', code CodeFullStop, shift false
2020/12/09 15:24:02 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:24:02 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:24:02 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:24:02 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:24:02 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:02 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:02 packer plugin: Sending char '.', code CodeFullStop, shift false
2020/12/09 15:24:03 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:03 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:03 packer plugin: Sending char 'g', code CodeG, shift false
2020/12/09 15:24:03 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:03 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:24:03 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:03 packer plugin: Sending char 'y', code CodeY, shift false
2020/12/09 15:24:04 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:24:04 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:04 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:04 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:04 packer plugin: Sending char 'k', code CodeK, shift false
2020/12/09 15:24:04 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:04 packer plugin: Sending char 'y', code CodeY, shift false
2020/12/09 15:24:04 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:05 packer plugin: Sending char 'g', code CodeG, shift false
2020/12/09 15:24:05 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:05 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:05 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:05 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:05 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:24:05 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:05 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:06 packer plugin: Sending char 'm', code CodeM, shift false
2020/12/09 15:24:06 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:06 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:24:06 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:06 packer plugin: Sending char '4', code Code4, shift false
2020/12/09 15:24:06 packer plugin: Sending char '0', code Code0, shift false
2020/12/09 15:24:06 packer plugin: Sending char '9', code Code9, shift false
2020/12/09 15:24:07 packer plugin: Sending char '6', code Code6, shift false
2020/12/09 15:24:07 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:07 packer plugin: [INFO] Waiting 5s
2020/12/09 15:24:12 packer plugin: Sending char 'u', code CodeU, shift false
2020/12/09 15:24:12 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:24:12 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:12 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:12 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:12 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:13 packer plugin: Sending char 'm', code CodeM, shift false
2020/12/09 15:24:13 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:13 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:13 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:13 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:24:13 packer plugin: Sending char 'm', code CodeM, shift false
2020/12/09 15:24:13 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:24:13 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:14 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:14 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:24:14 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:14 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:24:14 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:24:14 packer plugin: Sending char 'w', code CodeW, shift false
2020/12/09 15:24:15 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:15 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:15 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:24:15 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:15 packer plugin: Sending char '0', code Code0, shift false
2020/12/09 15:24:15 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:16 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:24:16 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:24:16 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:24:16 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:24:16 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:24:16 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:24:16 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:24:17 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:24:17 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:24:17 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:24:17 packer plugin: Sending char 'X', code CodeX, shift true
2020/12/09 15:24:17 packer plugin: Sending char 'X', code CodeX, shift false
2020/12/09 15:24:17 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:18 packer plugin: Sending char 'l', code CodeL, shift false
2020/12/09 15:24:18 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:24:18 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:18 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:18 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:18 packer plugin: Sending char 'v', code CodeV, shift false
2020/12/09 15:24:18 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:19 packer plugin: Sending char 'y', code CodeY, shift false
2020/12/09 15:24:19 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:19 packer plugin: Sending char '0', code Code0, shift false
2020/12/09 15:24:19 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:19 packer plugin: Sending char '1', code Code1, shift false
2020/12/09 15:24:19 packer plugin: Sending char '5', code Code5, shift false
2020/12/09 15:24:20 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:20 packer plugin: Sending char 'l', code CodeL, shift false
2020/12/09 15:24:20 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:20 packer plugin: Sending char 'g', code CodeG, shift false
2020/12/09 15:24:20 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:24:20 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:21 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:21 packer plugin: Sending char 'l', code CodeL, shift false
2020/12/09 15:24:21 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:21 packer plugin: Sending char 'c', code CodeC, shift false
2020/12/09 15:24:21 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:21 packer plugin: Sending char 'l', code CodeL, shift false
2020/12/09 15:24:22 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:22 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:22 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:22 packer plugin: Sending char 'a', code CodeA, shift false
2020/12/09 15:24:22 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:22 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:24:22 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:24:22 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:23 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:23 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:23 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:23 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:24:23 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:23 packer plugin: Sending char 'p', code CodeP, shift false
2020/12/09 15:24:23 packer plugin: Sending char 'u', code CodeU, shift false
2020/12/09 15:24:23 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:24 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:24 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:24:24 packer plugin: Sending char 's', code CodeS, shift false
2020/12/09 15:24:24 packer plugin: Sending char 'h', code CodeH, shift false
2020/12/09 15:24:24 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:24 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:24 packer plugin: Sending char 'x', code CodeX, shift false
2020/12/09 15:24:24 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:24:25 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:25 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:25 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:24:25 packer plugin: Sending char 'o', code CodeO, shift false
2020/12/09 15:24:25 packer plugin: Sending char ' ', code CodeSpacebar, shift false
2020/12/09 15:24:25 packer plugin: Sending char 'w', code CodeW, shift false
2020/12/09 15:24:25 packer plugin: Sending char 'r', code CodeR, shift false
2020/12/09 15:24:25 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:26 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:26 packer plugin: Sending char 'n', code CodeN, shift false
2020/12/09 15:24:26 packer plugin: Sending char 'd', code CodeD, shift false
2020/12/09 15:24:26 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:26 packer plugin: Sending char 'e', code CodeE, shift false
2020/12/09 15:24:26 packer plugin: Sending char 'x', code CodeX, shift false
2020/12/09 15:24:26 packer plugin: Sending char 'i', code CodeI, shift false
2020/12/09 15:24:26 packer plugin: Sending char 't', code CodeT, shift false
2020/12/09 15:24:27 packer plugin: Special code '<enter>' found, replacing with: CodeReturnEnter
2020/12/09 15:24:27 packer plugin: [INFO] Waiting for IP, up to total timeout: 30m0s, settle timeout: 1m0s
2020/12/09 15:24:27 ui: ==> vsphere-iso: Waiting for IP...
2020/12/09 15:54:27 ui error: ==> vsphere-iso: Timeout waiting for IP.
2020/12/09 15:54:27 ui: ==> vsphere-iso: Clear boot order...
2020/12/09 15:54:27 ui: ==> vsphere-iso: Power off VM...
2020/12/09 15:54:27 ui: ==> vsphere-iso: Destroying VM...
2020/12/09 15:54:28 [INFO] (telemetry) ending vsphere-iso
2020/12/09 15:54:28 ui error: Build 'vsphere-iso' errored after 36 minutes 13 seconds: Timeout waiting for IP.
2020/12/09 15:54:28 ui: 
==> Wait completed after 36 minutes 13 seconds
2020/12/09 15:54:28 machine readable: error-count []string{"1"}
2020/12/09 15:54:28 ui error: 
==> Some builds didn't complete successfully and had errors:
2020/12/09 15:54:28 machine readable: vsphere-iso,error []string{"Timeout waiting for IP."}
2020/12/09 15:54:28 ui error: --> vsphere-iso: Timeout waiting for IP.
2020/12/09 15:54:28 ui: 
==> Builds finished but no artifacts were created.
2020/12/09 15:54:28 [INFO] (telemetry) Finalizing.
2020/12/09 15:54:30 waiting for all plugin processes to complete...
2020/12/09 15:54:30 templates/packer/packer: plugin process exited

`vsphere-iso`: packer-tmp-created-floppy.flp contains corrupted files when "too many" files are added

This issue was originally opened by @jason-azze as hashicorp/packer#9998. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

A vsphere-iso build for Windows VMs that used to work began failing after I added a few more provisioning scripts to it. The specific symptom I encountered was that the Windows setup wizard began to complain that it could not find Autounattend.xml (which is the "answer file" for a quiet Windows installation). I eventually determined that the copy of Autounattend.xml on the packer-tmp-created-floppy.flp was corrupt, as were several other files. So Windows was telling the truth when it would hang during what was supposed to be an unattended setup.

I began to suspect that the provisioning scripts I had added to my build were at fault because that was the only change I had made. I suspected that something in the file names was causing the data stream to be corrupted during floppy creation, or perhaps I had a weird, unprintable Unicode character in my script from a bad copy-paste out of Slack! While I played around with this idea, I began shortening the file names of my new scripts. I eventually discovered (after a generous application of profanity) that a combination of shortening file names and deleting an unneeded file resulted in an uncorrupted .flp file, and the build was able to proceed. (Note: I checked to be sure I was not exceeding the capacity of a typical floppy.)

So, I think I'm running into something like "too many arguments" for whatever code rolls the files up into packer-tmp-created-floppy.flp, or perhaps a limit on the total number of filename characters?

Reproduction Steps

Run a build with more than this many characters:

ls setup |wc -m
1670

or perhaps this many files:

ls setup |wc -l
69

in your packer-tmp-created-floppy.flp

My build failed because Autounattend.xml happened to be corrupted during the floppy image creation. YMMV.

To prove this to myself, I snagged the temporary .flp file that Packer creates during the build. In my case, the file appears in /tmp. I cracked the file open with Archive Manager (it's some kind of zip) and tried opening files one by one. Several were corrupted, including Autounattend.xml.
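One plausible explanation (an unverified hypothesis, not a confirmed root cause): a standard 1.44 MB FAT12 floppy image has a fixed root directory of 224 entries, and each VFAT long filename consumes one 8.3 entry plus one extra entry per 13 characters of name, so enough files with long names can exhaust the directory even while plenty of data space remains. A quick worst-case estimate in Python:

```python
import math

ROOT_DIR_ENTRIES = 224  # fixed root-directory capacity of a 1.44 MB FAT12 floppy

def entries_for(name: str) -> int:
    # Worst-case directory entries one file consumes: one short (8.3)
    # entry plus ceil(len/13) VFAT long-filename entries. Names that
    # already fit plain 8.3 need no LFN entries, so this overestimates.
    return 1 + math.ceil(len(name) / 13)

def root_dir_usage(names):
    # Returns (entries used, whether the root directory would overflow).
    used = sum(entries_for(n) for n in names)
    return used, used > ROOT_DIR_ENTRIES

# 69 files averaging ~24-character names (roughly the failing case above)
# land close to the 224-entry limit:
used, overflow = root_dir_usage(["x" * 24] * 69)
print(used, overflow)
```

If this hypothesis holds, it would explain why shortening file names and deleting a file (both of which reduce the entry count) unblocked the build.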

Packer version

packer version
Packer v1.6.2

I tried with 1.6.1 and 1.6.0 as well. Same problem.

Simplified Packer Buildfile

In the reproduction steps above I show that I have 69 files in my setup directory (which works; add more to reproduce the bug). This is what I use setup for (roughly):

  "provisioners": [
    {
      "type": "shell-local",
      "scripts": [
       "setup/test-ssh-dir-get.sh"
      ]
    },
    {
      "type": "windows-shell",
      "inline": ["dir c:\\"]
    },
    {
      "type": "powershell",
      "scripts": [
        "setup/123.ps1"
      ]
    },
    {
      "type": "windows-shell",
      "valid_exit_codes": [0, 3010, 3221227356],
      "scripts": [
        "setup/firewall-off.cmd",
        "setup/network-location-awareness-off.cmd",
        "setup/power-setting.cmd",
        "setup/rdp-enable.cmd",
        "setup/test-user-password-expire-never.cmd",
        "setup/updates-stop-them.cmd",
        "setup/installers-dir-make.cmd",
        "setup/tools-dir-make.cmd",
        "setup/7zip-install.cmd",
        "setup/glorp.cmd",
        "setup/crash-dump-tune.cmd",
        "setup/domain-suffix-set.cmd",
        "setup/file-extensions-and-hidden-files-show.cmd",
        "setup/indexing-turn-off.cmd",
        "setup/timezone-set-local.cmd",
        "setup/windows-annoyances-disable.cmd",
        "setup/windows-defender-disable.cmd"
      ]
    },
    {
      "type": "file",
      "source": "files/.ssh",
      "destination": "C:\\home\\test\\"
    },
    {
      "type": "file",
      "source": "setup/ssh-fix-owner-mode.sh",
      "destination": "C:\\Windows\\Temp\\"
    },
    {
      "type": "file",
      "source": "files/environment",
      "destination": "C:\\Windows\\Temp\\"
    },
    {
      "type": "file",
      "source": "files/sshd_config",
      "destination": "C:\\Windows\\Temp\\"
    },
    {
      "type": "file",
      "source": "setup/ssh-copy-config-files.sh",
      "destination": "C:\\Windows\\Temp\\"
    },
    {
      "type": "powershell",
      "scripts": [
        "setup/cygwin-sshd-config.ps1",
        "setup/pagefile-size.ps1",
        "setup/bginfo-shortcut.ps1",
        "setup/windows-annoyances-and-tweaks.ps1",
        "setup/path-length-longer.ps1"
      ]
    },
    {
      "type": "windows-shell",
      "valid_exit_codes": [0, 3010, 3221549136, 3221227356],
      "scripts": [
        "setup/jenkinsapi-install.cmd",
        "setup/cygwin-setfacl-fix.cmd",
        "setup/drive-map-cleanup.cmd",
        "setup/product-key-fix.cmd",
        "setup/CHANGEME-zero-free-space.cmd"
      ]
    }
  ]
}

Notice the files called 123 and glorp? That's me shortening descriptive names.

Operating system and Environment details

cat /etc/os-release 
NAME="Ubuntu"
VERSION="18.04.4 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.4 LTS"

Log Fragments and crash.log files

PACKER_LOG=1 gives lots of nice detail, but there's no error logged when the floppy is created.

`vsphere-iso/clone`: Add support for disk compaction

This issue was originally opened by @HeroCC as hashicorp/packer#9459. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Apologies if this is already implemented, or is requested in another ticket.

Feature Description

Right now, the vmware-(iso|clone) build types have an option to compact disks (or rather, to skip compacting disks). The vSphere builders, however, seem to lack such an option. I looked for anything resembling 'compact' or 'thinning' to no avail. Ideally, I'd be able to use a similar flag to thin out free space before finalizing the VM/template. The only other documented way is to Storage vMotion the disk after it's built, but I'm not sure if Packer supports doing that.

Use Case(s)

I'm using vsphere-clone on a VM that requires temporarily using a lot of space, but frees much of it by the end of the build process. I'd like to be able to zero out the free space and replicate something like vmkfstools --punchzero. I see this is already implemented in the vmware-based builders, and I'd love to be able to use it with the vsphere builders.

Thank you for your time!

Packer vsphere-iso builder error "use of closed network connection" when uploading floppy in GitLab-CI

This issue was originally opened by @Boeller666 as hashicorp/packer#9295. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

When building locally the VM will be created as expected.

On an on-prem GitLab with multiple runners inside a Docker swarm, the floppy image upload fails with a "use of closed network connection" error.

According to our network engineer, the connection is initially established, and there are no errors in the network logs. The network ranges between the runner hosts (in the Docker swarm) and the vCenter network range are also open.

Reproduction Steps

We are using a GitLab CI runner inside a Docker swarm, where the Packer container is used. The same error occurs when running the Docker container directly on the swarm machine against a vCenter 6.7 environment.

Packer version

Packer v1.5.6

Simplified Packer Buildfile

template.json

{
    "builders": [
        {
            "type": "vsphere-iso",
            "vcenter_server": "vcenter.corp.company.local",
            "datacenter": "DC",
            "insecure_connection": true,
            "resource_pool": "Testpool",
            "datastore": "DS",
            "username": "{{ user `vcenter_username` }}@corp.company.local",
            "password": "{{ user `vcenter_password` }}",
            "host": "esxi.corp.company.local",
            "folder": "VM_VORLAGEN",
            "vm_name": "CentOS_7_minimal",
            "convert_to_template": true,
            "CPUs": 1,
            "RAM": 2048,
            "storage": [
                {
                    "disk_size": 4096,
                    "disk_thin_provisioned": true
                }
            ],
            "network_adapters": [
                {
                    "network": "TESTNETZ",
                    "network_card": "e1000"
                }
            ],
            "guest_os_type": "centos7_64Guest",
            "iso_paths": [
                "[ISOSpeicher] ISO_IMAGES/CENTOS7/CentOS-7-x86_64-Minimal-1810.iso"
            ],
            "floppy_files": ["{{template_dir}}/ks.cfg"],
            "boot_wait": "10s",
            "boot_order": "disk,cdrom,floppy",
            "boot_command": ["<esc><wait>", "linux ks=hd:fd0:/ks.cfg<enter>"],
            "ssh_username": "admin",
            "ssh_password": "password"
        }
    ],
    "provisioners": []
}

.gitlab-ci.yaml

stages:
  - build

create-centos-vm:
  stage: build
  image:
    name: hashicorp/packer:1.5.6
    entrypoint: ['']
  script:
    - cd CentOS/
    - PACKER_LOG=1 packer build -var "vcenter_username=${vcenter_username}" -var "vcenter_password=${vcenter_password}" template.json

Operating system and Environment details

CentOS 7.7
Docker 19.03
GitLab-CI Runner 11.10

Log Fragments and crash.log files

 2020/05/26 11:02:36 packer-builder-vsphere-iso plugin: Reading the root directory from the filesystem
     vsphere-iso: Copying file: Packer-Templates/CentOS/ks.cfg
     vsphere-iso: Done copying files from floppy_files
     vsphere-iso: Collecting paths from floppy_dirs
     vsphere-iso: Resulting paths from floppy_dirs : []
     vsphere-iso: Done copying paths from floppy_dirs
 ==> vsphere-iso: Uploading created floppy image
 2020/05/26 11:03:26 packer-builder-vsphere-iso plugin: Deleting floppy disk: /tmp/packer706795908
 ==> vsphere-iso: Destroying VM...
 2020/05/26 11:03:26 [INFO] (telemetry) ending vsphere-iso
 2020/05/26 11:03:26 machine readable: error-count []string{"1"}
 ==> Some builds didn't complete successfully and had errors:
 2020/05/26 11:03:26 machine readable: vsphere-iso,error []string{"Put \"https://172.x.x.x/folder/CentOS_7_minimal/packer-tmp-created-floppy.flp?dsName=DS\": write tcp 172.17.0.2:55030->172.x.x.x:443: use of closed network connection"}
 ==> Builds finished but no artifacts were created.
 2020/05/26 11:03:26 [INFO] (telemetry) Finalizing.
 Build 'vsphere-iso' errored: Put "https://172.x.x.x/folder/CentOS_7_minimal/packer-tmp-created-floppy.flp?dsName=DS": write tcp 172.17.0.2:55030->172.x.x.x:443: use of closed network connection
 ==> Some builds didn't complete successfully and had errors:
 --> vsphere-iso: Put "https://172.x.x.x/folder/CentOS_7_minimal/packer-tmp-created-floppy.flp?dsName=DS": write tcp 172.17.0.2:55030->172.x.x.x:443: use of closed network connection
 ==> Builds finished but no artifacts were created.
 2020/05/26 11:03:27 waiting for all plugin processes to complete...
 2020/05/26 11:03:27 /bin/packer: plugin process exited
 ERROR: Job failed: exit code 1
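
To isolate whether the swarm's network path is at fault, the same upload can be reproduced outside Packer against the vSphere datastore HTTPS file-access endpoint (the URL form visible in the error). A minimal sketch; the host, datastore, and path values below are illustrative:

```python
from urllib.parse import quote, urlencode

def datastore_put_url(host, ds_path, ds_name, dc_path=None):
    # Builds the datastore file-access URL that Packer PUTs the floppy
    # image to, e.g. https://<host>/folder/<path>?dsName=<datastore>.
    # dcPath is also accepted by this endpoint when the datacenter is
    # not the default.
    params = {"dsName": ds_name}
    if dc_path:
        params["dcPath"] = dc_path
    return f"https://{host}/folder/{quote(ds_path)}?{urlencode(params)}"

url = datastore_put_url("172.x.x.x",
                        "CentOS_7_minimal/packer-tmp-created-floppy.flp",
                        "DS")
print(url)
# Uploading a file to this URL from inside the runner container
# (e.g. curl -k -u user:pass -T floppy.flp "<url>") should show whether
# the connection is dropped independently of Packer.
```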

vsphere-clone - Customize requires Linux Customization Settings

This issue was originally opened by @FF186 as hashicorp/packer#10041. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Hi,

I'm wondering why the customize option for vsphere-clone requires linux_options; without it, the builder reports that customize is empty.
I want the ability to specify only network_interface.

WORKS
 "customize": {
          "linux_options": {
            "host_name": "packer-test",
            "domain": "test.internal"
          },
          "network_interface": {
            "ipv4_address": "10.0.0.10",
            "ipv4_netmask": "24"
          }
 }
DOES NOT WORK
 "customize": {
          "network_interface": {
            "ipv4_address": "10.0.0.10",
            "ipv4_netmask": "24"
          }
 }

`vsphere-iso`: Add support for removing ethernet before export

This issue was originally opened by @johnjelinek as hashicorp/packer#10029. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Description

When I use the normal export capabilities from the vsphere-iso builder without removing the ethernet card, the ovf expects the network capabilities to be available wherever the vmdk is imported. This results in an error like this (using govc in this example).

govc: Host did not have any virtual network defined.

To mitigate this, we've added some manual overrides to the shutdown process before the export occurs, but it feels very hacky:

{
  "builders": [
    {
      "type": "vsphere-iso",
      "disable_shutdown": true,
    }],
  "provisioners": [
    {
      "expect_disconnect": true,
      "inline": [
        "echo '{{user `user`}}' | sudo -S shutdown -h +1"
      ],
      "only": [
        "vsphere-iso"
      ],
      "type": "shell"
    },
    {
      "command": "govc device.remove -k -u '{{ user `user` }}:{{ user `pass` }}@{{ user `host` }}' -vm '{{ user `template` }}' ethernet-0",
      "only": [
        "vsphere-iso"
      ],
      "type": "shell-local"
  }]
}

Another odd nuance: when "disable_shutdown": true is used, the export from vsphere-iso produces a VMDK named like template_name-disk-0.vmdk; otherwise it's just named disk-0.vmdk. I'm not sure why that is.

Use Case(s)

The primary use case I see is being able to ship my exported template to vCenter instances other than the one the template was generated on.

Versions

Packer v1.6.1
govc 0.23.0

Add support to cleanup cache

This issue was originally opened by @swerveshot as hashicorp/packer#9843. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

After creating a successful build with the vsphere-iso builder, I would like to have the option to remove the ISO file that was uploaded to the packer_cache folder on the remote datastore. The exact same functionality was added to the vmware-iso builder in this PR hashicorp/packer#8917.

Use Case(s)

Housekeeping

vsphere-template post-processor maybe should not enforce folder parameter bound to root

This issue was originally opened by @yves-vogl as hashicorp/packer#6975. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Hi,

as I understand path.Join, a slash should be automatically concatenated between dcPath and folder (https://github.com/hashicorp/packer/blob/master/post-processor/vsphere-template/step_create_folder.go#L25)

Can you explain why the binding to root is enforced?
https://github.com/hashicorp/packer/blob/b01f4a61f8b16815521f4feb278dbc98536f162f/post-processor/vsphere-template/post-processor.go#L72

Thank you!

Builder not waiting for specified ip_wait_timeout

This issue was originally opened by @brad-simspace as hashicorp/packer#9948. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Related to hashicorp/packer#9143
As with the other ticket I am also on a VPN connecting to the datacenter that is hosting our vCenter. I have been able to successfully build a Debian 10 VM without any issues.

Packer fails after waiting for IP for approx 1m30s. It's not consistent across attempts, but is always between 1m20s and 1m35s.

Packer: 1.6.2
Builder: vsphere-iso

Relevant portion of packer file

  "builders": [ 
    { 
      "type": "vsphere-iso",
      "CPUs": 2,
      "RAM": 4096,
      "boot_wait": "2s",
      "ip_wait_timeout": "3600s",
      "ip_settle_timeout": "3600s",

I collected PACKER_LOG output and found the following. It looks as though the builder accepts the configured timeout but does not actually wait for it.

2020/09/16 10:50:09 packer-builder-vsphere-iso plugin: [INFO] Waiting for IP, up to total timeout: 1h0m0s, , settle timeout: 1h0m0s
==> vsphere-iso: Waiting for IP...
==> vsphere-iso: Clear boot order...
==> vsphere-iso: Power off VM...
==> vsphere-iso: Destroying VM...
2020/09/16 10:50:59 [INFO] (telemetry) ending vsphere-iso
==> Wait completed after 1 minute 24 seconds
2020/09/16 10:50:59 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2020/09/16 10:50:59 machine readable: vsphere-iso,error []string{"unable to find an IP"}
Build 'vsphere-iso' errored after 1 minute 24 seconds: unable to find an IP

==> Builds finished but no artifacts were created.

`vsphere-iso`: Add support to set a VM storage policy while configuring hard disks

This issue was originally opened by @amitbhadra as hashicorp/packer#9525. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Feature Description

From the UI, we can select the VM Storage Policy when setting up the details of a hard disk. This should be possible in Packer too.
Possibly something like:

"storage": [
        {
          "disk_size": 34816,
          "disk_controller_index": 0,
          "storage_policy": "<storage-policy-dev>"
        },
        {
          "disk_size": 51200,
          "disk_controller_index": 0,
          "storage_policy": "<storage-policy-prod>"
        }
],

Why do I have "disk_controller_index"? Because this is another use case already covered in hashicorp/packer#9518. I hope the new PR already has the changes from the previous issue I mentioned.

Use Case(s)

Based on whether the VM is dev or prod, the storage policy is changed.

Support vSphere datastore clusters

This issue was originally opened by @MG2R as hashicorp/packer#10620. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.



Description

I'm trying to build VM templates in a vSphere cluster. Our disks are housed in a datastore cluster. I'd like to instruct packer to target that cluster for the disk storage. However, when I enter the address of the cluster in the datastore parameter for packer, packer throws the following error:

Build 'vsphere-iso' errored after 107 milliseconds 63 microseconds: datastore doesn't exist: Error finding datastore with name /MYDC/datastore/MYDSC: datastore '/MYDC/datastore/MYDSC' not found

It seems I need to point towards specific datastores: using /MYDC/datastore/MYDSC/datastore_1 works perfectly fine.

Is this indeed unsupported? Am I missing something?

`vsphere-iso`: No configuration parameters written

This issue was originally opened by @jorgelon as hashicorp/packer#10434. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

The configuration_parameters are not written to the VMX file.
It is the same case as described in https://discuss.hashicorp.com/t/how-do-the-configuration-parameters-in-vsphere-iso-builder-work/17454

Packer version

1.6.6

Simplified Packer Buildfile

  "configuration_parameters": {
    "msg.autoanswer": "FALSE",
    "svga.vramSize": "16777216"
  },

Operating system and Environment details

Linux x86_64

`vsphere-iso`: Add support for a `vmtools` communicator

This issue was originally opened by @JamesPGriffith as hashicorp/packer#10194. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Description

vsphere-iso, vsphere-clone, vmware-clone, and vmware-iso could all make use of a special communicator that uses VMware guest tools to copy files to the guest and execute commands.

Use Case(s)

#9924 now allows Packer to continue if the communicator is none for guests that are isolated from the Packer executor. I saw that Docker has its own communicator so I thought that the VMware builder could take advantage of the guest tools to transfer files to a VM in network isolation and execute commands.

Potential configuration

I'm not familiar enough with Go to provide input here or take this on myself. I did look up this documentation, though:
https://pkg.go.dev/github.com/vmware/govmomi/guest/toolbox

Potential References

hashicorp/packer#9924
https://pkg.go.dev/github.com/vmware/govmomi/guest/toolbox

`vsphere-iso`: Deploying exported OVF results in VM with empty disks

This issue was originally opened by @nilic as hashicorp/packer#9629. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Initially a part of hashicorp/packer#8995, now opening a separate issue as requested.

The OVF exported by the vsphere-iso builder currently seems unusable.

After hashicorp/packer#9020 was released as a part of Packer 1.5.6 I tried the following:

  • deploying the exported OVF to two vCenter servers using the vsphere post-processor

  • converting the OVF to OVA manually using ovftool and uploading it to vCloud Director

All of these attempts have the same result: I get a VM which seemingly looks fine (in terms of virtual hardware, disk size, etc.) but has virtual disks which are zero in size (screenshot omitted), and which doesn't boot.

Message I get from ovftool when converting to OVA:

Wrong file size specified in OVF descriptor for 'centos7-XXX-disk-0.vmdk' (specified: 12884901888, actual 1344942592).
Wrong file size specified in OVF descriptor for 'centos7-XXX-disk-1.vmdk' (specified: 12884901888, actual 74752).

Virtual disk 0 is 10 GB and disk 1 is 2 GB in size if that's of any help.

Packer 1.5.6
ovftool 4.3.0 (build-13981069)
vSphere 6.5
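
As a stopgap while the exporter writes stale sizes, the descriptor can be patched by hand: rewrite each <File ovf:size> in the References section to the referenced file's actual on-disk size. A sketch (not the builder's own code); note that any accompanying .mf manifest checksums would also need regenerating afterwards:

```python
import os
import xml.etree.ElementTree as ET

OVF_NS = "http://schemas.dmtf.org/ovf/envelope/1"

def fix_file_sizes(ovf_path):
    # Rewrite each <File ovf:size="..."> in the OVF descriptor to the
    # referenced file's actual size on disk.
    ET.register_namespace("ovf", OVF_NS)  # preserve the ovf: prefix on write
    tree = ET.parse(ovf_path)
    base = os.path.dirname(ovf_path)
    for f in tree.iter(f"{{{OVF_NS}}}File"):
        href = f.get(f"{{{OVF_NS}}}href")
        f.set(f"{{{OVF_NS}}}size",
              str(os.path.getsize(os.path.join(base, href))))
    tree.write(ovf_path, xml_declaration=True, encoding="utf-8")
```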

`vsphere-iso`: Add support to upload ISOs in `iso_paths` to cache

This issue was originally opened by @dustyhorizon as hashicorp/packer#9194. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Feature Description

ISOs listed in iso_paths should ideally be uploaded by the builder to the same folder as packer_cache (the same way the base ISO is uploaded) and mounted for use, so the build can still be "self-contained" instead of relying on other CLI magic.

Use Case(s)

An alternate way to mount cloud-config / preseed files using CD-ROMs instead of floppies (which are slowly seeing reduced support; e.g., Ubuntu no longer automounts floppies at /media for unattended installation).

EDIT: I was thinking that this particular feature could work similarly to the existing floppy_* features, where we can specify a number of files, and Packer will generate an ISO, mount it as a CD, and then delete it after the build ends.
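
Until such a feature exists, a seed ISO can be pre-built with standard tooling, uploaded to the datastore, and listed in iso_paths. A sketch of that workaround, assuming genisoimage (or the equivalent mkisofs) is installed; the "CIDATA" label is the volume name cloud-init's NoCloud datasource looks for:

```python
import subprocess

def seed_iso_command(output_iso, paths, label="CIDATA"):
    # genisoimage command line for building a small seed ISO from a list
    # of files/directories; -J (Joliet) and -R (Rock Ridge) keep long
    # filenames intact.
    return ["genisoimage", "-o", output_iso, "-V", label,
            "-J", "-R"] + list(paths)

cmd = seed_iso_command("seed.iso", ["ks.cfg"])
# subprocess.run(cmd, check=True)  # run on a machine with genisoimage installed
```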

Centos vSphere Templates cannot be customized

This issue was originally opened by @tjflexmaster as hashicorp/packer#8703. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Packer 1.5.0
vCenter 6.7
VMware Workstation Pro 15.5.0
Ubuntu 18.04.3 LTS

vCenter only recognizes Centos7_64Guest machines as customizable if they have been powered on (so that open-vm-tools is running) before being turned into a template. The current vsphere and vsphere_template post-processors never power on the VM before uploading it to vCenter.

The vsphere post-processor allows ovftool options which can power on the VM once it is uploaded. There is no way to power down the VM before passing the artifacts to the vsphere_template post-processor. If the VM is not powered down then vsphere_template will fail.

Either vsphere needs a way to power on a machine, let open-vm-tools do its magic, and then power down the machine, or vsphere_template needs an option to power down the machine before turning it into a template.

The vCenter api allows this functionality. I will include bash scripts that do what is needed.

Here is my packer json:

{
  "variables": {
    "vcenter_user": "{{env `MY_USER`}}",
    "vcenter_password": "{{env `MY_SECRET`}}"
  },
  "builders": [
    {
      "type": "vmware-iso",
      "vm_name": "pkr-centos7",
      "guest_os_type": "centos7-64",
      "headless": true,
      "iso_urls": [
        "CentOS-7-x86_64-Minimal-1908.iso",
        "http://mirrors.mit.edu/centos/7/isos/x86_64/CentOS-7-x86_64-Minimal-1908.iso"
      ],
      "iso_checksum_type": "sha256",
      "iso_checksum": "9a2c47d97b9975452f7d582264e9fc16d108ed8252ac6816239a3b58cef5c53d",
      "ssh_username": "vagrant",
      "ssh_password": "vagrant",
      "ssh_port": 22,
      "ssh_wait_timeout": "10000s",
      "http_directory": "http",
      "floppy_files": [
        "http/ks.cfg"
      ],
      "boot_command": [
        "<tab> inst.text inst.ks=hd:fd0:/ks.cfg <enter><wait>"
      ],
      "boot_wait": "10s",
      "shutdown_command": "echo 'vagrant'|sudo -S shutdown -P now",
      "disk_size": 8192,
      "cores": 1,
      "network": "nat",
      "network_adapter_type": "vmxnet3",
      "vmx_data":
      {
        "numvcpus": "2",
        "memsize": "2048",
        "virtualHW.version": "14",
        "cpuid.coresPerSocket": "1",
        "disk.EnableUUID": "true"
      }
    }
  ],
  "post-processors": [
    {
      "type": "vagrant",
      "output": "builds/{{.Provider}}-pkr-centos7-64-pke.box"
    },
    [
      {
        "type": "vsphere",
        "vm_name": "pkr-centos7-template",
        "password": "{{user `vcenter_password`}}",
        "username": "{{user `vcenter_user`}}",
        "host": "vcenter_server",
        "datacenter": "DC",
        "cluster": "CLUSTER",
        "datastore": "DS",
        "resource_pool": "POOL",
        "vm_folder": "/FOLDER/templates",
        "vm_network": "NETWORK",
        "disk_mode": "thin",
        "insecure": true,
        "overwrite": true,
        "options": [
          "--powerOn"
        ]
      },
      {
        "type": "vsphere-template",
        "username": "{{user `vcenter_user`}}",
        "password": "{{user `vcenter_password`}}",
        "host": "vcenter_server",
        "insecure": true,
        "datacenter": "DC",
        "folder": "/FOLDER/templates"
      }

    ]
  ]

}

Here is a bash script that uses the vCenter api to power on the machine and power it off

#!/bin/bash

# vSphere doesn't recognize VMware tools for guest vms unless the machine has been powered on first
# This script uses the vcenter rest api to power on a VM, sleep for a time, and then shutdown the VM
# Run this script before creating a template.

# create a cookie
curl -k -i -u $MY_USER@domain:$MY_SECRET -X POST -c cookie-jar.txt https://$VCENTER_SERVER/rest/com/vmware/cis/session

# Capture VM ID
export VM_ID=$(curl -k -i -b cookie-jar.txt -X GET https://$VCENTER_SERVER/rest/vcenter/vm?filter.names.1=$VM_NAME | grep -Po 'vm-[0-9]+')

# Power up the VM
curl -k -i -b cookie-jar.txt -X POST https://$VCENTER_SERVER/rest/vcenter/vm/$VM_ID/power/start

# Sleep for 30s
sleep 30

# Power down the VM
curl -k -i -b cookie-jar.txt -X POST https://$VCENTER_SERVER/rest/vcenter/vm/$VM_ID/power/stop

# Sleep for 30s
sleep 30

`vsphere-iso`: Exported OVF file, govc imports disk controllers in wrong order

This issue was originally opened by @Chris-Perrin as hashicorp/packer#10378. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

For a VM that has multiple SCSI controllers, exporting the VM using Packer produces an OVF file with the SCSI controllers in the wrong physical order.

Steps to reproduce

  1. Create a virtual machine with multiple scsi controllers (e.g. the max, 4)
  2. Add at least 1 disk to each scsi controller
  3. Export the Virtual machine to an OVF File.

Resultant OVF File has the below XML (snipped)

<Item>
<rasd:Address>3</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 3</rasd:ElementName>
<rasd:InstanceID>3</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="256"/>
</Item>
<Item>
<rasd:Address>2</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 2</rasd:ElementName>
<rasd:InstanceID>4</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="224"/>
</Item>
<Item>
<rasd:Address>1</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 1</rasd:ElementName>
<rasd:InstanceID>5</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="192"/>
</Item>
<Item>
<rasd:Address>0</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 0</rasd:ElementName>
<rasd:InstanceID>6</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="160"/>
</Item>

When this OVF is imported back into vCenter via govc, the controllers are imported in the wrong order. When the template is then cloned to virtual machines, the hard disk names end up incorrect and the machines will not boot.

If you modify the XML as follows, sorting the entries by ElementName, the template is created correctly and subsequent virtual machines can boot:

<Item>
<rasd:Address>0</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 0</rasd:ElementName>
<rasd:InstanceID>6</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="160"/>
</Item>
<Item>
<rasd:Address>1</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 1</rasd:ElementName>
<rasd:InstanceID>5</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="192"/>
</Item>
<Item>
<rasd:Address>2</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 2</rasd:ElementName>
<rasd:InstanceID>4</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="224"/>
</Item>
<Item>
<rasd:Address>3</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI controller 3</rasd:ElementName>
<rasd:InstanceID>3</rasd:InstanceID>
<rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
<vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="256"/>
</Item>
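As a stopgap until the export is fixed, the re-sort described above can be scripted. Below is a minimal sketch (not part of the plugin; `sort_scsi_controllers` is a hypothetical helper) that reorders the SCSI controller `<Item>` entries of an exported OVF by `rasd:ElementName` before the govc import. The namespace URI is the standard CIM RASD one used in vSphere OVF descriptors:

```python
# Workaround sketch: reorder SCSI controller <Item> entries in an exported
# OVF so they appear in ElementName order before importing with govc.
import xml.etree.ElementTree as ET

# Standard CIM namespace for rasd:* elements in OVF descriptors.
RASD = "{http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData}"

def sort_scsi_controllers(hardware_section):
    """Sort the contiguous run of SCSI controller Items (ResourceType 6)
    under hardware_section by their ElementName."""
    children = list(hardware_section)
    scsi = [c for c in children
            if c.tag.endswith("Item")
            and (c.findtext(RASD + "ResourceType") or "").strip() == "6"]
    if not scsi:
        return
    start = children.index(scsi[0])
    ordered = sorted(scsi, key=lambda c: c.findtext(RASD + "ElementName") or "")
    for c in scsi:
        hardware_section.remove(c)
    for offset, c in enumerate(ordered):
        hardware_section.insert(start + offset, c)
```

Run against the VirtualHardwareSection of the exported .ovf and write the tree back out before importing.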

Additionally, I have checked by using vCenter / ESXi to export the template; it also orders the SCSI controllers physically by ElementName (see below):

      <Item>
        <rasd:Address>1</rasd:Address>
        <rasd:Description>SCSI Controller</rasd:Description>
        <rasd:ElementName>SCSI Controller 1</rasd:ElementName>
        <rasd:InstanceID>3</rasd:InstanceID>
        <rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
        <rasd:ResourceType>6</rasd:ResourceType>
        <vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="192"/>
      </Item>
      <Item>
        <rasd:Address>0</rasd:Address>
        <rasd:Description>SCSI Controller</rasd:Description>
        <rasd:ElementName>SCSI Controller 2</rasd:ElementName>
        <rasd:InstanceID>4</rasd:InstanceID>
        <rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
        <rasd:ResourceType>6</rasd:ResourceType>
        <vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="160"/>
      </Item>
      <Item>
        <rasd:Address>3</rasd:Address>
        <rasd:Description>SCSI Controller</rasd:Description>
        <rasd:ElementName>SCSI Controller 3</rasd:ElementName>
        <rasd:InstanceID>5</rasd:InstanceID>
        <rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
        <rasd:ResourceType>6</rasd:ResourceType>
        <vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="256"/>
      </Item>
      <Item>
        <rasd:Address>2</rasd:Address>
        <rasd:Description>SCSI Controller</rasd:Description>
        <rasd:ElementName>SCSI Controller 4</rasd:ElementName>
        <rasd:InstanceID>6</rasd:InstanceID>
        <rasd:ResourceSubType>VirtualSCSI</rasd:ResourceSubType>
        <rasd:ResourceType>6</rasd:ResourceType>
        <vmw:Config ovf:required="false" vmw:key="slotInfo.pciSlotNumber" vmw:value="224"/>
      </Item>

Packer version

From 1.6.5

Simplified Packer Buildfile


      "type": "vsphere-iso",
      "disk_controller_type": ["pvscsi", "pvscsi", "pvscsi", "pvscsi"],
      "storage":  [ 
        {
          "disk_size": 65536,
          "disk_controller_index": 0
        },
        {
          "disk_size": 1024,
          "disk_controller_index": 0
        },
        {
          "disk_size": 2048,
          "disk_controller_index": 0
        },
        {
          "disk_size": 3072,
          "disk_controller_index": 1
        },
        {
          "disk_size": 4096,
          "disk_controller_index": 2
        },
        {
          "disk_size": 5120,
          "disk_controller_index": 3
        }
      ]
Operating system and Environment details

CentOS Linux release 7.7.1908

Add support for datastore cluster storage

This issue was originally opened by @vheon as hashicorp/packer#10004. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

From the documentation, it doesn't seem possible to have a Packer build with the vsphere-iso builder specify a Storage DRS cluster; only a specific datastore is allowed. I've tried setting the Storage DRS name as the datastore property, but it fails with:

==> vsphere-iso.packer-build-name: error creating vm: datastore 'Storage-DRS-name' not found

Use Case(s)

Using Storage DRS usually allows the same configuration to be kept even as new datastores are added for more storage.

Potential configuration

I'm not sure whether new configuration options would be required. Packer could first search Storage DRS clusters for a match on the datastore property and, if found, use it; otherwise it could look for a match among single datastores.
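The proposed lookup order can be sketched as follows (a hedged illustration only; `resolve_storage` and the mappings are hypothetical stand-ins for vSphere inventory lookups, not plugin code):

```python
# Sketch of the suggested fallback: match the configured name against
# Storage DRS pods first, then against plain datastores.
def resolve_storage(name, drs_pods, datastores):
    """drs_pods and datastores are name -> object mappings (stand-ins
    for vSphere inventory lookups)."""
    if name in drs_pods:
        # A pod match lets Storage DRS pick a member datastore for placement.
        return ("storage-pod", drs_pods[name])
    if name in datastores:
        return ("datastore", datastores[name])
    raise LookupError(f"datastore '{name}' not found")
```

With this order, an existing configuration that names a plain datastore keeps working, while a name matching a pod transparently enables SDRS placement.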

Potential References

https://docs.vmware.com/en/VMware-vSphere/6.7/com.vmware.vsphere.resmgmt.doc/GUID-827DBD6D-08B7-4411-9214-9E126671457F.html

`vsphere-iso`: Add support for vApp properties

This issue was originally opened by @jpbuecken as hashicorp/packer#10319. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

In vSphere, you can enable vApp Options for a VM via Configure -> vApp Options -> Edit.
After that, you can add properties to the vApp / VM in the same window.

(Screenshot of the vApp Options properties dialog omitted.)

This should be possible via the vsphere-iso builder.

Use Case(s)

With this, you can create a VM with vApp Properties.
Use Case 1: You can add a public-keys property and configure your SUSE AutoYaST / Red Hat Kickstart / Ubuntu preseed to make use of the value during boot (write your own script or use cloud-init).
After that, your new VM can in turn be used as a source for the vsphere-clone builder.
Since vsphere-clone supports temporary keys for the public-keys property, there is no need to store a password or public-key file in your source image.
I see this as an absolute security win.

Use Case 2: Similar to vsphere-clone, vsphere-iso could use the public-keys property itself:

  1. vsphere-iso creates the vApp property public-keys
  2. vsphere-iso generates a temporary key pair and adds it as the value of public-keys (an additional new feature, same logic as vsphere-clone)
  3. The automated installation process makes use of the value in public-keys and adds it to the connect user (e.g., root)
  4. vsphere-iso connects via the SSH communicator with the temporary key pair.
    The same argument as above applies: there is no need to store a hardcoded password or key files inside the VM before connecting with vsphere-iso. For example, we have a policy to recreate key files regularly; if they are created and removed temporarily on the fly, this policy is easily fulfilled.

Potential configuration

     "builders": [
      {
         "type":                      "vsphere-iso",
[...]

        "vapp": {
            "enable_vapp_options": true,
            "properties": {
                "public-keys": ""
            }
        }
      }
     ]

Potential References

https://www.packer.io/docs/builders/vmware/vsphere-clone#ssh (search for public-keys and vapp on the side)

dhcp ip changed : no SSH after reboot [vsphere-iso]

This issue was originally opened by @necarnot as hashicorp/packer#10893. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

Packer 1.7.2 fails to connect via SSH during the installation of an Ubuntu 20.04.2 live server because the IP of the guest changes after the first reboot and the DHCP server provides a different IP address; Packer still uses the first IP address it saw.

In hashicorp/packer#8528, it's claimed that this is fixed for vmware-iso, but I'm seeing that the issue remains with vsphere-iso.

Reproduction Steps

  • Use Packer 1.7.2 with vsphere-iso
  • Set up a deployment of an Ubuntu 20.04.2 live server (thus, not the legacy installer)
  • The installation looks good, but the logs show Packer failing forever to connect via SSH because it targets the wrong, previous IP address.

Packer version

1.7.2

Buildfile

build {
  sources = [
    "source.vsphere-iso.ubuntu20_04_varnames"
  ]

  provisioner "shell" {
    inline = ["echo 'Packer Template Build -- Complete'"]
  }
}

sources_ubuntu_20_04.pkr.hcl

source "vsphere-iso" "ubuntu20_04_varnames" {
  vcenter_server       = "${var.vcenter_server}"
  username             = "${var.vcenter_username}"
  password             = "${var.vcenter_password}"
  cluster              = "${var.cluster}"
  datacenter           = "${var.datacenter}"
  datastore            = "${var.datastore}"
  host                 = "${var.host}"
  folder               = "${var.folder}"
  convert_to_template  = "true"

  vm_name              = "${var.vm_name}"
  CPUs                 = "${var.vm_cpu_num}"
  RAM                  = "${var.vm_mem_size}"
  RAM_reserve_all      = true
  boot_command = [
    "<esc><esc><esc>",
    "<enter><wait>",
    "/casper/vmlinuz ",
    "initrd=/casper/initrd ",
    "autoinstall ",
    "<enter>"
  ]
  boot_order           = "disk,cdrom"
  boot_wait            = "3s"
  disk_controller_type = ["pvscsi"]

  cd_files = [
    "./cloud-init/meta-data",
    "./cloud-init/user-data"]
  cd_label = "cidata"

  guest_os_type        = "ubuntu64Guest"
  http_directory       = "./http"
  insecure_connection  = "true"
  ip_wait_timeout      = "20m"
  iso_checksum         = "${var.iso_checksum_type}:${var.iso_checksum}"
  iso_urls             = ["${var.iso_url}"]
  network_adapters {
    network      = "${var.network}"
    network_card = "vmxnet3"
  }
  ssh_handshake_attempts = "200"
  ssh_pty                = true
  ssh_timeout            = "20m"
  ssh_username           = "${var.ssh_username}"
  ssh_password           = "${var.ssh_password}"
  storage {
    disk_size             = "${var.vm_disk_size}"
    disk_thin_provisioned = true
  }
}

Operating system and Environment details

Packer runs on CentOS Linux release 8.3.2011
vSphere is 7.0.2

Log Fragments and crash.log files

2021/04/08 12:09:03 packer-builder-vsphere-iso plugin: VM IP aquired: 10.32.0.142
2021/04/08 12:09:04 packer-builder-vsphere-iso plugin: VM IP is still the same: 10.32.0.142
...
2021/04/08 12:12:30 packer-builder-vsphere-iso plugin: [DEBUG] TCP connection to SSH ip/port failed: dial tcp 10.32.0.142:22: i/o timeout
2021/04/08 12:12:50 packer-builder-vsphere-iso plugin: [DEBUG] TCP connection to SSH ip/port failed: dial tcp 10.32.0.142:22: i/o timeout
2021/04/08 12:12:58 packer-builder-vsphere-iso plugin: [DEBUG] TCP connection to SSH ip/port failed: dial tcp 10.32.0.142:22: connect: no route to host
2021/04/08 12:13:06 packer-builder-vsphere-iso plugin: [DEBUG] TCP connection to SSH ip/port failed: dial tcp 10.32.0.142:22: connect: no route to host

^--- Here, the IP has changed and is no longer 10.32.0.142, so Packer will never reach it.
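The behavior the reporter expects can be sketched as follows (a hedged illustration, not plugin code; `get_guest_ip` and `try_ssh` are hypothetical helpers): re-query the guest IP from vSphere before every connection attempt instead of caching the first address seen:

```python
import time

def wait_for_ssh(get_guest_ip, try_ssh, attempts=10, delay=0.0):
    """Re-read the guest IP before every attempt so a DHCP lease change
    after reboot is picked up instead of retrying a stale address."""
    last_ip = None
    for _ in range(attempts):
        last_ip = get_guest_ip()       # ask vSphere again each time
        if try_ssh(last_ip):
            return last_ip
        time.sleep(delay)
    raise TimeoutError(f"no SSH connection; last IP seen was {last_ip}")
```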

Thank you for your help.

Nicolas

winrm: fails with invalid url if ipv6 is used

Packer chooses randomly between the IPv6 and IPv4 address.

When IPv6 is used, the WinRM connection fails with:

2017/08/21 08:51:52 packer-builder-vsphere.linux: 2017/08/21 08:51:52 [INFO] Attempting WinRM connection...
2017/08/21 08:51:52 packer-builder-vsphere.linux: 2017/08/21 08:51:52 [DEBUG] connecting to remote shell using WinRM
2017/08/21 08:51:52 packer-builder-vsphere.linux: 2017/08/21 08:51:52 [ERROR] connection error: unknown error Post http://fe80::dd5b:1430:623d:c52b:5985/wsman: invalid URL port ":dd5b:1430:623d:c52b:5985"
2017/08/21 08:51:52 packer-builder-vsphere.linux: 2017/08/21 08:51:52 [ERROR] WinRM connection err: unknown error Post http://fe80::dd5b:1430:623d:c52b:5985/wsman: invalid URL port ":dd5b:1430:623d:c52b:5985"
  • Packer version: from 1.0.3
  • Linux (Ubuntu)
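The "invalid URL port" error above comes from interpolating a bare IPv6 literal into the URL; per RFC 3986 an IPv6 host must be bracketed. A minimal sketch of the needed handling (`winrm_endpoint` is a hypothetical helper, not the plugin's code):

```python
import ipaddress
from urllib.parse import urlsplit

def winrm_endpoint(host, port=5985):
    """Build a WinRM endpoint URL, bracketing IPv6 literals."""
    try:
        if ipaddress.ip_address(host).version == 6:
            host = f"[{host}]"         # bracket IPv6 literals per RFC 3986
    except ValueError:
        pass                           # plain hostname, leave as-is
    return f"http://{host}:{port}/wsman"
```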

Add support for tags

This issue was originally opened by @dbond007 as hashicorp/packer#10147. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

Allow adding tags to templates within vSphere/ESXi, as is available within Terraform.

Use Case(s)

This allows adding tags to templates so you can record information about the template, for example build date, OS version, roles, build version, builder, etc. Information about the template can then be extracted instead of needing to encode it in the template name or an external database.

Potential configuration

see references

Potential References

https://registry.terraform.io/providers/hashicorp/vsphere/latest/docs/resources/tag_category
https://registry.terraform.io/providers/hashicorp/vsphere/latest/docs/resources/tag
https://registry.terraform.io/providers/hashicorp/vsphere/latest/docs/resources/custom_attribute

Add support for retries in vSphere post processor

This issue was originally opened by @gamethis as hashicorp/packer#10745. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

Add an option to retry on failures when uploading templates to the vSphere environment.
This could be done similarly to the vagrant-cloud post-processor, or by giving the user an option in the Packer file to enable retries and set a retry count.

Use Case(s)

An intermittent error occurs during the upload phase of the template. Adding a retry variable that retries x number of times would allow greater success.

Potential configuration

"post-processors": [
  {
    "type": "vsphere",
    "retry": 4,
    ....
  }
],
....
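The retry behavior the config above asks for can be sketched generically (an illustration only; `with_retries` and `upload` are hypothetical stand-ins for the post-processor's upload step):

```python
import time

def with_retries(upload, retries=4, delay=0.0):
    """Call upload() up to retries + 1 times, re-raising the last failure
    so an intermittent error doesn't fail the whole build immediately."""
    for attempt in range(retries + 1):
        try:
            return upload()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(delay)
```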

Potential References

vsphere-clone join Windows to AD domain

This issue was originally opened by @mbrewczynski-eis as hashicorp/packer#10418. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Description

We are using vsphere-clone to update existing templates, but we can't do that for Windows because, in our case, WinRM is only unlocked after joining the domain.

vSphere allows automatically joining a Windows domain when a VM is cloned with the 'Customize guest OS' option.
It would be nice to have that in vsphere-clone.

Add vSphere tests (unit, integration and acceptance)

This issue was originally opened by @sylviamoss as hashicorp/packer#9851. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


This is for keeping track of the work on adding all kinds of tests to the vSphere builders, both iso and clone. There's already a related issue for acceptance tests: hashicorp/packer#8841.

  • Unit tests for all steps
  • Integration test for driver using simulator provided by govmomi
  • Fix existing acceptance tests
  • Make vcsim mock work with Packer. vmware/govmomi#2093

The vcsim mock should allow maintainers and contributors to test Packer against many possible environment configurations.

packer with vmware-clone fails

This issue was originally opened by @zerr0s as hashicorp/packer#10411. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Hi, please forgive my English.

I'm trying to use Packer to automate the creation of templates based on Linux (Ubuntu 20.04).
I have a problem when Packer clones the existing template to create the new VM: the NIC is not connected at startup.

I have verified that "ethernet0.startConnected = TRUE" exists in the .vmtx file, and when I create a new VM using the template directly from the vSphere web interface, the "Connect at power on" option is correctly checked.

When the VM is created by Packer, I have to check the option in the vSphere web client to connect the NIC. After that, all provisioners run correctly.

What am I doing wrong?

vSphere version: 6.5
Packer version: 1.6.5

{
  "builders": [
    {
      "type": "vsphere-clone",
      "vcenter_server": "xxxxx",
      "datacenter": "xxxxx",
      "username": "xxxx",
      "password": "xxxxx",
      "insecure_connection": "true",

      "template": "ubuntu_server_20.04",

      "communicator": "ssh",
      "ssh_username": "auto",
      "ssh_private_key_file": "./sshkey",
      "boot_wait": "30s",
      "host": "vmwarehost.local",
      "vm_name": "newvm",
      "datastore": "myds",
      "folder": "packer-test",
      "CPUs": 2,
      "CPU_hot_plug": true,
      "RAM": 2048,
      "RAM_hot_plug": true,
      "RAM_reserve_all": false,
      "firmware": "bios",

      "network": "LAN",
      "customize": {
        "linux_options": {
          "host_name": "packer-test",
          "domain": "mydomain.local",
          "time_zone": "UTC"
        },
        "network_interface": {
          "ipv4_address": "192.168.0.5",
          "ipv4_netmask": "24"
        },
        "ipv4_gateway": "192.168.0.1",
        "dns_server_list": ["192.168.0.1"]
      }

    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "inline": [
        "ls -lrth /",
        "echo done"
      ]
    }
  ]
}

Thanks.

Custom cd_files iso not removed from datastore after build using vsphere-iso builder

This issue was originally opened by @tleepa as hashicorp/packer#10914. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

I am building a RHEL 8 template using the vsphere-iso builder, providing the kickstart file via the cd_files option.
Everything builds fine, but the temporary (kickstart) ISO file is not removed from the datastore.
Looking at the logs, I can see the kickstart ISO being removed from the local (Packer) /tmp/ directory, but it does not even seem to try to delete the datastore copy.


Reproduction Steps

Build template using vsphere-iso builder and kickstart provided via cd_files option.

Packer version

# packer version
Packer v1.6.6

Simplified Packer Buildfile

packer template

Operating system and Environment details

The Packer build process is executed in a Docker container running on a CentOS 7 x86-64 host.

# cat /etc/centos-release
CentOS Linux release 7.9.2009 (Core)

# docker --version
Docker version 20.10.6, build 370c289

OS version of docker container:

# cat /etc/centos-release
CentOS Linux release 8.3.2011

Log Fragments and crash.log files

Relevant part of the log:

==> vsphere-iso: Shutting down VM...
2021/04/16 07:33:43 packer-builder-vsphere-iso plugin: Waiting max 5m0s for shutdown to complete
==> vsphere-iso: Deleting Floppy drives...
==> vsphere-iso: Eject CD-ROM drives...
==> vsphere-iso: Deleting CD-ROM drives...
==> vsphere-iso: Convert VM into template...
==> vsphere-iso: Clear boot order...
2021/04/16 07:33:58 packer-builder-vsphere-iso plugin: Deleting CD disk: /tmp/packer663790441.iso
2021/04/16 07:33:58 [INFO] (telemetry) ending vsphere-iso
==> Wait completed after 7 minutes 32 seconds
==> Builds finished. The artifacts of successful builds are:
2021/04/16 07:33:58 machine readable: vsphere-iso,artifact-count []string{"1"}
Build 'vsphere-iso' finished after 7 minutes 32 seconds.

==> Wait completed after 7 minutes 32 seconds

==> Builds finished. The artifacts of successful builds are:
2021/04/16 07:33:58 machine readable: vsphere-iso,artifact []string{"0", "builder-id", "jetbrains.vsphere"}
2021/04/16 07:33:58 machine readable: vsphere-iso,artifact []string{"0", "id", "RHEL8"}
2021/04/16 07:33:58 machine readable: vsphere-iso,artifact []string{"0", "string", "RHEL8"}
--> vsphere-iso: RHEL8
2021/04/16 07:33:58 machine readable: vsphere-iso,artifact []string{"0", "files-count", "0"}
2021/04/16 07:33:58 machine readable: vsphere-iso,artifact []string{"0", "end"}
2021/04/16 07:33:58 [INFO] (telemetry) Finalizing.
2021/04/16 07:33:58 waiting for all plugin processes to complete...
2021/04/16 07:33:58 /usr/bin/packer: plugin process exited
2021/04/16 07:33:58 /usr/bin/packer: plugin process exited
2021/04/16 07:33:58 /usr/bin/packer: plugin process exited

full log

Note

Template and log file redacted for brevity and some sensitive data removed.

`vsphere-iso`: direct connectivity to cluster hosts is needed for export

This issue was originally opened by @MaxRink as hashicorp/packer#10327. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

We are trying to set up Packer for building images on VMware and noticed that the export step at the end tries to pull the OVF directly from the ESXi host.
We don't have direct network connectivity from the Packer build host to the hosts in the VMware cluster, only to vCenter.
Thus the export fails.
VMware has a fallback option that proxies the OVF through vCenter, so manual retrieval is possible at that point.
I'm wondering why Packer doesn't follow the same fallback in the build process, which would make slimmer network setups possible.

Reproduction Steps

Build on a vCenter where you can't reach the actual compute nodes.

Packer version

1.6.5

Simplified Packer Buildfile

   {
      "type": "vsphere-iso",
      "name": "vsphere",
      "vcenter_server": "{{user `vcenter_server`}}",
      "username": "{{user `username`}}",
      "password": "{{user `password`}}",
      "insecure_connection": "{{user `insecure_connection`}}",
      ...
      "export": {
        "force": true,
        "output_directory": "{{user `output_dir`}}"
      }

Log Fragments and crash.log files

2020/12/02 16:42:27 ui:     vsphere: Starting export...
2020/12/02 16:42:27 ui:     vsphere: Downloading: ubuntu-1804-201202T1623-kube-v1.19.4-disk-0.vmdk
2020/12/02 16:42:57 ui: ==> vsphere: Provisioning step had errors: Running the cleanup provisioner, if present...
2020/12/02 16:42:57 ui: ==> vsphere: Clear boot order...
2020/12/02 16:47:57 ui: ==> vsphere: Power off VM...
2020/12/02 16:47:57 packer-builder-vsphere-iso plugin: Deleting floppy disk: /tmp/packer162294781
2020/12/02 16:47:57 ui: ==> vsphere: Destroying VM...
2020/12/02 16:47:57 [INFO] (telemetry) ending vsphere-iso
2020/12/02 16:47:57 ui error: Build 'vsphere' errored after 24 minutes 10 seconds: Get "https:///esxi-host//nfc/52cc4f1c-1aaa-0f65-ef1a-92615c6eff0b/disk-0.vmdk": dial tcp IP:443: i/o timeout
2020/12/02 16:47:57 ui:
==> Wait completed after 24 minutes 10 seconds
2020/12/02 16:47:57 machine readable: error-count []string{"1"}
2020/12/02 16:47:57 ui error:
==> Some builds didn't complete successfully and had errors:
2020/12/02 16:47:57 machine readable: vsphere,error []string{"Get \"https://esxi-host/nfc/52cc4f1c-1aaa-0f65-ef1a-92615c6eff0b/disk-0.vmdk\": dial tcp IP:443: i/o timeout"}
2020/12/02 16:47:57 ui error: --> vsphere: Get "https:///esxi-host/nfc/52cc4f1c-1aaa-0f65-ef1a-92615c6eff0b/disk-0.vmdk": dial tcp IP:443: i/o timeout
2020/12/02 16:47:57 ui:

Ability to use the same name as vm name for vm-templates in vSphere Content Library

This issue was originally opened by @jdotsmith as hashicorp/packer#9967. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Please search the existing issues for relevant feature requests, and use the
reaction feature
(https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/)
to add upvotes to pre-existing requests.

Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

Currently, the vsphere-iso builder's content library configuration allows VMware templates to be uploaded to the content library when the ovf flag is set to false. However, it doesn't allow the same name as the VM name and appends a timestamp unless the name option is specified. The ovf flag does allow the same name and overwrites the existing content library item.

I would like the ability for the VM template uploaded to the content library to have the same name as the VM.

Here's the error message:


1 error(s) occurred:

* the content library destination name must be different from the VM name

Use Case(s)

I have an automated Packer process (an ADO build) that runs every month, so I'll end up with a bunch of vm_name + timestamp VM templates and would love to overwrite them instead.

I'm using the vmware_content_deploy_template module with Ansible, and that only allows the content library item type to be vm-template, not OVF.

Potential configuration

Potential References

The -force option does not work when convert_to_template is set to true

This issue was originally opened by @bnfbiz as hashicorp/packer#10889. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.


Overview of the Issue

When re-running Packer with the -force option using vsphere-clone (and presumably vsphere-iso; I haven't tested it yet), if the created image is already a vSphere template, it fails with:

--> vsphere-clone.cleanuptest: error destroying DevOps/forcecleanuptest: Invalid virtual machine state.

Reproduction Steps

Build a VM using vsphere-clone (expect the same issue with vsphere-iso):
~/go/bin/packer build -timestamp-ui -force .

The first build will succeed. Then rebuild without manually deleting the first one:
~/go/bin/packer build -timestamp-ui -force .

The error occurs:

vsphere-clone.cleanuptest: output will be in this color.

2021-04-07T22:58:10+02:00: ==> vsphere-clone.cleanuptest: the vm/template DevOps/forcecleanuptest already exists, but deleting it due to -force flag
2021-04-07T22:58:10+02:00: Build 'vsphere-clone.cleanuptest' errored after 7 seconds 683 milliseconds: error destroying DevOps/EDaaS_Templates/QA/forcecleanuptest: Invalid virtual machine state.

==> Wait completed after 7 seconds 683 milliseconds

==> Some builds didn't complete successfully and had errors:
--> vsphere-clone.cleanuptest: error destroying DevOps/forcecleanuptest: Invalid virtual machine state.

==> Builds finished but no artifacts were created.

Packer version

v1.7.2

Simplified Packer Buildfile


Operating system and Environment details

Target is vSphere

Log Fragments and crash.log files

as above
