ansible-middleware / amq
A collection to manage AMQ brokers
License: Apache License 2.0
Hi Guido, sorry for all the bug reports. I was trying the activemq_prometheus_enabled: True
setting in combination with ActiveMQ 2.18.0. When I run the default playbook, I get the following error: Artemis Prometheus library not found. When I check the host, the Prometheus jar is not present, nor is there a Prometheus jar from another version.
TASK [middleware_automation.amq.activemq : Ensure lib is available to instance]
fatal: [192.168.2.212]: FAILED! => {"changed": false, "msg": "Source /opt/amq/apache-artemis-2.18.0/lib/artemis-prometheus-metrics-plugin-1.1.0.redhat-00002.jar not found"}
fatal: [192.168.2.211]: FAILED! => {"changed": false, "msg": "Source /opt/amq/apache-artemis-2.18.0/lib/artemis-prometheus-metrics-plugin-1.1.0.redhat-00002.jar not found"}
Reproduce by running this playbook with the following inventory:
all:
  children:
    amq:
      children:
        left:
          hosts: 192.168.2.211
        right:
          hosts: 192.168.2.212
  vars:
    activemq_configure_firewalld: True
    activemq_prometheus_enabled: True
    activemq_cors_strict_checking: False
ansible-playbook -i hosts_vagrant.yml playbooks/activemq.yml -v
Expected: the Prometheus plugin should be enabled.
This is the content of the directory /opt/amq/apache-artemis-2.18.0/lib:
[ansible@amq1 lib]$ pwd
/opt/amq/apache-artemis-2.18.0/lib
[ansible@amq1 lib]$ ll
total 35612
-rw-r--r--. 1 amq-broker amq-broker 42380 Jan 22 2020 activemq-artemis-native-1.0.2.jar
-rw-r--r--. 1 amq-broker amq-broker 1435605 Jan 22 2020 activemq-client-5.16.0.jar
-rw-r--r--. 1 amq-broker amq-broker 685740 Jan 22 2020 activemq-openwire-legacy-5.16.0.jar
-rw-r--r--. 1 amq-broker amq-broker 78439 Jan 22 2020 airline-0.8.jar
-rw-r--r--. 1 amq-broker amq-broker 369380 Jan 22 2020 artemis-amqp-protocol-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 11009 Jan 22 2020 artemis-boot.jar
-rw-r--r--. 1 amq-broker amq-broker 543478 Jan 22 2020 artemis-cli-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 144577 Jan 22 2020 artemis-commons-2.18.0-tests.jar
-rw-r--r--. 1 amq-broker amq-broker 424458 Jan 22 2020 artemis-commons-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 819718 Jan 22 2020 artemis-core-client-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 28664 Jan 22 2020 artemis-dto-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 16997 Jan 22 2020 artemis-hornetq-protocol-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 21190 Jan 22 2020 artemis-hqclient-protocol-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 111266 Jan 22 2020 artemis-jdbc-store-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 193557 Jan 22 2020 artemis-jms-client-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 131270 Jan 22 2020 artemis-jms-server-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 244761 Jan 22 2020 artemis-journal-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 68641 Jan 22 2020 artemis-mqtt-protocol-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 1397684 Jan 22 2020 artemis-openwire-protocol-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 12763 Jan 22 2020 artemis-quorum-api-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 33930 Jan 22 2020 artemis-quorum-ri-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 164316 Jan 22 2020 artemis-ra-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 143053 Jan 22 2020 artemis-rest-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 107222 Jan 22 2020 artemis-selector-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 2282239 Jan 22 2020 artemis-server-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 37253 Jan 22 2020 artemis-service-extensions-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 113290 Jan 22 2020 artemis-stomp-protocol-2.18.0.jar
-rw-r--r--. 1 amq-broker amq-broker 25718 Jan 22 2020 artemis-web-2.18.0.jar
drwxr-xr-x. 2 amq-broker amq-broker 47 Jan 22 2020 client
-rw-r--r--. 1 amq-broker amq-broker 246918 Jan 22 2020 commons-beanutils-1.9.4.jar
-rw-r--r--. 1 amq-broker amq-broker 588337 Jan 22 2020 commons-collections-3.2.2.jar
-rw-r--r--. 1 amq-broker amq-broker 622580 Jan 22 2020 commons-configuration2-2.7.jar
-rw-r--r--. 1 amq-broker amq-broker 208475 Jan 22 2020 commons-dbcp2-2.7.0.jar
-rw-r--r--. 1 amq-broker amq-broker 587402 Jan 22 2020 commons-lang3-3.12.0.jar
-rw-r--r--. 1 amq-broker amq-broker 61829 Jan 22 2020 commons-logging-1.2.jar
-rw-r--r--. 1 amq-broker amq-broker 129592 Jan 22 2020 commons-pool2-2.7.0.jar
-rw-r--r--. 1 amq-broker amq-broker 207030 Jan 22 2020 commons-text-1.8.jar
-rw-r--r--. 1 amq-broker amq-broker 2982285 Jan 22 2020 curator-client-5.1.0.jar
-rw-r--r--. 1 amq-broker amq-broker 324263 Jan 22 2020 curator-framework-5.1.0.jar
-rw-r--r--. 1 amq-broker amq-broker 315120 Jan 22 2020 curator-recipes-5.1.0.jar
-rw-r--r--. 1 amq-broker amq-broker 2862361 Jan 22 2020 guava-30.1-jre.jar
-rw-r--r--. 1 amq-broker amq-broker 50155 Jan 22 2020 hawtbuf-1.11.jar
-rw-r--r--. 1 amq-broker amq-broker 46613 Jan 22 2020 jakarta.activation-api-1.2.2.jar
-rw-r--r--. 1 amq-broker amq-broker 9990 Jan 22 2020 jakarta.inject-api-1.0.3.jar
-rw-r--r--. 1 amq-broker amq-broker 57220 Jan 22 2020 jakarta.jms-api-2.0.3.jar
-rw-r--r--. 1 amq-broker amq-broker 43361 Jan 22 2020 jakarta.json-api-1.1.6.jar
-rw-r--r--. 1 amq-broker amq-broker 42750 Jan 22 2020 jakarta.security.auth.message-api-1.1.3.jar
-rw-r--r--. 1 amq-broker amq-broker 15392 Jan 22 2020 jakarta.transaction-api-1.3.3.jar
-rw-r--r--. 1 amq-broker amq-broker 115638 Jan 22 2020 jakarta.xml.bind-api-2.3.3.jar
-rw-r--r--. 1 amq-broker amq-broker 1133924 Jan 22 2020 jaxb-impl-2.3.3.jar
-rw-r--r--. 1 amq-broker amq-broker 60911 Jan 22 2020 jboss-logging-3.4.2.Final.jar
-rw-r--r--. 1 amq-broker amq-broker 420403 Jan 22 2020 jboss-logmanager-2.1.10.Final.jar
-rw-r--r--. 1 amq-broker amq-broker 252020 Jan 22 2020 jctools-core-2.1.2.jar
-rw-r--r--. 1 amq-broker amq-broker 3944338 Jan 22 2020 jetty-all-9.4.43.v20210629-uber.jar
-rw-r--r--. 1 amq-broker amq-broker 2501842 Jan 22 2020 jgroups-3.6.13.Final.jar
-rw-r--r--. 1 amq-broker amq-broker 88976 Jan 22 2020 johnzon-core-0.9.5.jar
-rw-r--r--. 1 amq-broker amq-broker 609773 Jan 22 2020 micrometer-core-1.6.3.jar
-rw-r--r--. 1 amq-broker amq-broker 4473968 Jan 22 2020 netty-all-4.1.66.Final.jar
-rw-r--r--. 1 amq-broker amq-broker 742536 Jan 22 2020 proton-j-0.33.8.jar
-rw-r--r--. 1 amq-broker amq-broker 803568 Jan 22 2020 qpid-jms-client-0.59.0.jar
-rw-r--r--. 1 amq-broker amq-broker 41071 Jan 22 2020 slf4j-api-1.7.21.jar
-rw-r--r--. 1 amq-broker amq-broker 10491 Jan 22 2020 slf4j-jboss-logmanager-1.0.4.GA.jar
-rw-r--r--. 1 amq-broker amq-broker 244276 Jan 22 2020 tomcat-servlet-api-8.5.5.jar
-rw-r--r--. 1 amq-broker amq-broker 283909 Jan 22 2020 wildfly-common-1.5.2.Final.jar
-rw-r--r--. 1 amq-broker amq-broker 1254153 Jan 22 2020 zookeeper-3.6.3.jar
-rw-r--r--. 1 amq-broker amq-broker 250399 Jan 22 2020 zookeeper-jute-3.6.3.jar
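As a workaround sketch (the jar name, destination path, and ownership are taken from the error message and directory listing above; obtaining the jar separately is assumed), the plugin could be pre-placed before running the role:

```yaml
# Hypothetical workaround: copy a separately obtained plugin jar into the
# Artemis lib directory so the "Ensure lib is available to instance" task
# finds its source file.
- name: Pre-place the Prometheus metrics plugin jar
  become: true
  ansible.builtin.copy:
    src: artemis-prometheus-metrics-plugin-1.1.0.redhat-00002.jar
    dest: /opt/amq/apache-artemis-2.18.0/lib/
    owner: amq-broker
    group: amq-broker
    mode: '0644'
```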
The AMQ broker does not start correctly due to a problem in the sysconfig file. The Java arguments contain the variable ARTEMIS_INSTANCE_ETC_URI, which is not defined in the sysconfig file. This leads to a problem starting the service and a 503 error for the web console.
1. Start the broker from the main playbook:
ansible-playbook -i hosts_vagrant.yml playbooks/activemq.yml -v
❯ cat hosts_vagrant.yml
all:
  children:
    amq:
      children:
        left:
          hosts: 192.168.2.211
        right:
          hosts: 192.168.2.212
The service should start AMQ correctly.
Instead, the service contains an uninterpolated variable:
[root@amq1 log]# systemctl status amq-broker.service | more
● amq-broker.service - amq-broker Apache ActiveMQ Service
Loaded: loaded (/etc/systemd/system/amq-broker.service; enabled; vendor preset: disabled)
Active: active (running) since Fri 2022-09-02 08:11:07 UTC; 5min ago
Process: 17291 ExecStop=/opt/amq/amq-broker/bin/artemis-service stop (code=exited, status=0/SUCCESS)
Process: 17338 ExecStart=/opt/amq/amq-broker/bin/artemis-service start (code=exited, status=0/SUCCESS)
Main PID: 17346 (java)
Tasks: 47 (limit: 5953)
Memory: 179.5M
CGroup: /system.slice/amq-broker.service
└─17346 /usr/lib/jvm/java-11-openjdk-11.0.16.0.8-1.el8_6.x86_64/bin/java -Xms512M -Xmx2G -XX:+PrintClassHistogram -XX:+UseG1GC -XX:+UseStringDeduplication -Dhawtio.disableProxy=true -Dhawtio.realm=activemq -Dhawti
o.offline=true -Dhawtio.rolePrincipalClasses=org.apache.activemq.artemis.spi.core.security.jaas.RolePrincipal -Djolokia.policyLocation=${ARTEMIS_INSTANCE_ETC_URI}jolokia-access.xml -Dhawtio.role=amq -Xbootclasspath/a:/opt/
amq/apache-artemis-2.18.0/lib/jboss-logmanager-2.1.10.Final.jar:/opt/amq/apache-artemis-2.18.0/lib/wildfly-common-1.5.2.Final.jar -Djava.security.auth.login.config=/opt/amq/amq-broker/etc/login.config -classpath /opt/amq/a
pache-artemis-2.18.0/lib/artemis-boot.jar -Dartemis.home=/opt/amq/apache-artemis-2.18.0 -Dartemis.instance=/opt/amq/amq-broker -Djava.library.path=/opt/amq/apache-artemis-2.18.0/bin/lib/linux-x86_64 -Djava.io.tmpdir=/opt/a
mq/amq-broker/tmp -Ddata.dir=/opt/amq/amq-broker/data -Dartemis.instance.etc=/opt/amq/amq-broker/etc -Djava.util.logging.manager=org.jboss.logmanager.LogManager -Dlogging.configuration=file:/opt/amq/amq-broker/etc//logging
.properties org.apache.activemq.artemis.boot.Artemis run
Sep 02 08:11:06 amq1.test.local systemd[1]: amq-broker.service: Succeeded.
Sep 02 08:11:06 amq1.test.local systemd[1]: Stopped amq-broker Apache ActiveMQ Service.
Sep 02 08:11:06 amq1.test.local systemd[1]: Starting amq-broker Apache ActiveMQ Service...
Sep 02 08:11:06 amq1.test.local artemis-service[17338]: Starting artemis-service
Sep 02 08:11:07 amq1.test.local artemis-service[17338]: artemis-service is now running (17346)
Sep 02 08:11:07 amq1.test.local systemd[1]: Started amq-broker Apache ActiveMQ Service.
2022-09-02 08:11:10,228 WARN [org.eclipse.jetty.webapp.WebAppContext] Failed startup of context o.e.j.w.WebAppContext@cdb2d95{hawtio,/console,file:///opt/amq/amq-broker/tmp/webapps/jetty-0_0_0_0-8161-console_war-_console-any-13591837186429975029/webapp/,UNAVAILABLE}{/opt/amq/apache-artemis-2.18.0/web/console.war}: javax.servlet.ServletException: jolokia-agent==io.hawt.web.servlets.JolokiaConfiguredAgentServlet@c61e72f3{jsp=null,order=1,inst=true,async=false,src=DESCRIPTOR:file:///opt/amq/amq-broker/tmp/webapps/jetty-0_0_0_0-8161-console_war-_console-any-13591837186429975029/webapp/WEB-INF/web.xml,STARTED}
Caused by: java.lang.IllegalArgumentException: Unknown expression ARTEMIS_INSTANCE_ETC_URI in ${ARTEMIS_INSTANCE_ETC_URI}jolokia-access.xml
at org.jolokia.util.NetworkUtil.replaceExpression(NetworkUtil.java:362)
at org.jolokia.restrictor.RestrictorFactory.createRestrictor(RestrictorFactory.java:48)
at org.jolokia.http.AgentServlet.createRestrictor(AgentServlet.java:195)
at org.jolokia.http.AgentServlet.init(AgentServlet.java:135)
at io.hawt.web.servlets.JolokiaConfiguredAgentServlet.init(JolokiaConfiguredAgentServlet.java:55)
at org.eclipse.jetty.servlet.ServletHolder$Wrapper.init(ServletHolder.java:1345) [jetty-all-9.4.43.v20210629-uber.jar:9.4.43.v20210629]
at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:632) [jetty-all-9.4.43.v20210629-uber.jar:9.4.43.v20210629]
... 38 more
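A possible fix, sketched under the assumption that the role's sysconfig file lives at /etc/sysconfig/<service name> and is sourced by artemis-service, is to define the missing variable there:

```shell
# /etc/sysconfig/amq-broker (assumed location and name)
# Define the etc URI so ${ARTEMIS_INSTANCE_ETC_URI} in jolokia.policyLocation
# resolves; the instance path below is taken from the command line above.
ARTEMIS_INSTANCE_ETC_URI="file:/opt/amq/amq-broker/etc/"
```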
We want to be able to change the broker's name in broker.xml, but without changing the names of all folders or the systemd unit file. Changing the broker name would improve the readability of our logs, but having a different folder structure for each broker would be very difficult to manage.
The current name for this repo might not be the best suited. We need to investigate the opportunity to use a more appropriate name, more closely linked to the community project.
Repeated running of the playbooks creates multiple security-setting blocks in the broker.xml. When I run the user-roles.yml task multiple times it creates repeated blocks in etc/broker.xml. This is not the desired configuration for the broker.xml for AMQ.
❯ ansible --version
ansible [core 2.13.3]
config file = /Users/robertfloor/amq/playbooks/ansible.cfg
configured module search path = ['/Users/robertfloor/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/Cellar/ansible/6.3.0/libexec/lib/python3.10/site-packages/ansible
ansible collection location = /Users/robertfloor/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.10.6 (main, Aug 11 2022, 13:49:25) [Clang 13.1.6 (clang-1316.0.21.2.5)]
jinja version = 3.1.2
libyaml = True
I get this problem when I run this command multiple times with default settings:
ansible-playbook -i hosts_vagrant.yml activemq.yml -v
etc/broker.xml
<security-settings>
  <security-setting match="#">
    <permission type="createNonDurableQueue" roles="amq"/>
    <permission type="deleteNonDurableQueue" roles="amq"/>
    <permission type="createDurableQueue" roles="amq"/>
    <permission type="deleteDurableQueue" roles="amq"/>
    <permission type="createAddress" roles="amq"/>
    <permission type="deleteAddress" roles="amq"/>
    <permission type="consume" roles="amq"/>
    <permission type="browse" roles="amq"/>
    <permission type="send" roles="amq"/>
    <!-- we need this otherwise ./artemis data imp wouldn't work -->
    <permission type="manage" roles="amq"/>
  </security-setting>
  <security-setting match="#">
    <permission type="createNonDurableQueue" roles="amq"/>
    <permission type="deleteNonDurableQueue" roles="amq"/>
    <permission type="createDurableQueue" roles="amq"/>
    <permission type="deleteDurableQueue" roles="amq"/>
    <permission type="createAddress" roles="amq"/>
    <permission type="deleteAddress" roles="amq"/>
    <permission type="consume" roles="amq"/>
    <permission type="browse" roles="amq"/>
    <permission type="send" roles="amq"/>
    <permission type="manage" roles="amq"/>
  </security-setting>
  <security-setting match="#">
    <permission type="createNonDurableQueue" roles="amq"/>
    <permission type="deleteNonDurableQueue" roles="amq"/>
    <permission type="createDurableQueue" roles="amq"/>
    <permission type="deleteDurableQueue" roles="amq"/>
    <permission type="createAddress" roles="amq"/>
    <permission type="deleteAddress" roles="amq"/>
    <permission type="consume" roles="amq"/>
    <permission type="browse" roles="amq"/>
    <permission type="send" roles="amq"/>
    <permission type="manage" roles="amq"/>
  </security-setting>
</security-settings>
I believe it is caused by this task:
- name: Create messaging roles permissions
  xml:
    path: "{{ amq_broker.instance_home }}/etc/broker.xml"
    xpath: /conf:configuration/core:core/core:security-settings
    input_type: xml
    add_children: "{{ lookup('template', 'security_settings.broker.xml.j2') }}"
    namespaces:
      conf: urn:activemq
      core: urn:activemq:core
    pretty_print: yes
  changed_when: False
  loop: "{{ amq_broker_roles }}"
  become: yes
  become_user: "{{ amq_broker_service_user }}"
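A sketch of an idempotent variant (the path and namespaces are reused from the task above; the removal step is my suggestion, not current role code) would first delete the existing children before re-adding them:

```yaml
# Hypothetical pre-step: drop any previously added security-setting blocks so
# the subsequent add_children task produces exactly one block per run.
- name: Remove existing security-setting blocks
  xml:
    path: "{{ amq_broker.instance_home }}/etc/broker.xml"
    xpath: /conf:configuration/core:core/core:security-settings/core:security-setting
    state: absent
    namespaces:
      conf: urn:activemq
      core: urn:activemq:core
  become: yes
  become_user: "{{ amq_broker_service_user }}"
```

The existing "Create messaging roles permissions" task would then run as before, producing a single set of blocks regardless of how often the playbook is replayed.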
Hi, thank you so far for the nice project. I was reviewing the code and was wondering why the tasks configure.yml and upgrade.yml are included from the systemd.yml task file. I think it would be more logical to include them as separate task files in main.yml. Would it be possible to change this, or are you open to a pull request that would include them as separate tasks in main.yml?
Thank you in advance,
Also, the current main.yml is a combination of include_tasks and specific log tasks, which is not the most logical combination:
- name: Check prerequisites
  ansible.builtin.include_tasks: prereqs.yml
  tags:
    - prereqs

- name: Include firewall config tasks
  ansible.builtin.include_tasks: firewalld.yml
  when: amq_broker_configure_firewalld
  tags:
    - firewall

- name: Include install tasks
  ansible.builtin.include_tasks: install.yml
  tags:
    - install

- name: Include systemd tasks
  ansible.builtin.include_tasks: systemd.yml
  tags:
    - systemd

- name: Create default logs directory
  become: yes
  ansible.builtin.file:
    state: directory
    dest: "/var/log/{{ amq_broker.service_name }}"
    owner: "{{ amq_broker_service_user }}"
    group: "{{ amq_broker_service_group }}"
    mode: 0750

- name: Link default logs directory
  become: yes
  ansible.builtin.file:
    state: link
    src: "{{ amq_broker.instance_home }}/log"
    dest: "/var/log/{{ amq_broker.service_name }}/{{ amq_broker.instance_name }}"
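A sketch of the proposed restructuring (assuming configure.yml and upgrade.yml can be lifted out of systemd.yml without breaking task ordering; logs.yml is a hypothetical new file for the two log tasks above), keeping main.yml to include_tasks only:

```yaml
- name: Include systemd tasks
  ansible.builtin.include_tasks: systemd.yml
  tags:
    - systemd

- name: Include configuration tasks
  ansible.builtin.include_tasks: configure.yml
  tags:
    - config

- name: Include upgrade tasks
  ansible.builtin.include_tasks: upgrade.yml
  tags:
    - upgrade

- name: Include log directory tasks
  ansible.builtin.include_tasks: logs.yml  # hypothetical new task file
  tags:
    - logs
```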
XSD missing for AMQ version 7.11. I am trying to install AMQ version 7.11 but I am missing the latest xsd for this version.
ansible --version
ansible [core 2.12.2]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
ansible collection location = /home/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.8.13 (default, Jun 14 2022, 17:49:07) [GCC 8.5.0 20210514 (Red Hat 8.5.0-13)]
jinja version = 2.11.3
libyaml = True
ansible-galaxy collection list
# /.ansible/collections/ansible_collections
Collection Version
----------------------------------------- -------
ansible.posix 1.5.2
community.general 6.0.1
middleware_automation.amq 1.3.3
middleware_automation.common 1.1.0
middleware_automation.redhat_csp_download 1.2.2
# /usr/lib/python3.8/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 2.1.0
ansible.netcommon 2.5.1
ansible.posix 1.3.0
ansible.utils 2.5.0
ansible.windows 1.9.0
arista.eos 3.1.0
awx.awx 19.4.0
azure.azcollection 1.11.0
check_point.mgmt 2.2.2
chocolatey.chocolatey 1.2.0
cisco.aci 2.1.0
cisco.asa 2.1.0
cisco.intersight 1.0.18
cisco.ios 2.7.1
cisco.iosxr 2.7.0
cisco.ise 1.2.1
cisco.meraki 2.6.0
cisco.mso 1.3.0
cisco.nso 1.0.3
cisco.nxos 2.9.0
cisco.ucs 1.6.0
cloud.common 2.1.0
cloudscale_ch.cloud 2.2.0
community.aws 2.3.0
community.azure 1.1.0
community.ciscosmb 1.0.4
community.crypto 2.2.2
community.digitalocean 1.15.1
community.dns 2.0.7
community.docker 2.2.0
community.fortios 1.0.0
community.general 4.5.0
community.google 1.0.0
community.grafana 1.3.2
community.hashi_vault 2.3.0
community.hrobot 1.2.2
community.kubernetes 2.0.1
community.kubevirt 1.0.0
community.libvirt 1.0.2
community.mongodb 1.3.2
community.mysql 2.3.4
community.network 3.0.0
community.okd 2.1.0
community.postgresql 1.7.0
community.proxysql 1.3.1
community.rabbitmq 1.1.0
community.routeros 2.0.0
community.skydive 1.0.0
community.sops 1.2.0
community.vmware 1.17.1
community.windows 1.9.0
community.zabbix 1.5.1
containers.podman 1.9.1
cyberark.conjur 1.1.0
cyberark.pas 1.0.13
dellemc.enterprise_sonic 1.1.0
dellemc.openmanage 4.4.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
f5networks.f5_modules 1.14.0
fortinet.fortimanager 2.1.4
fortinet.fortios 2.1.4
frr.frr 1.0.3
gluster.gluster 1.0.2
google.cloud 1.0.2
hetzner.hcloud 1.6.0
hpe.nimble 1.1.4
ibm.qradar 1.0.3
infinidat.infinibox 1.3.3
infoblox.nios_modules 1.2.1
inspur.sm 1.3.0
junipernetworks.junos 2.9.0
kubernetes.core 2.2.3
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.14.0
netapp.elementsw 21
Deploy via the default playbook
$ cat AMQ-acc-cluster.yml
all:
  children:
    amq:
      children:
        ha1:
          hosts: 1
          vars:
            artemis: 1
            node0: 2
        ha2:
          hosts: 2
          vars:
            artemis: 2
            node0: 1
  vars:
    activemq_configure_firewalld: True
    activemq_prometheus_enabled: True
    amq_broker_enable: True
    activemq_cors_strict_checking: False
    activemq_disable_hornetq_protocol: true
    activemq_disable_mqtt_protocol: true
    activemq_ha_enabled: true
    activemq_shared_storage: true
    activemq_shared_storage_path: /amq_nfs
    ansible_user:
    activemq_offline_install: True
    activemq_version: 7.11.0
    activemq_dest: /opt/amq
    activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
    activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
    activemq_shared_storage_mounted: true
    activemq_port: 61616
    activemq_instance_username: amq-admin
    activemq_sa_password: "amq-sa-password"
    activemq_testers_password: "amq-testers-password"
Ansible should install AMQ 7.11.
Instead, the playbook did not install successfully; it is missing artemis-configuration-7.11.xsd.
TASK [middleware_automation.amq.activemq : Make collection xsd available for validation] ***********************************************************************************************************************************************************************************************************************************************
task path: .ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml:68
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: If you are using a module and expect the file to exist on the remote, see the remote_src option
fatal: [1] -> localhost]: FAILED! => changed=false
msg: |-
Could not find or access 'artemis-configuration-7.11.xsd'
Searched in:
/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/files/artemis-configuration-7.11.xsd
/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/artemis-configuration-7.11.xsd
/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/files/artemis-configuration-7.11.xsd
/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/artemis-configuration-7.11.xsd
/code/ansible-configuration/playbooks/files/artemis-configuration-7.11.xsd
/code/ansible-configuration/playbooks/artemis-configuration-7.11.xsd on the Ansible Controller.
If you are using a module and expect the file to exist on the remote, see the remote_src option
TASK [middleware_automation.amq.activemq : Fetch artemis configuration xsd schema for requested version] *******************************************************************************************************************************************************************************************************************************
task path: /.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml:85
failed: [ -> localhost] (item=https://github.com/apache/activemq-artemis/raw/7.11.0/artemis-server/src/main/resources/schema/artemis-configuration.xsd) => changed=false
ansible_loop_var: item
dest: /code/ansible-configuration/
elapsed: 10
gid: 627600513
item: https://github.com/apache/activemq-artemis/raw/7.11.0/artemis-server/src/main/resources/schema/artemis-configuration.xsd
mode: '0755'
msg: 'Request failed: <urlopen error timed out>'
secontext: unconfined_u:object_r:user_home_t:s0
size: 4096
state: directory
uid: 627615280
url: https://github.com/apache/activemq-artemis/raw/7.11.0/artemis-server/src/main/resources/schema/artemis-configuration.xsd
failed: -> localhost] (item=https://www.w3.org/2001/03/xml.xsd) => changed=false
ansible_loop_var: item
dest: l/code/ansible-configuration/
elapsed: 20
gid: 627600513
item: https://www.w3.org/2001/03/xml.xsd
mode: '0755'
msg: 'Request failed: <urlopen error [Errno 101] Network is unreachable>'
secontext: unconfined_u:object_r:user_home_t:s0
size: 4096
state: directory
uid: 627615280
url: https://www.w3.org/2001/03/xml.xsd
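As a workaround sketch for offline controllers (the destination is one of the search paths from the earlier error; obtaining the schema file separately, e.g. from the AMQ 7.11 distribution, is assumed), the xsd could be pre-placed where the role looks:

```yaml
# Hypothetical pre-step on the controller: place a separately obtained copy of
# the schema into the playbook's files/ directory, one of the local search paths.
- name: Pre-place artemis configuration xsd for offline validation
  ansible.builtin.copy:
    src: /tmp/artemis-configuration-7.11.xsd  # assumed local copy
    dest: "{{ playbook_dir }}/files/artemis-configuration-7.11.xsd"
  delegate_to: localhost
  run_once: true
```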
There is a small documentation error here: https://github.com/ansible-middleware/amq/tree/main/roles/activemq
It should be activemq_tls_keystore_password instead of activemq_tls_keystore_pasword.
Sample connector with TLS:
- name: amqp
  address: 172.168.10.43
  port: 61616
  parameters:
    tcpSendBufferSize: 1048576
    tcpReceiveBufferSize: 1048576
    protocols: CORE
    useEpoll: true
    sslEnabled: True
    keyStorePath: "{{ activemq_tls_keystore_dest }}"
    keyStorePassword: "{{ activemq_tls_keystore_pasword }}"
    trustStorePath: "{{ activemq_tls_truststore_dest }}"
    trustStorePassword: "{{ activemq_tls_truststore_password }}"
    verifyHost: False
We want to be able to enable log rotation for AMQ. We would like to store only the last two days on disk, since the full audit logs take up too much space. Would this be possible?
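For reference, a hedged sketch of what two-day retention might look like in the instance's etc/logging.properties with the JBoss log manager (the handler name and file name are assumptions, not the role's current output):

```properties
# Assumed sketch: rotate the file handler daily and keep only two rotated files
handler.FILE=org.jboss.logmanager.handlers.PeriodicSizeRotatingFileHandler
handler.FILE.properties=suffix,maxBackupIndex,append,fileName
handler.FILE.suffix=.yyyy-MM-dd
handler.FILE.maxBackupIndex=2
handler.FILE.append=true
handler.FILE.fileName=${artemis.instance}/log/artemis.log
```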
When using Microsoft Active Directory as an LDAP backend for authentication, it is required to configure referral handling for the LDAPLoginModule. The default is to ignore referrals, but AD requires following them.
From the Artemis documentation:
specify how to handle referrals; valid values: ignore, follow, throw; default is ignore
So it would be nice to allow the configuration of referrals, or maybe even better, to allow providing a custom template for login.config.
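Based on the quoted documentation, the desired end state would presumably be a login.config along these lines (all option values other than referral are illustrative placeholders):

```
activemq {
    org.apache.activemq.artemis.spi.core.security.jaas.LDAPLoginModule required
        initialContextFactory=com.sun.jndi.ldap.LdapCtxFactory
        connectionURL="ldap://ad.example.com:389"
        connectionUsername="CN=svc-amq,OU=Service Accounts,DC=example,DC=com"
        connectionPassword=secret
        referral="follow"
        ;
};
```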
Be able to run the playbook with the --check option to see changes before they are applied. We would like to determine the changes to be made by the playbook before they are applied. We can deploy the playbook, but --check does not work:
ansible-playbook -i hostfiles/dev/AMQ-dev-cluster.yml -e activemq_version="7.11.1" playbooks/install-broker-dev.yml --check -v
However, currently this goes wrong, and I cannot see why due to no_log: true:
TASK [middleware_automation.amq.activemq : Add masked password to users list] ***************************************************************************************************************
fatal: [amq1]: FAILED! =>
censored: 'the output has been hidden due to the fact that ''no_log: true'' was specified for this result'
fatal: [amq2]: FAILED! =>
censored: 'the output has been hidden due to the fact that ''no_log: true'' was specified for this result'
This task works in normal install mode.
We want to run tasks that change only users or queues on our production brokers. Even if the user changes other settings, only the tasks with the given tag should run. I see that the playbook has support for tags, but for us this is not working correctly.
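One likely cause, stated here as an assumption about the role internals: with ansible.builtin.include_tasks, a tag on the include statement is not inherited by the included tasks; inheritance requires the apply keyword, e.g.:

```yaml
- name: Include user management tasks
  ansible.builtin.include_tasks:
    file: user-roles.yml  # file name taken from an earlier report; may differ
    apply:
      tags:
        - users
  tags:
    - users
```

This would match the output further below, where --tags install reaches the "Include install tasks" step but none of the tasks inside install.yml actually run.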
❯ ansible --version
ansible [core 2.15.3]
config file = /home/robert/AMQ-Ansible-config/ansible-configuration/ansible.cfg
configured module search path = ['/home/robert/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/linuxbrew/.linuxbrew/Cellar/ansible/8.3.0/libexec/lib/python3.11/site-packages/ansible
ansible collection location = /home/robert/.ansible/collections:/usr/share/ansible/collections
executable location = /home/linuxbrew/.linuxbrew/bin/ansible
python version = 3.11.5 (main, Aug 24 2023, 12:23:19) [GCC 11.4.0] (/home/linuxbrew/.linuxbrew/Cellar/ansible/8.3.0/libexec/bin/python)
jinja version = 3.1.2
libyaml = True
# /home/linuxbrew/.linuxbrew/Cellar/ansible/8.3.0/libexec/lib/python3.11/site-packages/ansible_collections
Collection Version
----------------------------------------- -------
amazon.aws 6.3.0
ansible.netcommon 5.1.2
ansible.posix 1.5.4
ansible.utils 2.10.3
ansible.windows 1.14.0
arista.eos 6.0.1
awx.awx 22.6.0
azure.azcollection 1.16.0
check_point.mgmt 5.1.1
chocolatey.chocolatey 1.5.1
cisco.aci 2.7.0
cisco.asa 4.0.1
cisco.dnac 6.7.3
cisco.intersight 1.0.27
cisco.ios 4.6.1
cisco.iosxr 5.0.3
cisco.ise 2.5.14
cisco.meraki 2.15.3
cisco.mso 2.5.0
cisco.nso 1.0.3
cisco.nxos 4.4.0
cisco.ucs 1.10.0
cloud.common 2.1.4
cloudscale_ch.cloud 2.3.1
community.aws 6.2.0
community.azure 2.0.0
community.ciscosmb 1.0.6
community.crypto 2.15.0
community.digitalocean 1.24.0
community.dns 2.6.0
community.docker 3.4.8
community.fortios 1.0.0
community.general 7.3.0
community.google 1.0.0
community.grafana 1.5.4
community.hashi_vault 5.0.0
community.hrobot 1.8.1
community.libvirt 1.2.0
community.mongodb 1.6.1
community.mysql 3.7.2
community.network 5.0.0
community.okd 2.3.0
community.postgresql 2.4.3
community.proxysql 1.5.1
community.rabbitmq 1.2.3
community.routeros 2.9.0
community.sap 1.0.0
community.sap_libs 1.4.1
community.skydive 1.0.0
community.sops 1.6.4
community.vmware 3.9.0
community.windows 1.13.0
community.zabbix 2.1.0
containers.podman 1.10.2
cyberark.conjur 1.2.0
cyberark.pas 1.0.19
dellemc.enterprise_sonic 2.2.0
dellemc.openmanage 7.6.1
dellemc.powerflex 1.7.0
dellemc.unity 1.7.1
f5networks.f5_modules 1.25.1
fortinet.fortimanager 2.2.1
fortinet.fortios 2.3.1
frr.frr 2.0.2
gluster.gluster 1.0.2
google.cloud 1.2.0
grafana.grafana 2.1.5
hetzner.hcloud 1.16.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.12.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.5.0
inspur.ispim 1.3.0
inspur.sm 2.3.0
junipernetworks.junos 5.2.0
kubernetes.core 2.4.0
lowlydba.sqlserver 2.1.0
microsoft.ad 1.3.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.22.0
netapp.elementsw 21.7.0
netapp.ontap 22.7.0
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.4.0
netbox.netbox 3.13.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.3
openstack.cloud 2.1.0
openvswitch.openvswitch 2.1.1
ovirt.ovirt 3.1.2
purestorage.flasharray 1.20.0
purestorage.flashblade 1.12.1
purestorage.fusion 1.6.0
sensu.sensu_go 1.14.0
servicenow.servicenow 1.0.6
splunk.es 2.1.0
t_systems_mms.icinga_director 1.33.1
telekom_mms.icinga_director 1.34.1
theforeman.foreman 3.12.0
vmware.vmware_rest 2.3.1
vultr.cloud 1.8.0
vyos.vyos 4.1.0
wti.remote 1.0.5
# /home/robert/.ansible/collections/ansible_collections
Collection Version
----------------------------------------- -------
ansible.posix 1.5.4
community.general 6.0.1
middleware_automation.amq 1.3.11
middleware_automation.common 1.1.2
middleware_automation.redhat_csp_download 1.2.2
❯ ansible-playbook -i hostfiles/dev/AMQ-dev-cluster.yml -e activemq_version="7.11.1" playbooks/install-broker-dev.yml --list-tasks --tags install
playbook: playbooks/install-broker-dev.yml
play #1 (all): Playbook for Red Hat AMQ Broker TAGS: []
tasks:
middleware_automation.amq.activemq : Validating arguments against arg spec 'main' TAGS: [always]
middleware_automation.amq.activemq : Include install tasks TAGS: [install]
❯ ansible-playbook -i hostfiles/dev/AMQ-dev-cluster.yml -e activemq_version="7.11.1" playbooks/install-broker-dev.yml -v --tags "install"
Using /home/robert/AMQ-Ansible-config/ansible-configuration/ansible.cfg as config file
PLAY [Playbook for Red Hat AMQ Broker] *************************************************************************************************************************************************************************************************************************************************
TASK [Gathering Facts] *****************************************************************************************************************************************************************************************************************************************************************
ok: [amq1]
ok: [amq2]
TASK [middleware_automation.amq.activemq : Validating arguments against arg spec 'main'] ***********************************************************************************************************************************************************************************************
ok: [amq1] => changed=false
msg: The arg spec validation passed
validate_args_context:
argument_spec_name: main
name: activemq
path: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq
type: role
ok: [amq2] => changed=false
msg: The arg spec validation passed
validate_args_context:
argument_spec_name: main
name: activemq
path: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq
type: role
TASK [middleware_automation.amq.activemq : Include install tasks] **********************************************************************************************************************************************************************************************************************
included: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/install.yml for amq1, amq2
PLAY RECAP *****************************************************************************************************************************************************************************************************************************************************************************
amq1 : ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
amq2 : ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
❯ ansible-playbook -i hostfiles/dev/AMQ-dev-cluster.yml -e activemq_version="7.11.1" playbooks/install-broker-dev.yml -v --tags "users"
Using /home/robert/AMQ-Ansible-config/ansible-configuration/ansible.cfg as config file
PLAY [Playbook for Red Hat AMQ Broker] *************************************************************************************************************************************************************************************************************************************************
TASK [Gathering Facts] *****************************************************************************************************************************************************************************************************************************************************************
ok: [amq2]
ok: [amq1]
TASK [middleware_automation.amq.activemq : Validating arguments against arg spec 'main'] ***********************************************************************************************************************************************************************************************
ok: [amq1] => changed=false
msg: The arg spec validation passed
validate_args_context:
argument_spec_name: main
name: activemq
path: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq
type: role
ok: [amq2] => changed=false
msg: The arg spec validation passed
validate_args_context:
argument_spec_name: main
name: activemq
path: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq
type: role
PLAY RECAP *****************************************************************************************************************************************************************************************************************************************************************************
amq1 : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
amq2 : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
❯ ansible-playbook -i hostfiles/dev/AMQ-dev-cluster.yml -e activemq_version="7.11.1" playbooks/install-broker-dev.yml -v --tags "systemd"
Using /home/robert/AMQ-Ansible-config/ansible-configuration/ansible.cfg as config file
PLAY [Playbook for Red Hat AMQ Broker] *************************************************************************************************************************************************************************************************************************************************
TASK [Gathering Facts] *****************************************************************************************************************************************************************************************************************************************************************
ok: [amq1]
ok: [amq2]
TASK [middleware_automation.amq.activemq : Validating arguments against arg spec 'main'] ***********************************************************************************************************************************************************************************************
ok: [amq2] => changed=false
msg: The arg spec validation passed
validate_args_context:
argument_spec_name: main
name: activemq
path: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq
type: role
ok: [amq1] => changed=false
msg: The arg spec validation passed
validate_args_context:
argument_spec_name: main
name: activemq
path: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq
type: role
TASK [middleware_automation.amq.activemq : Include systemd tasks] **********************************************************************************************************************************************************************************************************************
included: /home/robert/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/systemd.yml for amq1, amq2
PLAY RECAP *****************************************************************************************************************************************************************************************************************************************************************************
amq1 : ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
amq2 : ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
We expect the included tasks, such as the firewall or user tasks, to be executed fully; currently only the include task itself runs. The included tasks are not run.
We want to set global_max_size in the Ansible configuration for the broker. Currently, it seems this setting has no effect.
ansible [core 2.14.2]
config file = /etc/ansible/ansible.cfg
configured module search path = ['//.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.11/site-packages/ansible
ansible collection location =.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.11.2 (main, Feb 17 2023, 09:28:16) [GCC 8.5.0 20210514 (Red Hat 8.5.0-18)] (/usr/bin/python3.11)
jinja version = 3.1.2
libyaml = True
# /usr/lib/python3.11/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 5.2.0
ansible.netcommon 4.1.0
ansible.posix 1.5.1
ansible.utils 2.9.0
ansible.windows 1.13.0
arista.eos 6.0.0
awx.awx 21.11.0
azure.azcollection 1.14.0
check_point.mgmt 4.0.0
chocolatey.chocolatey 1.4.0
cisco.aci 2.3.0
cisco.asa 4.0.0
cisco.dnac 6.6.3
cisco.intersight 1.0.23
cisco.ios 4.3.1
cisco.iosxr 4.1.0
cisco.ise 2.5.12
cisco.meraki 2.15.0
cisco.mso 2.2.1
cisco.nso 1.0.3
cisco.nxos 4.0.1
cisco.ucs 1.8.0
cloud.common 2.1.2
cloudscale_ch.cloud 2.2.4
community.aws 5.2.0
community.azure 2.0.0
community.ciscosmb 1.0.5
community.crypto 2.10.0
community.digitalocean 1.23.0
community.dns 2.5.0
community.docker 3.4.0
community.fortios 1.0.0
community.general 6.3.0
community.google 1.0.0
community.grafana 1.5.3
community.hashi_vault 4.1.0
community.hrobot 1.7.0
community.libvirt 1.2.0
community.mongodb 1.4.2
community.mysql 3.5.1
community.network 5.0.0
community.okd 2.2.0
community.postgresql 2.3.2
community.proxysql 1.5.1
community.rabbitmq 1.2.3
community.routeros 2.7.0
community.sap 1.0.0
community.sap_libs 1.4.0
community.skydive 1.0.0
community.sops 1.6.0
community.vmware 3.3.0
community.windows 1.12.0
community.zabbix 1.9.1
containers.podman 1.10.1
cyberark.conjur 1.2.0
cyberark.pas 1.0.17
dellemc.enterprise_sonic 2.0.0
dellemc.openmanage 6.3.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
dellemc.powerflex 1.5.0
dellemc.unity 1.5.0
f5networks.f5_modules 1.22.0
fortinet.fortimanager 2.1.7
fortinet.fortios 2.2.2
frr.frr 2.0.0
gluster.gluster 1.0.2
google.cloud 1.1.2
grafana.grafana 1.1.0
hetzner.hcloud 1.9.1
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.11.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.4.1
inspur.ispim 1.2.0
inspur.sm 2.3.0
junipernetworks.junos 4.1.0
kubernetes.core 2.3.2
lowlydba.sqlserver 1.3.1
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.22.0
netapp.elementsw 21.7.0
netapp.ontap 22.2.0
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.4.0
netbox.netbox 3.10.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.3
openstack.cloud 1.10.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.4.1
purestorage.flasharray 1.16.2
purestorage.flashblade 1.10.0
purestorage.fusion 1.3.0
sensu.sensu_go 1.13.2
splunk.es 2.1.0
t_systems_mms.icinga_director 1.32.0
theforeman.foreman 3.8.0
vmware.vmware_rest 2.2.0
vultr.cloud 1.7.0
vyos.vyos 4.0.0
wti.remote 1.0.4
# ns
Collection Version
----------------------------------------- -------
ansible.posix 1.5.4
community.general 6.0.1
middleware_automation.amq 1.3.6
middleware_automation.common 1.1.0
middleware_automation.redhat_csp_download 1.2.2
all:
children:
amq:
children:
ha1:
hosts: xxxx
vars:
artemis: xxxx
node0: xxx
ha2:
hosts: xxxx
vars:
artemis: xxxx
node0: xxxx
vars:
iface: ens192
activemq_configure_firewalld: True
activemq_prometheus_enabled: True
amq_broker_enable: True
activemq_cors_strict_checking: False
activemq_disable_hornetq_protocol: true
activemq_disable_mqtt_protocol: true
activemq_ha_enabled: true
activemq_shared_storage: true
activemq_shared_storage_path: /mnt/asb2-acc-internal
activemq_offline_install: True
activemq_java_home: /etc/alternatives/jre_11_openjdk
activemq_version: 7.11.0
activemq_dest: /opt/amq
activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
activemq_shared_storage_mounted: true
activemq_port: 61616
activemq_instance_username: amq-admin
activemq_enable_audit: true
activemq_hawtio_role: "amq,g-ap-com-amq-acc-internal"
global_max_size: 10737418240
activemq_address_settings:
- match: "#"
parameters:
dead_letter_address: DLQ
expiry_address: ExpiryQueue
redelivery_delay: 2000
max_size_bytes: 10737418240
message_counter_history_day_limit: 10
max_delivery_attempts: -1
max_redelivery_delay: 300000
redelivery_delay_multiplier: 2
address_full_policy: PAGE
auto_create_queues: true
auto_create_addresses: true
auto_create_jms_queues: true
auto_create_jms_topics: true
activemq_users:
- user: "{{ activemq_instance_username }}"
password: "{{ activemq_instance_password }}"
roles: [ amq ]
- user: "asb-application-sa"
password: "{{ activemq_sa_password }}"
roles: [ amq ]
- user: "asb-testers-sa"
password: "{{ activemq_testers_password }}"
roles: [ sa-amq ]
activemq_roles:
- name: amq
match: '#'
permissions: [ createDurableQueue, deleteDurableQueue, createNonDurableQueue, deleteNonDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
- name: sa-amq
match: '#'
permissions: [ createDurableQueue, createAddress, consume, browse, send ]
- name: g-ap-com-amq-acc-internal
match: '#'
permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
activemq_tls_enabled: True
activemq_tls_keystore_path: "keystore.jks"
activemq_tls_truststore_path: "truststore.jks"
activemq_tls_truststore_password: "changeit"
activemq_acceptors:
- name: amqp
bind_address: "0.0.0.0"
bind_port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,OPENWIRE
useEpoll: true
sslEnabled: True
keyStorePath: "{{ activemq_tls_keystore_dest }}"
keyStorePassword: "{{ activemq_tls_keystore_password }}"
trustStorePath: "{{ activemq_tls_truststore_dest }}"
trustStorePassword: "{{ activemq_tls_truststore_password }}"
verifyHost: False
activemq_connectors:
- name: artemis
address: "{{ artemis }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
sslEnabled: True
keyStorePath: "{{ activemq_tls_keystore_dest }}"
keyStorePassword: "{{ activemq_tls_keystore_password }}"
trustStorePath: "{{ activemq_tls_truststore_dest }}"
trustStorePassword: "{{ activemq_tls_truststore_password }}"
- name: node0
address: "{{ node0 }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
sslEnabled: True
keyStorePath: "{{ activemq_tls_keystore_dest }}"
keyStorePassword: "{{ activemq_tls_keystore_password }}"
trustStorePath: "{{ activemq_tls_truststore_dest }}"
trustStorePassword: "{{ activemq_tls_truststore_password }}"
I would expect global-max-size to be set to the specified value in broker.xml
global-max-size was not changed and remained commented out
See this snippet
<!-- how often we are looking for how many bytes are being used on the disk in ms -->
<disk-scan-period>5000</disk-scan-period>
<!-- once the disk hits this limit the system will block, or close the connection in certain protocols
that won't support flow control. -->
<max-disk-usage>90</max-disk-usage>
<!-- should the broker detect dead locks and other issues -->
<critical-analyzer>true</critical-analyzer>
<critical-analyzer-timeout>120000</critical-analyzer-timeout>
<critical-analyzer-check-period>60000</critical-analyzer-check-period>
<critical-analyzer-policy>HALT</critical-analyzer-policy>
<page-sync-timeout>604000</page-sync-timeout>
<!-- the system will enter into page mode once you hit this limit. This is an estimate in bytes of how much the messages are using in memory
The system will use half of the available memory (-Xmx) by default for the global-max-size.
You may specify a different value here if you need to customize it to your needs.
<global-max-size>100Mb</global-max-size> -->
<!-- the maximum number of messages accepted before entering full address mode.
if global-max-size is specified the full address mode will be specified by whatever hits it first. -->
<global-max-messages>-1</global-max-messages>
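For reference, this is roughly what we expected the role to render instead of the commented-out default (a sketch only, using the `global_max_size` value from the inventory above; whether and how the role maps that variable into broker.xml is exactly what this report questions):

```xml
<!-- expected (sketch): global_max_size from the inventory rendered here -->
<global-max-size>10737418240</global-max-size>
```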
The ownership of the log files differs between files. AMQ creates a new log file each day, and the log files in /var/log/activemq/amq-broker are owned either by amq-broker or by root: the initial file is owned by amq-broker, but later ones are owned by root. For our monitoring solution it would be best if none of them were owned by root.
We are running AMQ 7.11.
#------
Collection Version
----------------------------------------- -------
ansible.posix 1.5.2
community.general 6.0.1
middleware_automation.amq 1.3.5
middleware_automation.common 1.1.0
middleware_automation.redhat_csp_download 1.2.2
/var/log/activemq/amq-broker
bash-4.4$ ls -alh
total 172K
drwxr-xr-x. 2 amq-broker amq-broker 102 Jun 5 10:20 .
drwxrwxr-x. 9 amq-broker amq-broker 85 May 22 16:48 ..
-rw-r--r--. 1 root root 49K Jun 5 13:43 artemis.log
-rw-r--r--. 1 amq-broker amq-broker 16K May 22 16:51 artemis.log.2023-05-22
-rw-r--r--. 1 root root 103K Jun 2 15:38 artemis.log.2023-06-02
-rw-r--r--. 1 root root 0 Jun 5 10:20 audit.log
improvement
Hi, I have downloaded the collection from https://console.redhat.com/ansible/automation-hub/repo/published/redhat/amq_broker/docs/, and I have found that it fails to start.
$ ansible --version
ansible [core 2.14.5]
config file = None
configured module search path = xxx
ansible python module location = /opt/homebrew/Cellar/ansible/7.5.0/libexec/lib/python3.11/site-packages/ansible
ansible collection location = xxx
executable location = /opt/homebrew/bin/ansible
python version = 3.11.3 (main, Apr 7 2023, 20:13:31) [Clang 14.0.0 (clang-1400.0.29.202)] (/opt/homebrew/Cellar/ansible/7.5.0/libexec/bin/python3.11)
jinja version = 3.1.2
libyaml = True
1.3.7
ansible-playbook -i ~/ansible_inventory --connection=local activemq.yml
Expected: the playbook progresses its execution.
$ pwd
~/.ansible/collections/ansible_collections/redhat/amq_broker/playbooks
$ ansible-playbook -i ~/ansible_inventory --connection=local activemq.yml
amq_broker_disable_stomp_protocol: False
[WARNING]: running playbook inside collection redhat.amq_broker
PLAY [Playbook for ActiveMQ Artemis] ***************************************************************************************************
TASK [Gathering Facts] *****************************************************************************************************************
[WARNING]: Platform darwin on host localhost is using the discovered Python interpreter at /opt/homebrew/bin/python3.11, but future
installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-
core/2.14/reference_appendices/interpreter_discovery.html for more information.
ok: [localhost]
TASK [redhat.amq_broker.amq_broker : Validating arguments against arg spec 'main'] *****************************************************
fatal: [localhost]: FAILED! => {"msg": "'activemq_version' is undefined. 'activemq_version' is undefined"}
PLAY RECAP *****************************************************************************************************************************
localhost : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
I could temporarily progress in my playbook's execution by editing the file roles/amq_broker/defaults/main.yml, thus replacing:
amq_broker_logger_config_template: "{{ 'log4j2.properties' if activemq_version is version_compare('7.11.0', '>=') else 'logging.properties' }}"
with
amq_broker_logger_config_template: "{{ 'log4j2.properties' if amq_broker_version is version_compare('7.11.0', '>=') else 'logging.properties' }}"
The Java version used by systemd is hardcoded in /etc/sysconfig/amq-broker, generated from this template: https://github.com/ansible-middleware/amq/blob/main/roles/activemq/templates/amq_broker.sysconfig.j2. If the system packages are updated without regenerating this file, and the update includes Java, the broker can no longer be started via systemd. We have different people responsible for updating systems and running the application, so the infrastructure team can update the packages without the AMQ team knowing. This leads to a situation where the broker cannot start with systemd. I therefore believe this tight coupling of the file to a specific Java version is undesirable.
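One way to decouple the file from a specific Java build would be to point JAVA_HOME at a version-independent alternatives path instead of a hard-coded JDK directory. A sketch of what the sysconfig template could render (the `activemq_java_home` default shown here is an assumption for illustration, not the role's current behaviour):

```jinja
# /etc/sysconfig/amq-broker (sketch)
JAVA_HOME={{ activemq_java_home | default('/etc/alternatives/jre') }}
```

With an alternatives symlink, a minor Java package update moves the symlink target, and the sysconfig file keeps working without regeneration.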
ansible [core 2.12.2]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/[email protected]/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
ansible collection location = /home/[email protected]/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.8.13 (default, Jun 14 2022, 17:49:07) [GCC 8.5.0 20210514 (Red Hat 8.5.0-13)]
jinja version = 2.11.3
libyaml = True
Collection Version
----------------------------------------- -------
ansible.posix 1.5.4
community.general 6.0.1
middleware_automation.amq 1.3.6
middleware_automation.common 1.1.0
middleware_automation.redhat_csp_download 1.2.2
# /usr/lib/python3.8/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 2.1.0
ansible.netcommon 2.5.1
ansible.posix 1.3.0
ansible.utils 2.5.0
ansible.windows 1.9.0
arista.eos 3.1.0
awx.awx 19.4.0
azure.azcollection 1.11.0
check_point.mgmt 2.2.2
chocolatey.chocolatey 1.2.0
cisco.aci 2.1.0
cisco.asa 2.1.0
cisco.intersight 1.0.18
cisco.ios 2.7.1
cisco.iosxr 2.7.0
cisco.ise 1.2.1
cisco.meraki 2.6.0
cisco.mso 1.3.0
cisco.nso 1.0.3
cisco.nxos 2.9.0
cisco.ucs 1.6.0
cloud.common 2.1.0
cloudscale_ch.cloud 2.2.0
community.aws 2.3.0
community.azure 1.1.0
community.ciscosmb 1.0.4
community.crypto 2.2.2
community.digitalocean 1.15.1
community.dns 2.0.7
community.docker 2.2.0
community.fortios 1.0.0
community.general 4.5.0
community.google 1.0.0
community.grafana 1.3.2
community.hashi_vault 2.3.0
community.hrobot 1.2.2
community.kubernetes 2.0.1
community.kubevirt 1.0.0
community.libvirt 1.0.2
community.mongodb 1.3.2
community.mysql 2.3.4
community.network 3.0.0
community.okd 2.1.0
community.postgresql 1.7.0
community.proxysql 1.3.1
community.rabbitmq 1.1.0
community.routeros 2.0.0
community.skydive 1.0.0
community.sops 1.2.0
community.vmware 1.17.1
community.windows 1.9.0
community.zabbix 1.5.1
containers.podman 1.9.1
cyberark.conjur 1.1.0
cyberark.pas 1.0.13
dellemc.enterprise_sonic 1.1.0
dellemc.openmanage 4.4.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
f5networks.f5_modules 1.14.0
fortinet.fortimanager 2.1.4
fortinet.fortios 2.1.4
frr.frr 1.0.3
gluster.gluster 1.0.2
google.cloud 1.0.2
hetzner.hcloud 1.6.0
hpe.nimble 1.1.4
ibm.qradar 1.0.3
infinidat.infinibox 1.3.3
infoblox.nios_modules 1.2.1
inspur.sm 1.3.0
junipernetworks.junos 2.9.0
kubernetes.core 2.2.3
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.14.0
netapp.elementsw 21.7.0
netapp.ontap 21.16.0
netapp.storagegrid 21.9.0
netapp.um_info 21.8.0
netapp_eseries.santricity 1.2.13
netbox.netbox 3.5.1
ngine_io.cloudstack 2.2.3
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.0
openstack.cloud 1.7.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 1.6.6
purestorage.flasharray 1.12.1
purestorage.flashblade 1.9.0
sensu.sensu_go 1.13.0
servicenow.servicenow 1.0.6
splunk.es 1.0.2
t_systems_mms.icinga_director 1.27.1
theforeman.foreman 2.2.0
vyos.vyos 2.7.0
wti.remote 1.0.3
systemctl start amq-broker.service
no longer works, since JAVA_HOME points to an old Java version. I expect the broker to continue to work with systemd after a Java minor update.
systemctl did not give any logging other than that it cannot start. It was still possible to start the broker via artemis-service.
Running the default playbook, I get the following problem.
ansible-galaxy collection install -r requirements.yml --force
ansible-playbook -i hosts_vagrant.yml playbooks/activemq.yml -v
_______________________________________________________
< TASK [activemq : Parse passwd for existing user salt] >
-------------------------------------------------------
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
[WARNING]: an unexpected error occurred during Jinja2 environment setup: unable to locate collection middleware_automation.amq
fatal: [192.168.2.211]: FAILED! => {"msg": "template error while templating string: unable to locate collection middleware_automation.amq. String: {{ item.password | middleware_automation.amq.pbkdf2_hmac(hexsalt=existing_user[0]) }}"}
[WARNING]: an unexpected error occurred during Jinja2 environment setup: unable to locate collection middleware_automation.amq
fatal: [192.168.2.212]: FAILED! => {"msg": "template error while templating string: unable to locate collection middleware_automation.amq. String: {{ item.password | middleware_automation.amq.pbkdf2_hmac(hexsalt=existing_user[0]) }}"}
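The failing expression calls the collection's `pbkdf2_hmac` filter to re-hash a password with the salt parsed from the existing user entry. Conceptually it does something like the following sketch (the digest algorithm, iteration count, and key length here are illustrative assumptions, not necessarily the filter's actual parameters):

```python
import hashlib

def pbkdf2_hex(password: str, hexsalt: str,
               iterations: int = 1024, dklen: int = 32) -> str:
    """Derive a PBKDF2-HMAC hash of `password` using a hex-encoded salt.

    Algorithm, iterations, and key length are illustrative assumptions only.
    """
    salt = bytes.fromhex(hexsalt)          # salt parsed from the existing users file
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations, dklen)
    return dk.hex()
```

Re-hashing with the existing salt lets the role compare the result against the stored hash, so the user task stays idempotent when the password has not changed; the error above means the filter plugin itself could not be resolved, not that the hashing failed.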
I was comparing version middleware_automation.amq:1.2.0 to version middleware_automation.amq:1.1.1. In version 1.1.1 everything is fine, but in version 1.2.0 the shared storage path is not configured correctly. I attached the broker.xml files for both versions. In the new version these four directory settings are not rendered correctly in broker.xml, which results in all kinds of errors in the console. With the same input files I get these results.
Compare the broker.xml files after installation:
In version 1.2.0 the broker.xml is not correct:
<paging-directory>data/paging</paging-directory>
<bindings-directory>data/bindings</bindings-directory>
<journal-directory>data/journal</journal-directory>
<large-messages-directory>data/largemessages</large-messages-directory>
In version 1.1.1 it is correct:
<paging-directory>/data/amq-broker/shared/paging</paging-directory>
<bindings-directory>/data/amq-broker/shared/bindings</bindings-directory>
<journal-directory>/data/amq-broker/shared/journal</journal-directory>
<large-messages-directory>/data/amq-broker/shared/large-messages</large-messages-directory>
These are the variables we use:
all:
children:
amq:
children:
ha1:
hosts: xxxxxx
ha2:
hosts: xxxxxx
vars:
activemq_configure_firewalld: True
activemq_prometheus_enabled: True
amq_broker_enable: True
activemq_cors_strict_checking: False
activemq_ha_enabled: true
activemq_shared_storage: true
activemq_shared_storage_path: /data/amq-broker/shared
ansible_user: xxxxx
ansible_ssh_private_key_file: hostfiles/privkey
activemq_offline_install: True
activemq_version: 7.10.2
activemq_dest: /opt/amq
activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
activemq_shared_storage_mounted: true
activemq_port: 61616
cluster_name: amqtest002
nfs_mount_source: "nfssatestasb2.file.core.windows.net:/nfssatestasb2/sharetestasb2"
activemq_tls_enabled: True
activemq_tls_keystore_path: "/app/keystores/keystore.jks"
activemq_tls_truststore_path: "/app/keystores/truststore.jks"
activemq_tls_truststore_password: "changeit"
activemq_acceptors:
- name: all
bind_address: "0.0.0.0"
bind_port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: AMQP,OPENWIRE
useEpoll: true
sslEnabled: True
keyStorePath: "{{ activemq_tls_keystore_dest }}"
keyStorePassword: "{{ activemq_tls_keystore_password }}"
trustStorePath: "{{ activemq_tls_truststore_dest }}"
trustStorePassword: "{{ activemq_tls_truststore_password }}"
verifyHost: False
activemq_sa_password: "asb-sa-test-password"
activemq_users:
- user: "{{ activemq_instance_username }}"
password: "{{ activemq_instance_password }}"
roles: [ amq ]
- user: "asb-sa"
password: "{{ activemq_sa_password }}"
roles: [ amq ]
activemq_roles:
- name: amq
match: '#'
permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
[broker-new.xml.txt](https://github.com/ansible-middleware/amq/files/10518862/broker-new.xml.txt)
##### ISSUE TYPE
- Bug Report
##### ANSIBLE VERSION
[default@0ee9d7b5b9c2 app]$ ansible --version
ansible [core 2.14.1]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/default/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/default/.local/lib/python3.9/site-packages/ansible
ansible collection location = /home/default/.ansible/collections:/usr/share/ansible/collections
executable location = /home/default/.local/bin/ansible
python version = 3.9.13 (main, Nov 9 2022, 13:16:24) [GCC 8.5.0 20210514 (Red Hat 8.5.0-15)] (/usr/bin/python3.9)
jinja version = 3.1.2
libyaml = True
##### COLLECTION VERSION
[default@0ee9d7b5b9c2 app]$ ansible-galaxy collection list
Collection Version
amazon.aws 5.1.0
ansible.netcommon 4.1.0
ansible.posix 1.4.0
ansible.utils 2.8.0
ansible.windows 1.12.0
arista.eos 6.0.0
awx.awx 21.10.0
azure.azcollection 1.14.0
check_point.mgmt 4.0.0
chocolatey.chocolatey 1.3.1
cisco.aci 2.3.0
cisco.asa 4.0.0
cisco.dnac 6.6.1
cisco.intersight 1.0.22
cisco.ios 4.0.0
cisco.iosxr 4.0.3
cisco.ise 2.5.9
cisco.meraki 2.13.0
cisco.mso 2.1.0
cisco.nso 1.0.3
cisco.nxos 4.0.1
cisco.ucs 1.8.0
cloud.common 2.1.2
cloudscale_ch.cloud 2.2.3
community.aws 5.0.0
community.azure 2.0.0
community.ciscosmb 1.0.5
community.crypto 2.9.0
community.digitalocean 1.22.0
community.dns 2.4.2
community.docker 3.3.1
community.fortios 1.0.0
community.general 6.1.0
community.google 1.0.0
community.grafana 1.5.3
community.hashi_vault 4.0.0
community.hrobot 1.6.0
community.libvirt 1.2.0
community.mongodb 1.4.2
community.mysql 3.5.1
community.network 5.0.0
community.okd 2.2.0
community.postgresql 2.3.1
community.proxysql 1.4.0
community.rabbitmq 1.2.3
community.routeros 2.5.0
community.sap 1.0.0
community.sap_libs 1.4.0
community.skydive 1.0.0
community.sops 1.5.0
community.vmware 3.2.0
community.windows 1.11.1
community.zabbix 1.9.0
containers.podman 1.10.1
cyberark.conjur 1.2.0
cyberark.pas 1.0.14
dellemc.enterprise_sonic 2.0.0
dellemc.openmanage 6.3.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
f5networks.f5_modules 1.21.0
fortinet.fortimanager 2.1.7
fortinet.fortios 2.2.1
frr.frr 2.0.0
gluster.gluster 1.0.2
google.cloud 1.0.2
grafana.grafana 1.1.0
hetzner.hcloud 1.9.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.10.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.4.1
inspur.ispim 1.2.0
inspur.sm 2.3.0
junipernetworks.junos 4.1.0
kubernetes.core 2.3.2
lowlydba.sqlserver 1.2.1
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.21.0
netapp.elementsw 21.7.0
netapp.ontap 22.0.1
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.3.1
netbox.netbox 3.9.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.2
openstack.cloud 1.10.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.4.1
purestorage.flasharray 1.15.0
purestorage.flashblade 1.10.0
purestorage.fusion 1.2.0
sensu.sensu_go 1.13.1
splunk.es 2.1.0
t_systems_mms.icinga_director 1.31.4
theforeman.foreman 3.7.0
vmware.vmware_rest 2.2.0
vultr.cloud 1.3.1
vyos.vyos 4.0.0
wti.remote 1.0.4
Collection Version
ansible.posix 1.5.1
community.general 6.2.0
middleware_automation.amq 1.2.0
middleware_automation.redhat_csp_download 1.2.2
##### STEPS TO REPRODUCE
Apply the Ansible playbook with the host file provided against different versions of the amq collection. In version 1.1.1 everything is OK; in version 1.2.0 it is not.
We destroy our Azure VM in the evening and rebuild it in the morning. The AMQ broker starts faster than the NFS mount we use, so the broker does not become HA successfully: it acquires a lock on the directory before NFS mounts it, which results in two active brokers. A solution could be the approach described here, making the amq-broker service dependent on the NFS mount: https://unix.stackexchange.com/questions/246935/set-systemd-service-to-execute-after-fstab-mount. For that we would need an additional variable in the systemd template. Would it be possible to implement this in the code?
# {{ ansible_managed }}
[Unit]
Description={{ activemq.instance_name }} {{ activemq.name }} Service
After=network.target **data-amq\x2dbroker-shared.mount**
[Service]
Type=forking
EnvironmentFile=-/etc/sysconfig/{{ activemq.instance_name }}
PIDFile={{ activemq.instance_home }}/{{ activemq_service_pidfile }}
ExecStart={{ activemq.instance_home }}/bin/artemis-service start
ExecStop={{ activemq.instance_home }}/bin/artemis-service stop
SuccessExitStatus = 0 143
RestartSec = 60
Restart = on-failure
LimitNOFILE=102642
[Install]
WantedBy=multi-user.target
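Systemd derives mount unit names from the mount point path by a fixed escaping rule (what `systemd-escape --path --suffix=mount` does); the bold unit name above follows from `/data/amq-broker/shared`. A simplified sketch of that rule (the real systemd-escape handles more edge cases, such as a leading dot):

```python
def systemd_mount_unit(path: str) -> str:
    """Turn a mount point path into its systemd mount unit name,
    approximating `systemd-escape --path --suffix=mount`."""
    trimmed = path.strip("/")
    if not trimmed:
        return "-.mount"  # the root filesystem mount
    out = []
    for ch in trimmed:
        if ch == "/":
            out.append("-")                   # path separators become dashes
        elif ch.isalnum() or ch in "_.:":
            out.append(ch)                    # allowed characters pass through
        else:
            out.append("\\x%02x" % ord(ch))   # everything else is \xNN-escaped
    return "".join(out) + ".mount"
```

A role could then expose a hypothetical list variable, say `activemq_systemd_wait_for_units`, and render each entry into the template's `After=` line, so deployments with shared storage can declare the mount dependency without patching the template.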
❯ ansible --version
ansible [core 2.13.6]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/robert/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/linuxbrew/.linuxbrew/Cellar/ansible/6.6.0/libexec/lib/python3.10/site-packages/ansible
ansible collection location = /home/robert/.ansible/collections:/usr/share/ansible/collections
executable location = /home/linuxbrew/.linuxbrew/bin/ansible
python version = 3.10.9 (main, Dec 6 2022, 18:44:57) [GCC 11.3.0]
jinja version = 3.1.2
libyaml = True
# /home/robert/.ansible/collections/ansible_collections
Collection Version
----------------------------------------- -------
ansible.posix 1.4.0
community.general 6.0.1
middleware_automation.amq 1.1.0
middleware_automation.redhat_csp_download 1.2.2
# /home/linuxbrew/.linuxbrew/Cellar/ansible/6.6.0/libexec/lib/python3.10/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 3.5.0
ansible.netcommon 3.1.3
ansible.posix 1.4.0
ansible.utils 2.7.0
ansible.windows 1.12.0
arista.eos 5.0.1
awx.awx 21.8.0
azure.azcollection 1.14.0
check_point.mgmt 2.3.0
chocolatey.chocolatey 1.3.1
cisco.aci 2.3.0
cisco.asa 3.1.0
cisco.dnac 6.6.0
cisco.intersight 1.0.20
cisco.ios 3.3.2
cisco.iosxr 3.3.1
cisco.ise 2.5.8
cisco.meraki 2.11.0
cisco.mso 2.1.0
cisco.nso 1.0.3
cisco.nxos 3.2.0
cisco.ucs 1.8.0
cloud.common 2.1.2
cloudscale_ch.cloud 2.2.2
community.aws 3.6.0
community.azure 1.1.0
community.ciscosmb 1.0.5
community.crypto 2.8.1
community.digitalocean 1.22.0
community.dns 2.4.0
community.docker 2.7.1
community.fortios 1.0.0
community.general 5.8.0
community.google 1.0.0
community.grafana 1.5.3
community.hashi_vault 3.4.0
community.hrobot 1.6.0
community.libvirt 1.2.0
community.mongodb 1.4.2
community.mysql 3.5.1
community.network 4.0.1
community.okd 2.2.0
community.postgresql 2.3.0
community.proxysql 1.4.0
community.rabbitmq 1.2.3
community.routeros 2.3.1
community.sap 1.0.0
community.sap_libs 1.3.0
community.skydive 1.0.0
community.sops 1.4.1
community.vmware 2.10.1
community.windows 1.11.1
community.zabbix 1.8.0
containers.podman 1.9.4
cyberark.conjur 1.2.0
cyberark.pas 1.0.14
dellemc.enterprise_sonic 1.1.2
dellemc.openmanage 5.5.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
f5networks.f5_modules 1.20.0
fortinet.fortimanager 2.1.6
fortinet.fortios 2.1.7
frr.frr 2.0.0
gluster.gluster 1.0.2
google.cloud 1.0.2
hetzner.hcloud 1.8.2
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.10.0
infinidat.infinibox 1.3.7
infoblox.nios_modules 1.4.0
inspur.ispim 1.2.0
inspur.sm 2.3.0
junipernetworks.junos 3.1.0
kubernetes.core 2.3.2
lowlydba.sqlserver 1.0.4
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.21.0
netapp.elementsw 21.7.0
netapp.ontap 21.24.1
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.3.1
netbox.netbox 3.8.1
ngine_io.cloudstack 2.2.4
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.2
openstack.cloud 1.10.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.3.1
purestorage.flasharray 1.14.0
purestorage.flashblade 1.10.0
purestorage.fusion 1.1.1
sensu.sensu_go 1.13.1
servicenow.servicenow 1.0.6
splunk.es 2.1.0
t_systems_mms.icinga_director 1.31.4
theforeman.foreman 3.7.0
vmware.vmware_rest 2.2.0
vultr.cloud 1.3.0
vyos.vyos 3.0.1
wti.remote 1.0.4
all:
  children:
    amq:
      children:
        ha1:
          hosts: xxxxxx
        ha2:
          hosts: xxxxxx
      vars:
        activemq_configure_firewalld: True
        activemq_prometheus_enabled: True
        activemq_cors_strict_checking: False
        activemq_ha_enabled: true
        activemq_shared_storage: true
        activemq_shared_storage_path: /data/amq-broker/shared
        ansible_user: xxxx
        ansible_ssh_private_key_file: hostfiles/privkey
        activemq_offline_install: True
        activemq_version: 7.10.2
        activemq_dest: /opt/amq
        activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
        activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
        activemq_port: 61616
        activemq_acceptors:
          - name: amqp
            bind_address: "0.0.0.0"
            bind_port: "{{ activemq_port }}"
            parameters:
              tcpSendBufferSize: 1048576
              tcpReceiveBufferSize: 1048576
              protocols: AMQP
              useEpoll: true
              amqpMinLargeMessageSize: 102400
              amqpCredits: 1000
              amqpLowCredits: 300
              amqpDuplicateDetection: true
One of the two brokers should become active and the other should remain passive; therefore the amq-broker service should start after the NFS mount.
Instead, both brokers are active. If we restart both brokers using systemctl, only one broker becomes active.
We would like to be able to change more address settings for the broker to match our needs. The ActiveMQ documentation lists all the available settings here: https://activemq.apache.org/components/artemis/documentation/latest/address-settings.html. Looking at the implementation here, https://github.com/ansible-middleware/amq/blob/main/roles/activemq/templates/address_settings.broker.xml.j2, we cannot set these settings:
maxDeliveryAttempts: -1
maxRedeliveryDelay: 300000
redeliveryDelay: 2000
redeliveryDelayMultiplier: 2
Would it be possible to include those settings in the template?
Thank you.
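For illustration, the request could be covered by extra keys on the role's address-settings variable; the keys below mirror broker.xml element names, but their support in the template is hypothetical, not something the role offers today:

```yaml
# Hypothetical sketch: extra address-settings keys the template could render.
# None of these keys are currently supported by the role.
activemq_address_settings:
  - match: "#"
    # requested additions, mirroring broker.xml element names:
    max_delivery_attempts: -1
    redelivery_delay: 2000
    redelivery_delay_multiplier: 2
    max_redelivery_delay: 300000
```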
We were trying the latest version, which has a problem with the fastpackages task. Somehow the command does not work. The error is:
/home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml:4
fatal: [xxxxxx]: FAILED! => {"changed": true, "msg": "Unsupported parameters for (ansible.legacy.command) module: warn. Supported parameters include: executable, _uses_shell, strip_empty_ends, stdin_add_newline, stdin, removes, chdir, argv, creates, _raw_params."}
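For context: the warn argument to the command/shell modules was removed in ansible-core 2.14, so any task still passing it fails exactly like this. A sketch of the fix, assuming the failing task looked roughly like the one below (the actual body of fastpackages.yml may differ, and packages_list is an illustrative variable name):

```yaml
# Before (fails on ansible-core >= 2.14, which no longer accepts warn):
# - name: Check if packages are already installed
#   ansible.builtin.command: "rpm -q {{ item }}"
#   args:
#     warn: false
#   register: rpm_info

# After: simply drop the removed argument.
- name: Check if packages are already installed
  ansible.builtin.command: "rpm -q {{ item }}"
  loop: "{{ packages_list }}"   # illustrative variable name
  register: rpm_info
  changed_when: false
```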
[default@6d339d7a0171 app]$ ansible --version
ansible [core 2.14.1]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/default/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/default/.local/lib/python3.9/site-packages/ansible
ansible collection location = /home/default/.ansible/collections:/usr/share/ansible/collections
executable location = /home/default/.local/bin/ansible
python version = 3.9.13 (main, Nov 9 2022, 13:16:24) [GCC 8.5.0 20210514 (Red Hat 8.5.0-15)] (/usr/bin/python3.9)
jinja version = 3.1.2
libyaml = True
[default@6d339d7a0171 app]$ ansible-galaxy collection list
# /home/default/.local/lib/python3.9/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 5.1.0
ansible.netcommon 4.1.0
ansible.posix 1.4.0
ansible.utils 2.8.0
ansible.windows 1.12.0
arista.eos 6.0.0
awx.awx 21.10.0
azure.azcollection 1.14.0
check_point.mgmt 4.0.0
chocolatey.chocolatey 1.3.1
cisco.aci 2.3.0
cisco.asa 4.0.0
cisco.dnac 6.6.1
cisco.intersight 1.0.22
cisco.ios 4.0.0
cisco.iosxr 4.0.3
cisco.ise 2.5.9
cisco.meraki 2.13.0
cisco.mso 2.1.0
cisco.nso 1.0.3
cisco.nxos 4.0.1
cisco.ucs 1.8.0
cloud.common 2.1.2
cloudscale_ch.cloud 2.2.3
community.aws 5.0.0
community.azure 2.0.0
community.ciscosmb 1.0.5
community.crypto 2.9.0
community.digitalocean 1.22.0
community.dns 2.4.2
community.docker 3.3.1
community.fortios 1.0.0
community.general 6.1.0
community.google 1.0.0
community.grafana 1.5.3
community.hashi_vault 4.0.0
community.hrobot 1.6.0
community.libvirt 1.2.0
community.mongodb 1.4.2
community.mysql 3.5.1
community.network 5.0.0
community.okd 2.2.0
community.postgresql 2.3.1
community.proxysql 1.4.0
community.rabbitmq 1.2.3
community.routeros 2.5.0
community.sap 1.0.0
community.sap_libs 1.4.0
community.skydive 1.0.0
community.sops 1.5.0
community.vmware 3.2.0
community.windows 1.11.1
community.zabbix 1.9.0
containers.podman 1.10.1
cyberark.conjur 1.2.0
cyberark.pas 1.0.14
dellemc.enterprise_sonic 2.0.0
dellemc.openmanage 6.3.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
f5networks.f5_modules 1.21.0
fortinet.fortimanager 2.1.7
fortinet.fortios 2.2.1
frr.frr 2.0.0
gluster.gluster 1.0.2
google.cloud 1.0.2
grafana.grafana 1.1.0
hetzner.hcloud 1.9.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.10.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.4.1
inspur.ispim 1.2.0
inspur.sm 2.3.0
junipernetworks.junos 4.1.0
kubernetes.core 2.3.2
lowlydba.sqlserver 1.2.1
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.21.0
netapp.elementsw 21.7.0
netapp.ontap 22.0.1
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.3.1
netbox.netbox 3.9.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.2
openstack.cloud 1.10.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.4.1
purestorage.flasharray 1.15.0
purestorage.flashblade 1.10.0
purestorage.fusion 1.2.0
sensu.sensu_go 1.13.1
splunk.es 2.1.0
t_systems_mms.icinga_director 1.31.4
theforeman.foreman 3.7.0
vmware.vmware_rest 2.2.0
vultr.cloud 1.3.1
vyos.vyos 4.0.0
wti.remote 1.0.4
# /home/default/.ansible/collections/ansible_collections
Collection Version
----------------------------------------- -------
ansible.posix 1.4.0
community.general 6.1.0
middleware_automation.amq 1.1.0
middleware_automation.redhat_csp_download 1.2.2
Run the latest AMQ with the following inventory:
all:
  children:
    amq:
      children:
        ha1:
          hosts: xxxx
        ha2:
          hosts: xxxx
      vars:
        activemq_configure_firewalld: True
        activemq_prometheus_enabled: False
        activemq_cors_strict_checking: False
        activemq_ha_enabled: true
        activemq_shared_storage: true
        activemq_shared_storage_path: /data/amq-broker/shared
        ansible_user: xxxxx
        ansible_ssh_private_key_file: hostfiles/privkey
        activemq_offline_install: True
        activemq_version: 7.10.0
        activemq_dest: /opt/amq
        activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
        activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
        activemq_shared_storage_mounted: true
TASK [middleware_automation.amq.activemq : Check prerequisites] ****************
task path: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/main.yml:2
included: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml for 10.226.22.41, 10.226.22.40
TASK [middleware_automation.amq.activemq : Validate credentials] ***************
task path: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml:18
ok: [10.226.22.41] => {"changed": false, "msg": "Installing activemq"}
ok: [10.226.22.40] => {"changed": false, "msg": "Installing activemq"}
TASK [middleware_automation.amq.activemq : Validate TLS config] ****************
task path: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml:27
skipping: [10.226.22.41] => {"changed": false, "skip_reason": "Conditional result was False"}
skipping: [10.226.22.40] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [middleware_automation.amq.activemq : Validate TLS mutual auth config] ****
task path: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml:36
skipping: [10.226.22.41] => {"changed": false, "skip_reason": "Conditional result was False"}
skipping: [10.226.22.40] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [middleware_automation.amq.activemq : Ensure required packages are installed] ***
task path: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml:46
included: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml for 10.226.22.41, 10.226.22.40
TASK [middleware_automation.amq.activemq : Check if packages are already installed] ***
task path: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml:4
fatal: [10.226.22.40]: FAILED! => {"changed": true, "msg": "Unsupported parameters for (ansible.legacy.command) module: warn. Supported parameters include: executable, _uses_shell, strip_empty_ends, stdin_add_newline, stdin, removes, chdir, argv, creates, _raw_params."}
fatal: [10.226.22.41]: FAILED! => {"changed": true, "msg": "Unsupported parameters for (ansible.legacy.command) module: warn. Supported parameters include: stdin, removes, strip_empty_ends, stdin_add_newline, _raw_params, _uses_shell, chdir, argv, executable, creates."}
TASK [middleware_automation.amq.activemq : Add missing packages to the yum install list] ***
task path: /home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml:12
fatal: [10.226.22.41]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'stdout_lines'. 'dict object' has no attribute 'stdout_lines'\n\nThe error appears to be in '/home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml': line 12, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n rescue:\n - name: \"Add missing packages to the yum install list\"\n ^ here\n"}
fatal: [10.226.22.40]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'stdout_lines'. 'dict object' has no attribute 'stdout_lines'\n\nThe error appears to be in '/home/default/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml': line 12, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n rescue:\n - name: \"Add missing packages to the yum install list\"\n ^ here\n"}
PLAY RECAP *********************************************************************
10.226.22.40 : ok=12 changed=1 unreachable=0 failed=1 skipped=2 rescued=1 ignored=0
10.226.22.41 : ok=12 changed=1 unreachable=0 failed=1 skipped=2 rescued=1 ignored=0
If amq_broker_ha_enabled is enabled, the generated broker.xml contains connectors and cluster-connections which do not allow for message flow between nodes.
$ ansible --version
ansible [core 2.15.9]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/ec2-user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/ec2-user/.local/lib/python3.9/site-packages/ansible
ansible collection location = /home/ec2-user/.ansible/collections:/usr/share/ansible/collections
executable location = /home/ec2-user/.local/bin/ansible
python version = 3.9.18 (main, Sep 7 2023, 00:00:00) [GCC 11.4.1 20230605 (Red Hat 11.4.1-2)] (/usr/bin/python3)
jinja version = 3.1.3
libyaml = True
$ ansible-galaxy collection list
# /home/ec2-user/.ansible/collections/ansible_collections
Collection Version
----------------------------- -------
redhat.amq_broker 2.0.0
redhat.runtimes_common 1.1.3
deploy-amq.yml
---
- name: Playbook to deploy activemq in clustered mode
  hosts: all
  vars:
    amq_broker_ha_enabled: true
    amq_broker_nio_enabled: true
    amq_broker_cors_strict_checking: false
    amq_broker_prometheus_enabled: false
    amq_broker_offline_install: true
    amq_broker_archive: amq-broker-7.11.6-bin.zip
    amq_broker_addresses:
      - name: TEST
        anycast:
          - name: TEST
      - name: DLQ
        anycast:
          - name: DLQ
      - name: ExpiryQueue
        anycast:
          - name: ExpiryQueue
    amq_broker_acceptors:
      - name: artemis
        bind_address: "0.0.0.0"
        bind_port: "{{ amq_broker_port }}"
        parameters:
          tcpSendBufferSize: 1048576
          tcpReceiveBufferSize: 1048576
          amqpMinLargeMessageSize: 102400
          protocols: CORE
          useEpoll: true
          amqpCredits: 1000
          amqpLowCredits: 300
          amqpDuplicateDetection: true
          supportAdvisory: false
          suppressInternalManagementObjects: false
      - name: amqp
        bind_address: "0.0.0.0"
        bind_port: "{{ amq_broker_port_amqp }}"
        parameters:
          tcpSendBufferSize: 1048576
          tcpReceiveBufferSize: 1048576
          protocols: AMQP
          useEpoll: true
          amqpMinLargeMessageSize: 102400
          amqpCredits: 1000
          amqpLowCredits: 300
          amqpDuplicateDetection: true
  roles:
    - redhat.amq_broker.amq_broker
inventory/group_vars/all.yml
---
all:
  hosts:
    ip-11-0-1-001.eu-west-2.compute.internal:
      amq_broker_instance_name: one
    ip-11-0-1-002.eu-west-2.compute.internal:
      amq_broker_instance_name: two
    ip-11-0-1-003.eu-west-2.compute.internal:
      amq_broker_instance_name: three
amq/bin/artemis consumer --user amq-broker --password amq-broker --message-count 500
amq/bin/artemis producer --user amq-broker --password amq-broker
Broker two and three get 500 messages each.
The Ansible-generated broker.xml samples do not allow for a clustered broker:
<connectors>
    <connector name="artemis">tcp://localhost:61616?</connector>
    <connector name="ip-11-0-1-001.eu-west-2.compute.internal">tcp://11.0.1.001:61616?</connector>
    <connector name="ip-11-0-1-002.eu-west-2.compute.internal">tcp://11.0.1.002:61616?</connector>
    <connector name="ip-11-0-1-003.eu-west-2.compute.internal">tcp://11.0.1.003:61616?</connector>
</connectors>
<cluster-connections>
    <cluster-connection name="amq_broker">
        <connector-ref>artemis</connector-ref>
        <message-load-balancing>ON_DEMAND</message-load-balancing>
        <max-hops>1</max-hops>
        <static-connectors>
            <connector-ref>ip-11-0-1-001.eu-west-2.compute.internal</connector-ref>
            <connector-ref>ip-11-0-1-002.eu-west-2.compute.internal</connector-ref>
            <connector-ref>ip-11-0-1-003.eu-west-2.compute.internal</connector-ref>
        </static-connectors>
    </cluster-connection>
</cluster-connections>
Manually edited sample for broker one:
<connectors>
    <connector name="artemis">tcp://11.0.1.001:61616?</connector>
    <connector name="ip-11-0-1-002.eu-west-2.compute.internal">tcp://11.0.1.002:61616?</connector>
    <connector name="ip-11-0-1-003.eu-west-2.compute.internal">tcp://11.0.1.003:61616?</connector>
</connectors>
<cluster-connections>
    <cluster-connection name="amq_broker">
        <connector-ref>artemis</connector-ref>
        <message-load-balancing>ON_DEMAND</message-load-balancing>
        <max-hops>1</max-hops>
        <static-connectors>
            <connector-ref>ip-11-0-1-002.eu-west-2.compute.internal</connector-ref>
            <connector-ref>ip-11-0-1-003.eu-west-2.compute.internal</connector-ref>
        </static-connectors>
    </cluster-connection>
</cluster-connections>
The above config works, and broker two and three get 500 messages.
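The manual edit above points at the template change that would fix this: the local artemis connector must advertise the node's own reachable address instead of localhost, and the node must be excluded from its own static-connectors. A rough Jinja2 sketch of that idea (the group name and address facts are illustrative, not the role's actual internals):

```jinja
<connectors>
    <!-- advertise this node's reachable address, not localhost -->
    <connector name="artemis">tcp://{{ ansible_default_ipv4.address }}:61616</connector>
{% for host in groups['all'] if host != inventory_hostname %}
    <connector name="{{ host }}">tcp://{{ hostvars[host].ansible_default_ipv4.address }}:61616</connector>
{% endfor %}
</connectors>
<cluster-connections>
    <cluster-connection name="amq_broker">
        <connector-ref>artemis</connector-ref>
        <message-load-balancing>ON_DEMAND</message-load-balancing>
        <max-hops>1</max-hops>
        <static-connectors>
{% for host in groups['all'] if host != inventory_hostname %}
            <connector-ref>{{ host }}</connector-ref>
{% endfor %}
        </static-connectors>
    </cluster-connection>
</cluster-connections>
```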
We are starting with a Red Hat AMQ deployment and planning to use the redhat.amq_broker collection. For later tests of upgrading Artemis, we want to start by deploying Red Hat AMQ version 7.09.4.
The first attempt was to set amq_broker_version to "7.9.4"; the result was that redhat.runtimes_common.product_search did not find any product:
ok: [localhost] => {
    "changed": false,
    "invocation": {
        "module_args": {
            "api_url": "https://jbossnetwork.api.redhat.com",
            "client_id": "xxxxxxxxxx",
            "client_secret": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "product_category": "jboss.amq.broker",
            "product_id": null,
            "product_type": "DISTRIBUTION",
            "product_version": "7.9.4",
            "sso_url": "https://sso.redhat.com",
            "validate_certs": true
        }
    },
    "msg": "",
    "results": []
}
After changing amq_broker_version to "7.09.4", the product search finds the version, but we still cannot download because the "rhn_filtered_products" fact is empty:
TASK [Determine install zipfile from search results] **********************************************************************************************
task path: /home/cmaso/Artemis/test/download.yml:43
ok: [localhost] => {
    "ansible_facts": {
        "rhn_filtered_products": []
    },
    "changed": false
}
This is because the returned file_path of the search result does not use the version 7.09.4; the files are named 7.9.4:
{
    "category": "jboss.amq.broker",
    "description": "AMQ Broker 7.9.4",
    "distribution_status": "AVAILABLE",
    "download_path": "https://access.redhat.com/cspdownload/0e54ee2328428c00bc535af93e96f44a/654499be/AMQ-BROKER-7.9.4/amq-broker-7.9.4-bin.zip",
    "file_path": "AMQ-BROKER-7.9.4/amq-broker-7.9.4-bin.zip",
    "id": 104297,
    "md5": "d208b3a4b25a6f741dc3b5d097eea76b",
    "name": "Red Hat AMQ Broker",
    "sha256": "ccdc930df014c84d131b2e74da0937d04a6c5ceb6096a37968c2ad36f1ae4207",
    "title": "AMQ Broker 7.9.4",
    "type": "DISTRIBUTION",
    "version": "7.09.4",
    "visibility": "PUBLIC"
}
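As a workaround sketch (not provided by the collection), the zero-stripped file version can be derived from the catalog version with a Jinja2 filter chain, so filename matching can use 7.9.4 while product_search still uses 7.09.4:

```yaml
# Derive the file version ("7.9.4") from the catalog version ("7.09.4").
- name: Normalize AMQ version for filename matching
  ansible.builtin.set_fact:
    amq_file_version: "{{ amq_broker_version.split('.') | map('int') | map('string') | join('.') }}"
  vars:
    amq_broker_version: "7.09.4"
# amq_file_version is now "7.9.4", matching amq-broker-7.9.4-bin.zip
```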
ansible [core 2.14.6]
config file = /home/cmaso/.ansible.cfg
configured module search path = ['/home/cmaso/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.9/site-packages/ansible
ansible collection location = /home/cmaso/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.9.16 (main, Sep 12 2023, 00:00:00) [GCC 11.3.1 20221121 (Red Hat 11.3.1-4)] (/usr/bin/python3)
jinja version = 3.1.2
libyaml = True
redhat.amq_broker 1.3.10
The security team asked us to improve the algorithms used to hash and salt the passwords when they are masked on the filesystem. They would like us to be able to configure the following parameters (at minimum): iterations, randomScheme and secretKeyAlgorithm.
Rationale:
We do not accept the use of the SHA-1 algorithm anywhere, because we deem it insecure and unfit for production use.
The number of PBKDF2 rounds is too small, which makes compromise too "computationally" easy. PBKDF2 relies on a "work" (iteration) factor, which is a weak paradigm; however, given proper iteration counts we deem the risk medium/low.
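To make the request concrete, a sketch of what such configurability could look like; every variable name below is hypothetical and not currently offered by the collection:

```yaml
# Hypothetical variables for the masked-password codec; NOT currently supported.
amq_broker_password_codec_properties:
  secret_key_algorithm: PBKDF2WithHmacSHA512   # avoid SHA-1-based key derivation
  random_scheme: SHA1PRNG                      # salt RNG scheme
  iterations: 600000                           # raise the PBKDF2 work factor
```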
We want to be able to configure the JVM arguments -Xms and -Xmx to set the heap memory available to the server. Under high load, our production servers need more heap memory. This is also recommended here: https://activemq.apache.org/components/artemis/documentation/2.1.0/perf-tuning.html. Therefore we would like to be able to configure these settings.
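As an illustration of the requested knobs (hypothetical variable names, not currently exposed by the role):

```yaml
# Hypothetical variables to size the broker JVM heap; NOT currently supported.
activemq_java_opts_xms: 2G   # initial heap size (-Xms2G)
activemq_java_opts_xmx: 4G   # maximum heap size (-Xmx4G)
```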
The Cross-Origin (CORS) policy in etc/jolokia-access.xml blocks access to the web console. I believe it is the problem mentioned on Stack Overflow: https://stackoverflow.com/a/71207182. This part blocks access to the web console from my remote machine:
<allow-origin>*://0.0.0.0*</allow-origin>
It works if I change it to
<allow-origin>*://*</allow-origin>
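A variable making the allow-origin entry configurable would solve this; the variable name below is illustrative, not part of the role today:

```yaml
# Hypothetical variable; the generated jolokia-access.xml currently
# contains the fixed entry *://0.0.0.0*.
activemq_cors_allow_origin: "*://*"
```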
Default installation
Get access to the web console
Obtained a blank screen after logging in with the correct credentials
[root@amq1 etc]# cat jolokia-access.xml
<?xml version="1.0" encoding="utf-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
<!-- This policy file controls the Jolokia JMX-HTTP bridge security options for the web console.
see: https://jolokia.org/reference/html/security.html -->
<restrict>
    <cors>
        <!-- Allow cross origin access from 0.0.0.0 ... -->
        <allow-origin>*://0.0.0.0*</allow-origin>
        <!-- Options from this point on are auto-generated by Create.java from the Artemis CLI -->
        <!-- Check for the proper origin on the server side, too -->
        <strict-checking/>
    </cors>
We would like to place the amq-broker user's home directory in a different place from activemq.home, since we would like to run some additional scripts under the amq-broker user with a cron job. Currently, reinstalling the broker also destroys this home directory and stops the cron job from running. Would it be possible to change the home directory location?
The guide "HA configuration with active/standby nodes" needs a tester and technical writer. Experiment with and analyze the molecule test here, and using it, write the guide in markdown here.
I need to be able to use an 'activemq_config_override_template'.
I am expecting it to use my custom broker.xml, with a '<broker-plugins>' section, as a base.
ansible [core 2.15.3]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/XXX/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.11/site-packages/ansible
ansible collection location = /home/XXX/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.11.5 (main, Sep 22 2023, 15:34:29) [GCC 8.5.0 20210514 (Red Hat 8.5.0-20)] (/usr/bin/python3.11)
jinja version = 3.1.2
libyaml = True
# /home/XXX/.ansible/collections/ansible_collections
Collection Version
----------------------------- -------
middleware_automation.amq 1.3.10
middleware_automation.common 1.1.4
# /usr/lib/python3.11/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 6.3.0
ansible.netcommon 5.1.2
ansible.posix 1.5.4
ansible.utils 2.10.3
ansible.windows 1.14.0
arista.eos 6.0.1
awx.awx 22.6.0
azure.azcollection 1.16.0
check_point.mgmt 5.1.1
chocolatey.chocolatey 1.5.1
cisco.aci 2.7.0
cisco.asa 4.0.1
cisco.dnac 6.7.3
cisco.intersight 1.0.27
cisco.ios 4.6.1
cisco.iosxr 5.0.3
cisco.ise 2.5.14
cisco.meraki 2.15.3
cisco.mso 2.5.0
cisco.nso 1.0.3
cisco.nxos 4.4.0
cisco.ucs 1.10.0
cloud.common 2.1.4
cloudscale_ch.cloud 2.3.1
community.aws 6.2.0
community.azure 2.0.0
community.ciscosmb 1.0.6
community.crypto 2.15.0
community.digitalocean 1.24.0
community.dns 2.6.0
community.docker 3.4.8
community.fortios 1.0.0
community.general 7.3.0
community.google 1.0.0
community.grafana 1.5.4
community.hashi_vault 5.0.0
community.hrobot 1.8.1
community.libvirt 1.2.0
community.mongodb 1.6.1
community.mysql 3.7.2
community.network 5.0.0
community.okd 2.3.0
community.postgresql 2.4.3
community.proxysql 1.5.1
community.rabbitmq 1.2.3
community.routeros 2.9.0
community.sap 1.0.0
community.sap_libs 1.4.1
community.skydive 1.0.0
community.sops 1.6.4
community.vmware 3.9.0
community.windows 1.13.0
community.zabbix 2.1.0
containers.podman 1.10.2
cyberark.conjur 1.2.0
cyberark.pas 1.0.19
dellemc.enterprise_sonic 2.2.0
dellemc.openmanage 7.6.1
dellemc.powerflex 1.7.0
dellemc.unity 1.7.1
f5networks.f5_modules 1.25.1
fortinet.fortimanager 2.2.1
fortinet.fortios 2.3.1
frr.frr 2.0.2
gluster.gluster 1.0.2
google.cloud 1.2.0
grafana.grafana 2.1.5
hetzner.hcloud 1.16.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.12.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.5.0
inspur.ispim 1.3.0
inspur.sm 2.3.0
junipernetworks.junos 5.2.0
kubernetes.core 2.4.0
lowlydba.sqlserver 2.1.0
microsoft.ad 1.3.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.22.0
netapp.elementsw 21.7.0
netapp.ontap 22.7.0
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.4.0
netbox.netbox 3.13.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.3
openstack.cloud 2.1.0
openvswitch.openvswitch 2.1.1
ovirt.ovirt 3.1.2
purestorage.flasharray 1.20.0
purestorage.flashblade 1.12.1
purestorage.fusion 1.6.0
sensu.sensu_go 1.14.0
servicenow.servicenow 1.0.6
splunk.es 2.1.0
t_systems_mms.icinga_director 1.33.1
telekom_mms.icinga_director 1.34.1
theforeman.foreman 3.12.0
vmware.vmware_rest 2.3.1
vultr.cloud 1.8.0
vyos.vyos 4.1.0
wti.remote 1.0.5
Playbook
---
- name: Install AMQ XXX
  hosts: all
  collections:
    - middleware_automation.amq
  roles:
    - activemq
Extra vars
---
activemq_config_override_template: '/tmp/broker-custom.xml'
activemq_archive: amq-broker-7.10.0-bin.zip
activemq_installdir: /opt/amq/amq-broker-7.10.0
activemq_offline_install: true
activemq_instance_password: XXX
activemq_configure_firewalld: true
activemq_bind_address: 0.0.0.0
activemq_host: example.com
activemq_jvm_package: java-17-openjdk-headless
activemq_shared_storage: false
The produced broker.xml should contain the '<broker-plugins>' section from the override template.
The produced broker.xml does not contain the '<broker-plugins>' section from the override template.
$ ansible-playbook amq_playbook.yaml -i inventories/local --limit activemq -e "@inventories/local/group_vars/activemq.yml" -K -b --become-method=sudo
Script started on 2024-01-26 11:58:06+01:00
BECOME password:
PLAY [Install AMQ XXX] *****
TASK [Gathering Facts] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Validating arguments against arg spec 'main'] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Check prerequisites] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/prereqs.yml for localhost
TASK [middleware_automation.amq.activemq : Clear internal templating variables] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Validate credentials] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Validate TLS config] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Validate TLS mutual auth config] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Check local download archive path] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Validate local download path] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Make collection xsd available for validation] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Make collection xml.xsd available for validation] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Ensure required packages are installed] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml for localhost
TASK [middleware_automation.amq.activemq : Check if packages are already installed] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Install packages: {{ packages_to_install }}] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Create broker cluster node members] *****
skipping: [localhost] => (item=localhost)
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Validate broker configuration] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_config.yml for localhost
TASK [middleware_automation.amq.activemq : Clear internal validation variables] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Validate configuration/core and journal configuration] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=persistence_enabled)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=persist_id_cache)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=id_cache_size)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_type)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=paging_directory)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=bindings_directory)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_directory)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=large_messages_directory)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_datasync)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_min_files)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_pool_files)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_device_block_size)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_file_size)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_buffer_timeout)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=journal_max_io)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=configuration_file_refresh_period)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=global_max_messages)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=global_max_size)
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/validate_core_config.yml for localhost => (item=password_codec)
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item persistence_enabled to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item persist_id_cache to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item id_cache_size to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_type to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item paging_directory to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item bindings_directory to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_directory to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item large_messages_directory to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_datasync to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_min_files to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_pool_files to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_device_block_size to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_file_size to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_buffer_timeout to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item journal_max_io to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item configuration_file_refresh_period to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item global_max_messages to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item global_max_size to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create configuration element] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Add item password_codec to valid element list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create address settings configuration string] *****
ok: [localhost] => (item=activemq.management#)
ok: [localhost] => (item=#)
TASK [middleware_automation.amq.activemq : Validate address settings] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create acceptor configuration string] *****
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost]
TASK [middleware_automation.amq.activemq : Validate acceptors] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create final connector list using generated and declared lists] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create connector configuration string] *****
ok: [localhost] => (item=None)
ok: [localhost]
TASK [middleware_automation.amq.activemq : Validate connectors] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create addresses string] *****
ok: [localhost] => (item=queue.in)
ok: [localhost] => (item=queue.out)
ok: [localhost] => (item=DLQ)
ok: [localhost] => (item=ExpiryQueue)
TASK [middleware_automation.amq.activemq : Validate addresses] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create diverts configuration string] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Validate diverts] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Create amqp broker connections configuration string] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Validate amqp broker connections] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Create security settings matches] *****
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
TASK [middleware_automation.amq.activemq : Create security settings] *****
ok: [localhost] => (item={'match': '#', 'permissions': {'createNonDurableQueue': ['amq'], 'deleteNonDurableQueue': ['amq'], 'createDurableQueue': ['amq'], 'deleteDurableQueue': ['amq'], 'createAddress': ['amq'], 'deleteAddress': ['amq'], 'consume': ['amq'], 'browse': ['amq'], 'send': ['amq'], 'manage': ['amq']}})
TASK [middleware_automation.amq.activemq : Validate security settings] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Include firewall config tasks] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/firewalld.yml for localhost
TASK [middleware_automation.amq.activemq : Ensure required package firewalld are installed] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/fastpackages.yml for localhost
TASK [middleware_automation.amq.activemq : Check if packages are already installed] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Install packages: {{ packages_to_install }}] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Enable and start the firewalld service] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Configure firewall for activemq ports] *****
ok: [localhost] => (item=8161/tcp)
ok: [localhost] => (item=61616/tcp)
ok: [localhost] => (item=5445/tcp)
ok: [localhost] => (item=5672/tcp)
ok: [localhost] => (item=1883/tcp)
ok: [localhost] => (item=61613/tcp)
TASK [middleware_automation.amq.activemq : Configure firewall for JMX exporter port] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Include install tasks] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/install.yml for localhost
TASK [middleware_automation.amq.activemq : Validate parameters] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create activemq service group] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create activemq service user] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Check activemq install location] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create activemq install location] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Set download archive path] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : debug archive path] *****
ok: [localhost] => {
"msg": "/opt/amq/amq-broker-7.10.0-bin.zip"
}
TASK [middleware_automation.amq.activemq : Check download archive path] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Download activemq archive] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Retrieve product download using JBoss Network API] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Determine install zipfile from search results] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Download AMQ Broker] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Check downloaded archive] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Copy archive to target nodes] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Check target directory: /opt/amq/amq-broker-7.10.0] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Extract Apache ActiveMQ archive on target] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Check target directory: /opt/amq/amq-broker-7.10.0] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Link zipfile directory to wanted directory] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Inform decompression was not executed] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Include systemd tasks] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/systemd.yml for localhost
TASK [middleware_automation.amq.activemq : Determine JAVA_HOME for selected JVM RPM] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Ensure systemd unit override directory exists] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Configure sysconfig file for amq-broker activemq service] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Configure systemd unit file for amq-broker activemq service] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Check instance directory: /opt/amq/amq-broker] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Generate artemis configuration for: /opt/amq/amq-broker] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/configure_artemis.yml for localhost
TASK [middleware_automation.amq.activemq : Prepare broker creation options] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Enable clustering] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Enable static clustering] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Enable security] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Disable security] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Set address broker accepts connections on] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Set address embedded web server accepts connections on] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Disable automatic creation of queues] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Set up queues] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Set up data directory] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Set as replicated node] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Enable replication] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Enable shared storage] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Set up port offset] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Disable AMQP protocol] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Disable HornetQ protocol] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Disable MQTT protocol] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Disable STOMP] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Set the journal as nio] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Enable TLS for web UI] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Enable TLS client authentication for web UI] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Create final broker creation options] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create instance amq-broker of activemq] *****
Creating home directory for amq-broker.
[WARNING]: Module remote_tmp /opt/amq/apache-artemis-2.21.0/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
changed: [localhost]
TASK [middleware_automation.amq.activemq : Check instance directory: /opt/amq/amq-broker] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Reset upgrade flag] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Read profile file] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Display versions] *****
ok: [localhost] => {
"msg": "Requested version: 2.21.0 / current profile: ARTEMIS_HOME='/opt/amq/amq-broker-7.10.0'"
}
TASK [middleware_automation.amq.activemq : Set upgrade flag] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Include post upgrade tasks] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Setup clustering with jgroups] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Include keystore copy tasks] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Create libs location] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Copy additional libs to location] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Generate broker configuration for: /opt/amq/amq-broker] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/configure_broker.yml for localhost
TASK [middleware_automation.amq.activemq : Configure AMQ broker logging] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Ensure ha-policy element exists] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Configure ha-policy] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Configure acceptors] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/acceptors.yml for localhost
TASK [middleware_automation.amq.activemq : Create acceptor configuration string] *****
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost] => (item=None)
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create acceptor configuration in broker.xml] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Configure connectors] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/connectors.yml for localhost
TASK [middleware_automation.amq.activemq : Create final connector list using generated and declared lists] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create connector configuration string] *****
ok: [localhost] => (item=None)
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create connector configuration in broker.xml] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Create cluster connections configuration string] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Create cluster connections in broker.xml] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Configure addresses] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/addresses.yml for localhost
TASK [middleware_automation.amq.activemq : Create addresses string] *****
ok: [localhost] => (item=queue.in)
ok: [localhost] => (item=queue.out)
ok: [localhost] => (item=DLQ)
ok: [localhost] => (item=ExpiryQueue)
TASK [middleware_automation.amq.activemq : Create addresses configuration in broker.xml] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Configure journal] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/journal.yml for localhost
TASK [middleware_automation.amq.activemq : Set journal configuration] *****
ok: [localhost] => (item=persistence_enabled)
changed: [localhost] => (item=persist_id_cache)
changed: [localhost] => (item=id_cache_size)
ok: [localhost] => (item=journal_type)
ok: [localhost] => (item=paging_directory)
ok: [localhost] => (item=bindings_directory)
ok: [localhost] => (item=journal_directory)
changed: [localhost] => (item=large_messages_directory)
ok: [localhost] => (item=journal_datasync)
ok: [localhost] => (item=journal_min_files)
ok: [localhost] => (item=journal_pool_files)
ok: [localhost] => (item=journal_device_block_size)
ok: [localhost] => (item=journal_file_size)
changed: [localhost] => (item=journal_buffer_timeout)
ok: [localhost] => (item=journal_max_io)
changed: [localhost] => (item=configuration_file_refresh_period)
ok: [localhost] => (item=global_max_messages)
changed: [localhost] => (item=global_max_size)
changed: [localhost] => (item=password_codec)
TASK [middleware_automation.amq.activemq : Create user and roles] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/user_roles.yml for localhost
TASK [middleware_automation.amq.activemq : Retrieve existing users] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Set masked user passwords] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/mask_password.yml for localhost => (item=amq-broker)
TASK [middleware_automation.amq.activemq : Parse passwd hash for existing user] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Parse passwd for existing user salt] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Get masked password for user] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Add masked password to users list] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Add existing user to users list] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Configure users] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Configure roles] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Create security settings matches] *****
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
ok: [localhost] => (item=#)
TASK [middleware_automation.amq.activemq : Create security settings] *****
ok: [localhost] => (item={'match': '#', 'permissions': {'createNonDurableQueue': ['amq'], 'deleteNonDurableQueue': ['amq'], 'createDurableQueue': ['amq'], 'deleteDurableQueue': ['amq'], 'createAddress': ['amq'], 'deleteAddress': ['amq'], 'consume': ['amq'], 'browse': ['amq'], 'send': ['amq'], 'manage': ['amq']}})
TASK [middleware_automation.amq.activemq : Create messaging roles permissions] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Update hawtio role] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Ensure management.xml has no comments (lxml bug workaround)] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Create management access default] *****
ok: [localhost] => (item=list*)
ok: [localhost] => (item=get*)
ok: [localhost] => (item=is*)
ok: [localhost] => (item=set*)
ok: [localhost] => (item=browse*)
ok: [localhost] => (item=count*)
ok: [localhost] => (item=*)
TASK [middleware_automation.amq.activemq : Create management access domains] *****
ok: [localhost] => (item=org.apache.activemq.artemis)
ok: [localhost] => (item=java.lang)
TASK [middleware_automation.amq.activemq : Create management default access configuration] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Create management access roles configuration] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Configure address settings] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/address_settings.yml for localhost
TASK [middleware_automation.amq.activemq : Create address settings configuration string] *****
ok: [localhost] => (item=activemq.management#)
ok: [localhost] => (item=#)
TASK [middleware_automation.amq.activemq : Create address settings configuration in broker.xml] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Configure diverts] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Configure broker connections] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Configure jolokia access] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Configure jaas] *****
changed: [localhost]
TASK [middleware_automation.amq.activemq : Find available library path from installation] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Ensure lib is available to instance] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Configure metrics plugin] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Reload systemd] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Verify file ownership] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Flush handlers] *****
RUNNING HANDLER [middleware_automation.amq.activemq : Restart handler] *****
included: /home/XXX/.ansible/collections/ansible_collections/middleware_automation/amq/roles/activemq/tasks/restart.yml for localhost
RUNNING HANDLER [middleware_automation.amq.activemq : Restart and enable instance amq-broker for activemq service] *****
changed: [localhost]
RUNNING HANDLER [middleware_automation.amq.activemq : Configure systemd unit override] *****
skipping: [localhost]
RUNNING HANDLER [middleware_automation.amq.activemq : Restart service] *****
skipping: [localhost]
RUNNING HANDLER [middleware_automation.amq.activemq : Delete systemd unit override] *****
skipping: [localhost]
RUNNING HANDLER [middleware_automation.amq.activemq : Ensure daemon reload] *****
skipping: [localhost]
RUNNING HANDLER [middleware_automation.amq.activemq : Restart handler] *****
skipping: [localhost]
TASK [middleware_automation.amq.activemq : Check service status] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Verify service status] *****
ok: [localhost] => {
"changed": false,
"msg": "All assertions passed"
}
TASK [middleware_automation.amq.activemq : Create default logs directory] *****
ok: [localhost]
TASK [middleware_automation.amq.activemq : Link default logs directory] *****
ok: [localhost]
PLAY RECAP *****
localhost : ok=164 changed=19 unreachable=0 failed=0 skipped=53 rescued=0 ignored=0
Script done on 2024-01-26 11:59:05+01:00
Cannot install version middleware_automation.amq:1.3.11 with Ansible Galaxy.
ansible-galaxy collection install -r requirements/requirements.yml
❯ cat requirements/requirements.yml
---
collections:
  - name: middleware_automation.common
  - name: ansible.posix
  - name: middleware_automation.amq
    version: 1.3.11
I expected version 1.3.11 to be installed.
ansible-galaxy collection install -r requirements/requirements.yml
Starting galaxy collection install process
Process install dependency map
ERROR! Failed to resolve the requested dependencies map. Could not satisfy the following requirements:
* middleware_automation.amq:1.3.11 (direct request)
We would like to enable logging in to the broker via LDAP by providing our own login.config
template file for the playbook. Would this be possible?
Allow certain configurations to be kept in one or more separate, modularized files, imported via XInclude [1]
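For illustration, a modularized broker.xml could look roughly like this — upstream Artemis resolves XInclude references at broker start. The fragment file names (address-settings.xml, security-settings.xml) are hypothetical:

```xml
<configuration xmlns="urn:activemq"
               xmlns:xi="http://www.w3.org/2001/XInclude">
   <core xmlns="urn:activemq:core">
      <!-- hypothetical split-out fragments, kept next to broker.xml -->
      <xi:include href="address-settings.xml"/>
      <xi:include href="security-settings.xml"/>
   </core>
</configuration>
```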
RH docs explain how to enable the Prometheus metrics plugin at https://access.redhat.com/documentation/en-us/red_hat_amq_broker/7.11/html-single/managing_amq_broker/index#assembly-br-monitoring-broker-runtime-metrics_managing but I can't see how you'd do this using the Ansible module.
The TL;DR is:
cp amq/apache-artemis-2.28.0/lib/artemis-prometheus-metrics-plugin-2.1.0.redhat-00001.jar amq/one/lib/artemis-prometheus-metrics-plugin-2.1.0.redhat-00001.jar
<metrics>
<plugin class-name="com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin"/>
</metrics>
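Until the role handles this, the two manual steps above could be sketched as ad-hoc Ansible tasks. This is only a sketch: the paths and the `one` instance name come from my environment, and the XPath/namespace handling assumes the community.general.xml module is available:

```yaml
# Sketch only: mirrors the manual cp + <metrics> edit above.
- name: Copy Prometheus metrics plugin jar into the instance lib dir
  ansible.builtin.copy:
    remote_src: true
    src: /opt/amq/apache-artemis-2.28.0/lib/artemis-prometheus-metrics-plugin-2.1.0.redhat-00001.jar
    dest: /opt/amq/one/lib/
    owner: amq-broker
    group: amq-broker

- name: Add the metrics plugin element to broker.xml
  community.general.xml:
    path: /opt/amq/one/etc/broker.xml
    xpath: /c:configuration/a:core/a:metrics/a:plugin
    namespaces:
      c: urn:activemq
      a: urn:activemq:core
    attribute: class-name
    value: com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin
```

After that the broker needs a restart for the plugin to load.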
my RH ID: gahealy@redhat
Hi, we are back to testing the AMQ playbooks. When I run the newest version of the playbook, the broker cannot start because the data directory is owned by root. We are running an NFS-mounted directory like this:
sudo mount -t nfs amqtest2.file.core.windows.net:/amqtest2/amqdata /opt/amq/amq-broker/data -o vers=4,minorversion=1,sec=sys
This is our playbook:
---
- name: Playbook for Red Hat AMQ Broker
  hosts: all
  vars:
    activemq_ha_enabled: true
  collections:
    - middleware_automation.redhat_csp_download
    - middleware_automation.amq
  roles:
    - activemq
The problem is that the data directory, including the PID file, is owned by root:
[azureroot@vm-amq-ansible-1 data]$ ll
total 3
-rw-r--r--. 1 root root 7 Nov 17 08:17 artemis.pid
drwxr-xr-x. 2 root root 128 Nov 17 08:18 bindings
drwxr-xr-x. 2 root root 320 Nov 17 08:18 journal
drwxr-xr-x. 2 root root 64 Nov 17 08:17 large-messages
drwxr-xr-x. 2 root root 64 Nov 17 08:17 paging
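As a workaround until the role sets ownership on NFS-backed data directories, something like the following restores the expected owner. A sketch only: `amq-broker` is the user/group the role creates by default, and the path is the one from this report — adjust both if you override them:

```yaml
- name: Ensure the broker data directory is owned by the service user
  become: true
  ansible.builtin.file:
    path: /opt/amq/amq-broker/data
    state: directory
    owner: amq-broker
    group: amq-broker
    recurse: true
```

Note that chown on the NFS mount only succeeds if the export does not squash root (with sec=sys and no root squash it should work).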
Hide keystore password from output. The output of an Ansible run contains the keystore password in plain text, both in the start-broker section and in the acceptors section. Would it be possible to hide this with a no_log option?
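In the meantime, any task that echoes the keystore password can be wrapped with Ansible's built-in `no_log` directive; a minimal sketch (the task and the `activemq_instance_home` variable are illustrative, not the role's actual implementation):

```yaml
- name: Start broker instance (credentials suppressed from output)
  ansible.builtin.command:
    cmd: "{{ activemq_instance_home }}/bin/artemis-service start"
  no_log: true   # suppresses module args and results, including embedded passwords
```

The trade-off is that `no_log: true` also hides useful diagnostics when the task fails, so the role would ideally apply it only to the tasks that actually handle secrets.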
In our setup we plan to forward the Artemis logs to Elasticsearch. To enable that, we would need to use our own log4j2.properties or logging.properties template.
Our idea was to use the existing option (amq_broker_logger_config_template) to override this template, but it doesn't work as expected; the variable is only there for selecting the right logger config file:
amq_broker_logger_config_template: "{{ 'log4j2.properties' if amq_broker_version is version_compare('7.11.0', '>=') else 'logging.properties' }}"
And it's then used in the configuration of the broker as:
ansible.builtin.template:
  src: "{{ amq_broker_logger_config_template }}.j2"
  dest: "{{ amq_broker.instance_home }}/etc/{{ amq_broker_logger_config_template }}"
Which, of course, will not work when amq_broker_logger_config_template is a path.
Maybe the intention for amq_broker_logger_config_template was not to allow the user to override the template, so you may consider this not a bug but rather a feature request...
Having the possibility to use our own template for the broker logging configuration, similar to what is already implemented for the systemd service, would solve this.
Maybe a possible solution would be not to completely override the template, but to have the option to specify a "template path" and keep the decision of which template (logging or log4j2) to use up to the role.
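One way the role could support this — a sketch of the proposal, not the current implementation: keep the version-based file-name selection, but add a separate, overridable template-source variable (the `amq_broker_logger_config_template_src` name is hypothetical):

```yaml
# Existing variable, unchanged: selects the right file *name* per version.
amq_broker_logger_config_template: "{{ 'log4j2.properties' if amq_broker_version is version_compare('7.11.0', '>=') else 'logging.properties' }}"
# New, user-overridable: the template *source* to render, defaulting to the
# role's bundled template for the selected file name.
amq_broker_logger_config_template_src: "{{ amq_broker_logger_config_template }}.j2"
```

The task would then render whatever source the user points at:

```yaml
- name: Configure AMQ broker logging
  ansible.builtin.template:
    src: "{{ amq_broker_logger_config_template_src }}"
    dest: "{{ amq_broker.instance_home }}/etc/{{ amq_broker_logger_config_template }}"
```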
After updating to version 1.3.2 of the collection I can no longer start the broker via systemctl. I can start the master via systemctl, but the second broker fails to start. We are deploying a two-node shared-storage setup with an NFS mount. In versions 1.3.0 and 1.3.1 the same playbook works fine.
ansible [core 2.14.5]
config file = /home/robert/asb2/AMQ-Ansible-config/ansible-configuration/ansible.cfg
configured module search path = ['/home/robert/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/linuxbrew/.linuxbrew/Cellar/ansible/7.5.0/libexec/lib/python3.11/site-packages/ansible
ansible collection location = /home/robert/.ansible/collections:/usr/share/ansible/collections
executable location = /home/linuxbrew/.linuxbrew/bin/ansible
python version = 3.11.3 (main, Apr 4 2023, 22:36:41) [GCC 11.3.0] (/home/linuxbrew/.linuxbrew/Cellar/ansible/7.5.0/libexec/bin/python3.11)
jinja version = 3.1.2
libyaml = True
# /home/linuxbrew/.linuxbrew/Cellar/ansible/7.5.0/libexec/lib/python3.11/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 5.4.0
ansible.netcommon 4.1.0
ansible.posix 1.5.2
ansible.utils 2.9.0
ansible.windows 1.13.0
arista.eos 6.0.1
awx.awx 21.14.0
azure.azcollection 1.15.0
check_point.mgmt 4.0.0
chocolatey.chocolatey 1.4.0
cisco.aci 2.6.0
cisco.asa 4.0.0
cisco.dnac 6.7.1
cisco.intersight 1.0.27
cisco.ios 4.5.0
cisco.iosxr 4.1.0
cisco.ise 2.5.12
cisco.meraki 2.15.1
cisco.mso 2.4.0
cisco.nso 1.0.3
cisco.nxos 4.3.0
cisco.ucs 1.8.0
cloud.common 2.1.3
cloudscale_ch.cloud 2.2.4
community.aws 5.4.0
community.azure 2.0.0
community.ciscosmb 1.0.5
community.crypto 2.12.0
community.digitalocean 1.23.0
community.dns 2.5.3
community.docker 3.4.3
community.fortios 1.0.0
community.general 6.6.0
community.google 1.0.0
community.grafana 1.5.4
community.hashi_vault 4.2.0
community.hrobot 1.8.0
community.libvirt 1.2.0
community.mongodb 1.5.2
community.mysql 3.6.0
community.network 5.0.0
community.okd 2.3.0
community.postgresql 2.3.2
community.proxysql 1.5.1
community.rabbitmq 1.2.3
community.routeros 2.8.0
community.sap 1.0.0
community.sap_libs 1.4.1
community.skydive 1.0.0
community.sops 1.6.1
community.vmware 3.5.0
community.windows 1.12.0
community.zabbix 1.9.3
containers.podman 1.10.1
cyberark.conjur 1.2.0
cyberark.pas 1.0.17
dellemc.enterprise_sonic 2.0.0
dellemc.openmanage 6.3.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
dellemc.powerflex 1.6.0
dellemc.unity 1.6.0
f5networks.f5_modules 1.23.0
fortinet.fortimanager 2.1.7
fortinet.fortios 2.2.3
frr.frr 2.0.2
gluster.gluster 1.0.2
google.cloud 1.1.3
grafana.grafana 1.1.1
hetzner.hcloud 1.11.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.11.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.4.1
inspur.ispim 1.3.0
inspur.sm 2.3.0
junipernetworks.junos 4.1.0
kubernetes.core 2.4.0
lowlydba.sqlserver 1.3.1
mellanox.onyx 1.0.0
microsoft.ad 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.22.0
netapp.elementsw 21.7.0
netapp.ontap 22.5.0
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.4.0
netbox.netbox 3.12.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.3
openstack.cloud 1.10.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.4.1
purestorage.flasharray 1.17.2
purestorage.flashblade 1.11.0
purestorage.fusion 1.4.2
sensu.sensu_go 1.13.2
splunk.es 2.1.0
t_systems_mms.icinga_director 1.32.2
theforeman.foreman 3.10.0
vmware.vmware_rest 2.3.1
vultr.cloud 1.7.0
vyos.vyos 4.0.2
wti.remote 1.0.4
# /home/robert/.ansible/collections/ansible_collections
Collection Version
----------------------------------------- -------
ansible.posix 1.5.2
community.general 6.0.1
middleware_automation.amq 1.3.2
middleware_automation.common 1.1.0
middleware_automation.redhat_csp_download 1.2.2
Run the playbook with the command:
ansible-playbook -e "activemq_version=7.10.2" -i hostfiles/AMQ-dev-shared-storage.yml playbooks/mount-nfs-install-broker.yml
all:
  children:
    amq:
      children:
        ha1:
          hosts: amq1
          vars:
            artemis: "amq1"
            node0: "amq2"
        ha2:
          hosts: amq2
          vars:
            artemis: "amq2"
            node0: "amq1"
      vars:
        iface: enp0s8
        activemq_configure_firewalld: True
        activemq_prometheus_enabled: False
        activemq_cors_strict_checking: False
        activemq_disable_hornetq_protocol: true
        activemq_disable_mqtt_protocol: true
        activemq_ha_enabled: true
        activemq_shared_storage: true
        activemq_shared_storage_path: /data/amq-broker/shared/mount
        ansible_user: ansible
        activemq_offline_install: True
        activemq_version: 7.10.2
        activemq_dest: /opt/amq
        activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
        activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
        activemq_shared_storage_mounted: true
        activemq_port: 61616
        nfs_mount_source: "192.168.2.221:/"
        activemq_instance_username: amq-admin
        activemq_instance_password: activemq_instance_password
        activemq_sa_password: "amq-sa-password"
        activemq_testers_password: "amq-testers-password"
        activemq_address_settings:
          - match: "#"
            parameters:
              dead_letter_address: DLQ
              expiry_address: ExpiryQueue
              redelivery_delay: 2000
              max_size_bytes: -1
              message_counter_history_day_limit: 10
              max_delivery_attempts: -1
              max_redelivery_delay: 300000
              redelivery_delay_multiplier: 2
              address_full_policy: PAGE
              auto_create_queues: true
              auto_create_addresses: true
              auto_create_jms_queues: true
              auto_create_jms_topics: true
        activemq_users:
          - user: "{{ activemq_instance_username }}"
            password: "{{ activemq_instance_password }}"
            roles: [ amq ]
          - user: "amq-application-sa"
            password: "{{ activemq_sa_password }}"
            roles: [ amq-sa ]
          - user: "amq-testers-sa"
            password: "{{ activemq_testers_password }}"
            roles: [ amq ]
        activemq_roles:
          - name: amq
            match: '#'
            permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
          - name: amq-sa
            match: '#'
            permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
          - name: amq-testers
            match: '#'
            permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
        activemq_acceptors:
          - name: amqp
            bind_address: "0.0.0.0"
            bind_port: "{{ activemq_port }}"
            parameters:
              tcpSendBufferSize: 1048576
              tcpReceiveBufferSize: 1048576
              protocols: CORE,AMQP,OPENWIRE
useEpoll: true
verifyHost: False
activemq_connectors:
- name: artemis
address: "{{ artemis }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,STOMP,HORNETQ,MQTT,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
- name: node0
address: "{{ node0 }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,STOMP,HORNETQ,MQTT,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
I expect both brokers to start via systemctl during the playbook run.
The second broker fails to start via systemctl, and I could not find a specific reason why it would not start.
RUNNING HANDLER [middleware_automation.amq.activemq : Restart and enable instance amq-broker for activemq service] *******************************************************************************************************************************************************
changed: [amq1]
fatal: [amq2]: FAILED! => changed=false
msg: |-
Unable to start service amq-broker: Job for amq-broker.service failed because the control process exited with error code.
See "systemctl status amq-broker.service" and "journalctl -xe" for details.
[ansible@amq2 ~]$ sudo journalctl -xe -n 200 -u amq-broker.service
--
-- Unit amq-broker.service has begun starting up.
May 05 12:59:00 amq2.test.local systemd[190182]: amq-broker.service: Executing: /opt/amq/amq-broker/bin/artemis-service start
May 05 12:59:00 amq2.test.local artemis-service[190182]: Starting artemis-service
May 05 12:59:01 amq2.test.local artemis-service[190182]: artemis-service is now running (190186)
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: Child 190182 belongs to amq-broker.service.
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: Control process exited, code=exited status=0
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: Got final SIGCHLD for state start.
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: Permission denied while opening PID file or potentially unsafe symlink chain, will now retry with relaxed checks: /opt/amq/amq-broker/data/artemis.pid
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: New main PID 190186 belongs to service, we are happy.
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: Main PID loaded: 190186
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: About to execute: /usr/bin/timeout 60 sh -c 'tail -n 15 -f /opt/amq/amq-broker/log/artemis.log | sed "/AMQ221001/ q" && /bin/sleep 10'
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: Forked /usr/bin/timeout as 190215
May 05 12:59:01 amq2.test.local systemd[1]: amq-broker.service: Changed start -> start-post
May 05 12:59:01 amq2.test.local systemd[190215]: amq-broker.service: Executing: /usr/bin/timeout 60 sh -c 'tail -n 15 -f /opt/amq/amq-broker/log/artemis.log | sed "/AMQ221001/ q" && /bin/sleep 10'
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Child 190215 belongs to amq-broker.service.
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Control process exited, code=exited status=124
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Got final SIGCHLD for state start-post.
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Changed start-post -> stop-sigterm
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Child 190217 belongs to amq-broker.service.
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Child 190186 belongs to amq-broker.service.
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Permission denied while opening PID file or potentially unsafe symlink chain, will now retry with relaxed checks: /opt/amq/amq-broker/data/artemis.pid
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Main process exited, code=exited, status=143/n/a
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Failed with result 'exit-code'.
-- Subject: Unit failed
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit amq-broker.service has entered the 'failed' state with result 'exit-code'.
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Changed stop-sigterm -> failed
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Job amq-broker.service/start finished, result=failed
May 05 13:00:01 amq2.test.local systemd[1]: Failed to start amq-broker Apache ActiveMQ Service.
-- Subject: Unit amq-broker.service has failed
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit amq-broker.service has failed.
--
-- The result is failed.
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Unit entered failed state.
May 05 13:00:01 amq2.test.local systemd[1]: amq-broker.service: Changed failed -> auto-restart
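For reference, the `status=124` in the journal is the exit code `timeout(1)` returns when the wrapped command is still running at the deadline, so the `ExecStartPost` check appears to have run the full 60 seconds without `AMQ221001` appearing in the log. A minimal reproduction of that exit code:

```shell
# timeout(1) exits with 124 when the wrapped command outlives the time limit
timeout 1 sleep 5
echo "exit code: $?"   # prints "exit code: 124"
```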
----------------------------
[ansible@amq2 ~]$ cat /etc/systemd/system/amq-broker.service
# Ansible managed
[Unit]
Description=amq-broker Apache ActiveMQ Service
After=network.target
RequiresMountsFor=/data/amq-broker/shared/mount
[Service]
Type=forking
EnvironmentFile=-/etc/sysconfig/amq-broker
PIDFile=/opt/amq/amq-broker/data/artemis.pid
ExecStart=/opt/amq/amq-broker/bin/artemis-service start
ExecStop=/opt/amq/amq-broker/bin/artemis-service stop
SuccessExitStatus = 0 143
RestartSec = 120
Restart = on-failure
LimitNOFILE=102642
TimeoutSec=600
ExecStartPost=/usr/bin/timeout 60 sh -c 'tail -n 15 -f /opt/amq/amq-broker/log/artemis.log | sed "/AMQ221001/ q" && /bin/sleep 10'
[Install]
WantedBy=multi-user.target
---------------------------------
[ansible@amq2 ~]$ cat /etc/sysconfig/amq-broker
# Ansible managed
JAVA_ARGS='-Xms512M -Xmx2G -XX:+PrintClassHistogram -XX:+UseG1GC -XX:+UseStringDeduplication -Dhawtio.disableProxy=true -Dhawtio.realm=activemq -Dhawtio.offline=true -Dhawtio.rolePrincipalClasses=org.apache.activemq.artemis.spi.core.security.jaas.RolePrincipal -Djolokia.policyLocation=file:/opt/amq/amq-broker/etc/jolokia-access.xml'
JAVA_HOME=/usr/lib/jvm/java-11-openjdk-11.0.19.0.7-1.el8_7.x86_64
HAWTIO_ROLE='amq'
ARTEMIS_INSTANCE_URI='file:/opt/amq/amq-broker/'
ARTEMIS_INSTANCE_ETC_URI='file:/opt/amq/amq-broker/etc/'
ARTEMIS_HOME='/opt/amq/amq-broker-7.10.2'
ARTEMIS_INSTANCE='/opt/amq/amq-broker'
ARTEMIS_DATA_DIR='/data/amq-broker/shared/mount'
ARTEMIS_ETC_DIR='/opt/amq/amq-broker/etc'
Cannot log in to the web console with the default credentials: username amq-broker
and password amq-broker.
https://192.168.2.211:8161/console/auth/login
(192.168.2.211 is my VM running AMQ)
I expected to enter the AMQ web console with these credentials.
[root@amq1 etc]# cat login.config
activemq {
org.apache.activemq.artemis.spi.core.security.jaas.PropertiesLoginModule required
debug=true
reload=true
org.apache.activemq.jaas.properties.user="artemis-users.properties"
org.apache.activemq.jaas.properties.role="artemis-roles.properties";
};
The Prometheus plugin filename depends on the version in the install zipfile, so it needs to be parameterized.
block:
  - name: Ensure lib is available to instance
    ansible.builtin.copy:
      src: "{{ activemq.home }}/lib/artemis-prometheus-metrics-plugin-1.1.0.redhat-00002.jar"
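A sketch of how the jar name could be parameterized instead of hard-coded; the `activemq_prometheus_jar_version` variable and the `dest` path are hypothetical, not part of the collection:

```yaml
- name: Ensure lib is available to instance
  ansible.builtin.copy:
    # version comes from a variable instead of a hard-coded 1.1.0.redhat-00002
    src: "{{ activemq.home }}/lib/artemis-prometheus-metrics-plugin-{{ activemq_prometheus_jar_version }}.jar"
    dest: "{{ activemq.home }}/lib/"  # hypothetical destination
    remote_src: true
  vars:
    activemq_prometheus_jar_version: 1.1.0.redhat-00002
```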
The default installation makes root the owner of the folder /opt/amq/amq-broker and its subfolders. I believe the amq-broker user should own this directory. Root ownership breaks the task that masks the passwords (actual command: /opt/amq/amq-broker/bin/artemis mask -- amq-broker).
❯ ansible --version
ansible [core 2.13.3]
config file = None
configured module search path = ['/Users/robertfloor/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/Cellar/ansible/6.3.0/libexec/lib/python3.10/site-packages/ansible
ansible collection location = /Users/robertfloor/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.10.6 (main, Aug 11 2022, 13:49:25) [Clang 13.1.6 (clang-1316.0.21.2.5)]
jinja version = 3.1.2
libyaml = True
# /Users/robertfloor/.ansible/collections/ansible_collections
Collection Version
----------------------------------------- -------
ansible.posix 1.4.0
community.general 3.1.0
geerlingguy.mac 1.1.2
haAMQ.ansibleCollection 1.0.0
middleware_automation.redhat_csp_download 1.2.2
Run the playbook as specified in the readme:
ansible-playbook -i hosts_vagrant.yml activemq.yml -v
It fails on this task:
---
- name: Set masked user password
block:
- name: Get masked password for user
ansible.builtin.command: "{{ amq_broker.instance_home }}/bin/artemis mask -- '{{ item.password }}'"
register: mask_pwd
changed_when: False
#no_log: True
- name: Add masked password to users list
ansible.builtin.set_fact:
amq_broker_masked_users: "{{ amq_broker_masked_users | default([]) + [ { 'user': item.user, 'password': mask_pwd.stdout | replace('result: ',''), 'role': item.role } ] }}"
#no_log: True
when: item.password is defined and item.password | length > 0
fatal: [192.168.2.212]: FAILED! => {"changed": false, "cmd": "/opt/amq/amq-broker/bin/artemis mask -- amq-broker", "msg": "[Errno 13] Permission denied: b'/opt/amq/amq-broker/bin/artemis'", "rc": 13, "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
fatal: [192.168.2.211]: FAILED! => {"changed": false, "cmd": "/opt/amq/amq-broker/bin/artemis mask -- amq-broker", "msg": "[Errno 13] Permission denied: b'/opt/amq/amq-broker/bin/artemis'", "rc": 13, "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
I believe it is caused by root's ownership of the amq-broker directory:
[root@amq1 amq-broker]# ll
total 0
drwxr-xr-x. 2 root root 44 Aug 25 08:15 bin
drwxr-x---. 3 amq-broker amq-broker 20 Aug 25 08:15 data
drwxr-xr-x. 2 root root 226 Aug 25 08:15 etc
drwxr-xr-x. 2 root root 6 Aug 25 08:15 lib
drwxr-xr-x. 2 root root 6 Aug 25 08:15 log
drwxr-xr-x. 2 root root 6 Aug 25 08:15 tmp
[root@amq1 amq-broker]# pwd
/opt/amq/amq-broker
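A workaround sketch (my assumption about the fix, not a task from the collection): make the broker user own the instance directory before the mask step runs:

```yaml
- name: Ensure the broker instance is owned by the amq-broker user (workaround sketch)
  ansible.builtin.file:
    path: /opt/amq/amq-broker
    state: directory
    owner: amq-broker
    group: amq-broker
    recurse: true
```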
The Ansible code creates an incorrect static-connectors section. Two static connectors are generated, but only one of them (node0) is valid; the other is not configured. In a normal active/passive cluster, only one node (the other broker) should be listed in the static-connectors part:
<cluster-connections>
<cluster-connection name="my-cluster">
<connector-ref>artemis</connector-ref>
<message-load-balancing>ON_DEMAND</message-load-balancing>
<max-hops>1</max-hops>
<static-connectors>
<connector-ref>node0</connector-ref>
<connector-ref>node1</connector-ref>
</static-connectors>
</cluster-connection>
</cluster-connections>
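For an active/passive pair, I would expect the section to reference only the other broker, e.g.:

```xml
<static-connectors>
    <connector-ref>node0</connector-ref>
</static-connectors>
```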
<ha-policy>
ansible [core 2.14.3]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/robert/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/linuxbrew/.linuxbrew/Cellar/ansible/7.3.0/libexec/lib/python3.11/site-packages/ansible
ansible collection location = /home/robert/.ansible/collections:/usr/share/ansible/collections
executable location = /home/linuxbrew/.linuxbrew/bin/ansible
python version = 3.11.2 (main, Feb 7 2023, 13:52:42) [GCC 11.3.0] (/home/linuxbrew/.linuxbrew/Cellar/ansible/7.3.0/libexec/bin/python3.11)
jinja version = 3.1.2
libyaml = True
# /home/linuxbrew/.linuxbrew/Cellar/ansible/7.3.0/libexec/lib/python3.11/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 5.2.0
ansible.netcommon 4.1.0
ansible.posix 1.5.1
ansible.utils 2.9.0
ansible.windows 1.13.0
arista.eos 6.0.0
awx.awx 21.12.0
azure.azcollection 1.14.0
check_point.mgmt 4.0.0
chocolatey.chocolatey 1.4.0
cisco.aci 2.4.0
cisco.asa 4.0.0
cisco.dnac 6.6.3
cisco.intersight 1.0.23
cisco.ios 4.3.1
cisco.iosxr 4.1.0
cisco.ise 2.5.12
cisco.meraki 2.15.1
cisco.mso 2.2.1
cisco.nso 1.0.3
cisco.nxos 4.1.0
cisco.ucs 1.8.0
cloud.common 2.1.2
cloudscale_ch.cloud 2.2.4
community.aws 5.2.0
community.azure 2.0.0
community.ciscosmb 1.0.5
community.crypto 2.11.0
community.digitalocean 1.23.0
community.dns 2.5.1
community.docker 3.4.2
community.fortios 1.0.0
community.general 6.4.0
community.google 1.0.0
community.grafana 1.5.4
community.hashi_vault 4.1.0
community.hrobot 1.7.0
community.libvirt 1.2.0
community.mongodb 1.5.1
community.mysql 3.6.0
community.network 5.0.0
community.okd 2.3.0
community.postgresql 2.3.2
community.proxysql 1.5.1
community.rabbitmq 1.2.3
community.routeros 2.7.0
community.sap 1.0.0
community.sap_libs 1.4.0
community.skydive 1.0.0
community.sops 1.6.1
community.vmware 3.4.0
community.windows 1.12.0
community.zabbix 1.9.2
containers.podman 1.10.1
cyberark.conjur 1.2.0
cyberark.pas 1.0.17
dellemc.enterprise_sonic 2.0.0
dellemc.openmanage 6.3.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
dellemc.powerflex 1.5.0
dellemc.unity 1.5.0
f5networks.f5_modules 1.22.1
fortinet.fortimanager 2.1.7
fortinet.fortios 2.2.2
frr.frr 2.0.0
gluster.gluster 1.0.2
google.cloud 1.1.2
grafana.grafana 1.1.1
hetzner.hcloud 1.10.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.11.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.4.1
inspur.ispim 1.3.0
inspur.sm 2.3.0
junipernetworks.junos 4.1.0
kubernetes.core 2.4.0
lowlydba.sqlserver 1.3.1
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.22.0
netapp.elementsw 21.7.0
netapp.ontap 22.3.0
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.4.0
netbox.netbox 3.11.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.3
openstack.cloud 1.10.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.4.1
purestorage.flasharray 1.17.0
purestorage.flashblade 1.10.0
purestorage.fusion 1.3.0
sensu.sensu_go 1.13.2
splunk.es 2.1.0
t_systems_mms.icinga_director 1.32.0
theforeman.foreman 3.9.0
vmware.vmware_rest 2.2.0
vultr.cloud 1.7.0
vyos.vyos 4.0.0
wti.remote 1.0.4
# /home/robert/.ansible/collections/ansible_collections
Collection Version
----------------------------------------- -------
ansible.posix 1.4.0
community.general 6.0.1
middleware_automation.amq 1.1.1
middleware_automation.common 1.0.2
middleware_automation.redhat_csp_download 1.2.2
Run the code like this
ansible-playbook -e "activemq_version=7.10.2" -e "activemq_sa_password=redhat"
-i hostfiles/AMQdev.yml playbooks/mount_and_deploy.yml -vv
With the default playbook and this hostfile.
all:
children:
amq:
children:
ha1:
hosts: amq1
vars:
artemis: "amq1"
node0: "amq2"
ha2:
hosts: amq2
vars:
artemis: "amq2"
node0: "amq1"
vars:
activemq_configure_firewalld: True
activemq_prometheus_enabled: False
activemq_cors_strict_checking: False
activemq_ha_enabled: true
activemq_shared_storage: true
activemq_shared_storage_path: /data/amq-broker/shared
ansible_user: ansible
activemq_offline_install: True
activemq_version: 7.10.2
activemq_dest: /opt/amq
activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
activemq_shared_storage_mounted: true
activemq_port: 61616
nfs_mount_source: "192.168.2.221:/"
activemq_sa_password: "asb-sa-test-password"
activemq_address_settings:
- match: "#"
parameters:
dead_letter_address: DLQ
expiry_address: ExpiryQueue
redelivery_delay: 2000
max_size_bytes: -1
message_counter_history_day_limit: 10
max_delivery_attempts: -1
max_redelivery_delay: 300000
redelivery_delay_multiplier: 2
address_full_policy: PAGE
auto_create_queues: true
auto_create_addresses: true
auto_create_jms_queues: true
auto_create_jms_topics: true
activemq_users:
- user: "{{ activemq_instance_username }}"
password: "{{ activemq_instance_password }}"
roles: [ amq ]
- user: "asb-sa"
password: "{{ activemq_sa_password }}"
roles: [ amq ]
activemq_roles:
- name: amq
match: '#'
permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
activemq_acceptors:
- name: amqp
bind_address: "0.0.0.0"
bind_port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,OPENWIRE
useEpoll: true
verifyHost: False
activemq_connectors:
- name: artemis
address: "{{ artemis }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,STOMP,HORNETQ,MQTT,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
- name: node0
address: "{{ node0 }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,STOMP,HORNETQ,MQTT,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
I would like to have only node0 in the static-connectors section of broker.xml.
These two connectors appeared in the broker.xml:
<cluster-connections>
<cluster-connection name="my-cluster">
<connector-ref>artemis</connector-ref>
<message-load-balancing>ON_DEMAND</message-load-balancing>
<max-hops>1</max-hops>
<static-connectors>
<connector-ref>node0</connector-ref>
<connector-ref>node1</connector-ref>
</static-connectors>
</cluster-connection>
</cluster-connections>
<ha-policy>
This results in the following errors in the log:
2023-03-24 08:42:42,787 INFO [org.apache.activemq.artemis.core.server] AMQ221006: Waiting to obtain live lock
2023-03-24 08:42:43,047 INFO [org.apache.activemq.artemis.core.server] AMQ221012: Using AIO Journal
2023-03-24 08:42:43,272 INFO [org.apache.activemq.artemis.core.server] AMQ221057: Global Max Size is being adjusted to 1/2 of the JVM max size (-Xmx). being defined as 1,073,741,824
2023-03-24 08:42:43,293 WARN [org.apache.activemq.artemis.core.server] AMQ222226: Connection configuration is null for connectorName node1
2023-03-24 08:42:43,481 INFO [org.apache.activemq.artemis.core.server] AMQ221043: Protocol module found: [artemis-server]. Adding protocol support for: CORE
2023-03-24 08:42:43,482 INFO [org.apache.activemq.artemis.core.server] AMQ221043: Protocol module found: [artemis-amqp-protocol]. Adding protocol support for: AMQP
2023-03-24 08:42:43,483 INFO [org.apache.activemq.artemis.core.server] AMQ221043: Protocol module found: [artemis-hornetq-protocol]. Adding protocol support for: HORNETQ
2023-03-24 08:42:43,484 INFO [org.apache.activemq.artemis.core.server] AMQ221043: Protocol module found: [artemis-mqtt-protocol]. Adding protocol support for: MQTT
2023-03-24 08:42:43,485 INFO [org.apache.activemq.artemis.core.server] AMQ221043: Protocol module found: [artemis-openwire-protocol]. Adding protocol support for: OPENWIRE
2023-03-24 08:42:43,485 INFO [org.apache.activemq.artemis.core.server] AMQ221043: Protocol module found: [artemis-stomp-protocol]. Adding protocol support for: STOMP
2023-03-24 08:42:43,947 WARN [org.apache.activemq.artemis.core.server] AMQ222226: Connection configuration is null for connectorName node1
2023-03-24 08:42:43,952 ERROR [org.apache.activemq.artemis.core.server] AMQ224087: Error announcing backup: backupServerLocator is null. org.apache.activemq.artemis.core.server.cluster.BackupManager$BackupConnector$1@40583b01
2023-03-24 08:42:44,050 INFO [org.apache.activemq.artemis.core.server] AMQ221034: Waiting indefinitely to obtain live lock
[ansible@amq1 log]$
Having multiple role definitions for the same queue selector results in duplicate XML blocks for this section, which is not the correct syntax.
ansible --version
ansible [core 2.14.4]
config file = None
configured module search path = ['/home/robert/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/linuxbrew/.linuxbrew/Cellar/ansible/7.4.0/libexec/lib/python3.11/site-packages/ansible
ansible collection location = /home/robert/.ansible/collections:/usr/share/ansible/collections
executable location = /home/linuxbrew/.linuxbrew/bin/ansible
python version = 3.11.2 (main, Feb 7 2023, 13:52:42) [GCC 11.3.0] (/home/linuxbrew/.linuxbrew/Cellar/ansible/7.4.0/libexec/bin/python3.11)
jinja version = 3.1.2
libyaml = True
# /home/linuxbrew/.linuxbrew/Cellar/ansible/7.4.0/libexec/lib/python3.11/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 5.4.0
ansible.netcommon 4.1.0
ansible.posix 1.5.1
ansible.utils 2.9.0
ansible.windows 1.13.0
arista.eos 6.0.0
awx.awx 21.14.0
azure.azcollection 1.15.0
check_point.mgmt 4.0.0
chocolatey.chocolatey 1.4.0
cisco.aci 2.4.0
cisco.asa 4.0.0
cisco.dnac 6.6.4
cisco.intersight 1.0.24
cisco.ios 4.4.0
cisco.iosxr 4.1.0
cisco.ise 2.5.12
cisco.meraki 2.15.1
cisco.mso 2.2.1
cisco.nso 1.0.3
cisco.nxos 4.1.0
cisco.ucs 1.8.0
cloud.common 2.1.3
cloudscale_ch.cloud 2.2.4
community.aws 5.4.0
community.azure 2.0.0
community.ciscosmb 1.0.5
community.crypto 2.11.1
community.digitalocean 1.23.0
community.dns 2.5.2
community.docker 3.4.3
community.fortios 1.0.0
community.general 6.5.0
community.google 1.0.0
community.grafana 1.5.4
community.hashi_vault 4.2.0
community.hrobot 1.8.0
community.libvirt 1.2.0
community.mongodb 1.5.1
community.mysql 3.6.0
community.network 5.0.0
community.okd 2.3.0
community.postgresql 2.3.2
community.proxysql 1.5.1
community.rabbitmq 1.2.3
community.routeros 2.8.0
community.sap 1.0.0
community.sap_libs 1.4.1
community.skydive 1.0.0
community.sops 1.6.1
community.vmware 3.5.0
community.windows 1.12.0
community.zabbix 1.9.2
containers.podman 1.10.1
cyberark.conjur 1.2.0
cyberark.pas 1.0.17
dellemc.enterprise_sonic 2.0.0
dellemc.openmanage 6.3.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
dellemc.powerflex 1.5.0
dellemc.unity 1.5.0
f5networks.f5_modules 1.23.0
fortinet.fortimanager 2.1.7
fortinet.fortios 2.2.3
frr.frr 2.0.0
gluster.gluster 1.0.2
google.cloud 1.1.3
grafana.grafana 1.1.1
hetzner.hcloud 1.10.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.11.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.4.1
inspur.ispim 1.3.0
inspur.sm 2.3.0
junipernetworks.junos 4.1.0
kubernetes.core 2.4.0
lowlydba.sqlserver 1.3.1
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.22.0
netapp.elementsw 21.7.0
netapp.ontap 22.4.1
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.4.0
netbox.netbox 3.11.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.3
openstack.cloud 1.10.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.4.1
purestorage.flasharray 1.17.2
purestorage.flashblade 1.10.0
purestorage.fusion 1.4.1
sensu.sensu_go 1.13.2
splunk.es 2.1.0
t_systems_mms.icinga_director 1.32.2
theforeman.foreman 3.9.0
vmware.vmware_rest 2.3.1
vultr.cloud 1.7.0
vyos.vyos 4.0.1
wti.remote 1.0.4
all:
children:
amq:
children:
ha1:
hosts: amq1
vars:
artemis: "amq1"
node0: "amq2"
ha2:
hosts: amq2
vars:
artemis: "amq2"
node0: "amq1"
vars:
iface: enp0s8
activemq_configure_firewalld: True
activemq_prometheus_enabled: False
activemq_cors_strict_checking: False
activemq_ha_enabled: true
activemq_shared_storage: true
activemq_shared_storage_path: /data/amq-broker/shared
ansible_user: ansible
#ansible_ssh_private_key_file: hostfiles/privkey
activemq_offline_install: True
activemq_version: 7.10.2
activemq_dest: /opt/amq
activemq_archive: "amq-broker-{{ activemq_version }}-bin.zip"
activemq_installdir: "{{ activemq_dest }}/amq-broker-{{ activemq_version }}"
activemq_shared_storage_mounted: true
activemq_port: 61616
nfs_mount_source: "192.168.2.221:/"
activemq_instance_username: amq-admin
# activemq_instance_password: activemq_instance_password
# activemq_sa_password: "asb-sa-password"
# activemq_testers_password: "asb-testers-password"
activemq_address_settings:
- match: "#"
parameters:
dead_letter_address: DLQ
expiry_address: ExpiryQueue
redelivery_delay: 2000
max_size_bytes: -1
message_counter_history_day_limit: 10
max_delivery_attempts: -1
max_redelivery_delay: 300000
redelivery_delay_multiplier: 2
address_full_policy: PAGE
auto_create_queues: true
auto_create_addresses: true
auto_create_jms_queues: true
auto_create_jms_topics: true
activemq_users:
- user: "{{ activemq_instance_username }}"
password: "{{ activemq_instance_password }}"
roles: [ amq ]
- user: "asb-application-sa"
password: "{{ activemq_sa_password }}"
roles: [ amq-sa ]
- user: "asb-testers-sa"
password: "{{ activemq_testers_password }}"
roles: [ amq-testers ]
activemq_roles:
- name: amq
match: '#'
permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
- name: amq-sa
match: '#'
permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
- name: amq-testers
match: '#'
permissions: [ createDurableQueue, deleteDurableQueue, createAddress, deleteAddress, consume, browse, send, manage ]
activemq_acceptors:
- name: amqp
bind_address: "0.0.0.0"
bind_port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,OPENWIRE
useEpoll: true
verifyHost: False
activemq_connectors:
- name: artemis
address: "{{ artemis }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,STOMP,HORNETQ,MQTT,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
- name: node0
address: "{{ node0 }}"
port: "{{ activemq_port }}"
parameters:
tcpSendBufferSize: 1048576
tcpReceiveBufferSize: 1048576
protocols: CORE,AMQP,STOMP,HORNETQ,MQTT,OPENWIRE
useEpoll: true
amqpMinLargeMessageSize: 102400
amqpCredits: 1000
amqpLowCredits: 300
amqpDuplicateDetection: true
supportAdvisory: False
suppressInternalManagementObjects: False
The broker.xml should contain this part:
<security-setting match="#">
<permission type="createDurableQueue" roles="amq,amq-sa,amq-testers"/>
<permission type="deleteDurableQueue" roles="amq,amq-sa,amq-testers"/>
<permission type="createAddress" roles="amq,amq-sa,amq-testers"/>
<permission type="deleteAddress" roles="amq,amq-sa,amq-testers"/>
<permission type="consume" roles="amq,amq-sa,amq-testers"/>
<permission type="browse" roles="amq,amq-sa,amq-testers"/>
<permission type="send" roles="amq,amq-sa,amq-testers"/>
<permission type="manage" roles="amq,amq-sa,amq-testers"/>
</security-setting>
An incorrect broker.xml was created. It contains a triplicated block, which is not a valid configuration for AMQ. I discussed this with Red Hat Support earlier in case 02925153, and this configuration is not valid. If a client tries to create a queue it gets:
2023-04-06T16:03:07.154|XXXXXXX-2|||WARN|Open of resource:(JmsConsumerInfo: { ID:XXXXXX, destination = XXXXXX}) failed: AMQ119015: not authorized to create consumer, AMQ229032: User: XXXXX does not have permission='CREATE_ADDRESS' on address XXXXX [condition = amqp:unauthorized-access]
This is caused by this triplicated part of the broker.xml:
<security-settings>
<security-setting match="#">
<permission type="createDurableQueue" roles="amq"/>
<permission type="deleteDurableQueue" roles="amq"/>
<permission type="createAddress" roles="amq"/>
<permission type="deleteAddress" roles="amq"/>
<permission type="consume" roles="amq"/>
<permission type="browse" roles="amq"/>
<permission type="send" roles="amq"/>
<permission type="manage" roles="amq"/>
</security-setting>
<security-setting match="#">
<permission type="createDurableQueue" roles="amq-sa"/>
<permission type="deleteDurableQueue" roles="amq-sa"/>
<permission type="createAddress" roles="amq-sa"/>
<permission type="deleteAddress" roles="amq-sa"/>
<permission type="consume" roles="amq-sa"/>
<permission type="browse" roles="amq-sa"/>
<permission type="send" roles="amq-sa"/>
<permission type="manage" roles="amq-sa"/>
</security-setting>
<security-setting match="#">
<permission type="createDurableQueue" roles="amq-testers"/>
<permission type="deleteDurableQueue" roles="amq-testers"/>
<permission type="createAddress" roles="amq-testers"/>
<permission type="deleteAddress" roles="amq-testers"/>
<permission type="consume" roles="amq-testers"/>
<permission type="browse" roles="amq-testers"/>
<permission type="send" roles="amq-testers"/>
<permission type="manage" roles="amq-testers"/>
</security-setting>
</security-settings>
AMQ Broker 7.11 supports a newer set of features for mirroring broker messages and acknowledgments between remote datacenter or cloud sites at the broker level, instead of at the storage or infrastructure level. These are high-impact, sought-after features for the many customers and users who require business continuity and disaster recovery in their messaging architecture. As of 7.11, synchronous mirroring [1] is supported; prior versions supported asynchronous mirroring. (Note this is different from clustering brokers within a single site.)
Using Ansible to automate the complex multi-site mirror configuration, as well as the validation and smoke testing of the topology after provisioning, would be highly valuable to the product, project, and user base.
[1]
Synchronous mirroring support
Starting in 7.11, you can configure synchronous mirroring between brokers to ensure that messages are written to the volumes of both brokers in the mirror at the same time. By using synchronous mirroring, you ensure that the mirrored broker is up-to-date for disaster recovery. For more information, see Configuring broker connections in Configuring AMQ Broker.
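As a rough sketch, broker-level mirroring in Artemis/AMQ is configured with a `<broker-connections>` element in broker.xml; the URI and connection name below are placeholders, not values from this issue:

```xml
<broker-connections>
    <!-- AMQP connection to the remote-site broker; <mirror/> forwards messages and acks -->
    <amqp-connection uri="tcp://dc2-broker:61616" name="dc2-mirror">
        <mirror/>
    </amqp-connection>
</broker-connections>
```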
The artemis mask command writes additional output when other jars, such as the Prometheus jar, are present in the lib folder. This results in a misconfiguration. Perhaps it is a problem in the upstream version?
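A defensive way to extract the masked value, assuming (as the collection's task does) that artemis mask prints a line starting with `result: ` and that plugin jars may add extra lines around it; this is a sketch, not the collection's code:

```python
def extract_masked_password(output: str) -> str:
    """Pick the 'result: <value>' line out of `artemis mask` output,
    ignoring any extra lines printed by jars on the classpath."""
    for line in output.splitlines():
        if line.startswith("result: "):
            return line[len("result: "):].strip()
    raise ValueError("no 'result:' line in artemis mask output")

# Example with hypothetical plugin noise before the result line
noisy = "INFO: loading artemis-prometheus-metrics plugin\nresult: 662d58ba438c2c2d\n"
print(extract_masked_password(noisy))  # prints 662d58ba438c2c2d
```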
[default@c6ca8d3c3c60 /]$ ansible --version
ansible [core 2.15.3]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/default/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/default/.local/lib/python3.11/site-packages/ansible
ansible collection location = /home/default/.ansible/collections:/usr/share/ansible/collections
executable location = /home/default/.local/bin/ansible
python version = 3.11.2 (main, Jun 6 2023, 07:39:01) [GCC 8.5.0 20210514 (Red Hat 8.5.0-18)] (/usr/bin/python3.11)
jinja version = 3.1.2
libyaml = True
[default@c6ca8d3c3c60 /]$ ansible-galaxy collection list
# /home/default/.ansible/collections/ansible_collections
Collection Version
----------------------------- -------
middleware_automation.amq 1.3.10
middleware_automation.common 1.1.2
# /home/default/.local/lib/python3.11/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 6.3.0
ansible.netcommon 5.1.2
ansible.posix 1.5.4
ansible.utils 2.10.3
ansible.windows 1.14.0
arista.eos 6.0.1
awx.awx 22.6.0
azure.azcollection 1.16.0
check_point.mgmt 5.1.1
chocolatey.chocolatey 1.5.1
cisco.aci 2.7.0
cisco.asa 4.0.1
cisco.dnac 6.7.3
cisco.intersight 1.0.27
cisco.ios 4.6.1
cisco.iosxr 5.0.3
cisco.ise 2.5.14
cisco.meraki 2.15.3
cisco.mso 2.5.0
cisco.nso 1.0.3
cisco.nxos 4.4.0
cisco.ucs 1.10.0
cloud.common 2.1.4
cloudscale_ch.cloud 2.3.1
community.aws 6.2.0
community.azure 2.0.0
community.ciscosmb 1.0.6
community.crypto 2.15.0
community.digitalocean 1.24.0
community.dns 2.6.0
community.docker 3.4.8
community.fortios 1.0.0
community.general 7.3.0
community.google 1.0.0
community.grafana 1.5.4
community.hashi_vault 5.0.0
community.hrobot 1.8.1
community.libvirt 1.2.0
community.mongodb 1.6.1
community.mysql 3.7.2
community.network 5.0.0
community.okd 2.3.0
community.postgresql 2.4.3
community.proxysql 1.5.1
community.rabbitmq 1.2.3
community.routeros 2.9.0
community.sap 1.0.0
community.sap_libs 1.4.1
community.skydive 1.0.0
community.sops 1.6.4
community.vmware 3.9.0
community.windows 1.13.0
community.zabbix 2.1.0
containers.podman 1.10.2
cyberark.conjur 1.2.0
cyberark.pas 1.0.19
dellemc.enterprise_sonic 2.2.0
dellemc.openmanage 7.6.1
dellemc.powerflex 1.7.0
dellemc.unity 1.7.1
f5networks.f5_modules 1.25.1
fortinet.fortimanager 2.2.1
fortinet.fortios 2.3.1
frr.frr 2.0.2
gluster.gluster 1.0.2
google.cloud 1.2.0
grafana.grafana 2.1.5
hetzner.hcloud 1.16.0
hpe.nimble 1.1.4
ibm.qradar 2.1.0
ibm.spectrum_virtualize 1.12.0
infinidat.infinibox 1.3.12
infoblox.nios_modules 1.5.0
inspur.ispim 1.3.0
inspur.sm 2.3.0
junipernetworks.junos 5.2.0
kubernetes.core 2.4.0
lowlydba.sqlserver 2.1.0
microsoft.ad 1.3.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.22.0
netapp.elementsw 21.7.0
netapp.ontap 22.7.0
netapp.storagegrid 21.11.1
netapp.um_info 21.8.0
netapp_eseries.santricity 1.4.0
netbox.netbox 3.13.0
ngine_io.cloudstack 2.3.0
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.3
openstack.cloud 2.1.0
openvswitch.openvswitch 2.1.1
ovirt.ovirt 3.1.2
purestorage.flasharray 1.20.0
purestorage.flashblade 1.12.1
purestorage.fusion 1.6.0
sensu.sensu_go 1.14.0
servicenow.servicenow 1.0.6
splunk.es 2.1.0
t_systems_mms.icinga_director 1.33.1
telekom_mms.icinga_director 1.34.1
theforeman.foreman 3.12.0
vmware.vmware_rest 2.3.1
vultr.cloud 1.8.0
vyos.vyos 4.1.0
wti.remote 1.0.5
This works
[root@amq1 bin]# ./artemis mask --password-codec --hash test
2023-09-08 14:37:19,168 INFO [org.apache.activemq.artemis.core.server] AMQ221082: Initializing metrics plugin com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin with properties: {}
result: c5a9a21e812fea2794df0114fa7c78b0:a52d7c2dff4a36c7b4ed381b067872b154e88e0b0017ec9a53defdbe865b67161bf4b32f0c22e6928a015afcc5aef512ab0b55f0237f459a39ad5025713aecf6
But the log output ends up in the artemis-users.properties file:
amq-admin = ENC(2023-09-08 14:35:33,856 INFO [org.apache.activemq.artemis.core.server] AMQ221082: Initializing metrics plugin com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin with properties: {}
63657b38db1a65d197efd58a56af275c:b6a4fc05591317fd0bc95c7c5a3dbd6ab41764b93cd51aa31a8efb90c9f6ed712dc5d49a7cafc2d90f4e5bce30683aeb7b8160ae19791baf1edd8498bbab41d3)
amq-application-sa = ENC(2023-09-08 14:35:38,378 INFO [org.apache.activemq.artemis.core.server] AMQ221082: Initializing metrics plugin com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin with properties: {}
f42e63958b97a33141455b2265e04b25:b12c5cb6fbc0adf9d21027473a3d9c7eb16e8c98d6fc4d6c6cfcee53ba1faba47ac3855ef12af9fa82bd673b69d0fddb4ebb4d076e03f10e4bfca1dd0e29e09c)
amq-testers-sa = ENC(2023-09-08 14:35:42,274 INFO [org.apache.activemq.artemis.core.server] AMQ221082: Initializing metrics plugin com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin with properties: {}
c11e766e2c53940384a1000c908a9be8:238040adcb3a10d85abe2fc6dd136578cec1e8844a173fdde4ad9ed40e3d5df7f017bcb8f1bad4da177dd3b2b2dde5ff9d03500b7600c2375b17ad4b8eaaeebf)
The logging suggests there is a problem with ArtemisPrometheusMetricsPlugin. When I remove this from the broker.xml
<metrics>
<plugin class-name="com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin"/>
</metrics>
I get the mask result without any log output:
[root@amq1 bin]# ./artemis mask --password-codec --hash test
result: 5904b58f3a1475b931823395d439069c:2c12c60c736f6a6be94aed1172a67060bc887b7575eebdad54325df76995f9cbabcb220745059502a2f478e7dd4c26747c1b4bd484dc9e939c7fb49cd90263aa
No additional logging appears when running the artemis mask command. However, the previously generated artemis-users.properties still contains the contaminated entries:
amq-admin = ENC(2023-09-08 14:35:33,856 INFO [org.apache.activemq.artemis.core.server] AMQ221082: Initializing metrics plugin com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin with properties: {}
63657b38db1a65d197efd58a56af275c:b6a4fc05591317fd0bc95c7c5a3dbd6ab41764b93cd51aa31a8efb90c9f6ed712dc5d49a7cafc2d90f4e5bce30683aeb7b8160ae19791baf1edd8498bbab41d3)
amq-application-sa = ENC(2023-09-08 14:35:38,378 INFO [org.apache.activemq.artemis.core.server] AMQ221082: Initializing metrics plugin com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin with properties: {}
f42e63958b97a33141455b2265e04b25:b12c5cb6fbc0adf9d21027473a3d9c7eb16e8c98d6fc4d6c6cfcee53ba1faba47ac3855ef12af9fa82bd673b69d0fddb4ebb4d076e03f10e4bfca1dd0e29e09c)
amq-testers-sa = ENC(2023-09-08 14:35:42,274 INFO [org.apache.activemq.artemis.core.server] AMQ221082: Initializing metrics plugin com.redhat.amq.broker.core.server.metrics.plugins.ArtemisPrometheusMetricsPlugin with properties: {}
c11e766e2c53940384a1000c908a9be8:238040adcb3a10d85abe2fc6dd136578cec1e8844a173fdde4ad9ed40e3d5df7f017bcb8f1bad4da177dd3b2b2dde5ff9d03500b7600c2375b17ad4b8eaaeebf)
I would like to copy a trust store and key store from my local control machine to the target broker. Would this be in scope for this project? Thanks
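In the meantime, this can be done with a plain copy task. A minimal sketch, where the store file names, destination path, owner, and mode are hypothetical placeholders:

```yaml
# Sketch: push local trust/key stores to the broker host.
# Paths, owner, and mode are placeholders, not values from this collection.
- name: Copy trust store and key store to the broker
  ansible.builtin.copy:
    src: "{{ item }}"
    dest: /opt/amq/amq-broker/etc/
    owner: amq-broker
    group: amq-broker
    mode: '0640'
  loop:
    - files/truststore.p12
    - files/keystore.p12
```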
I would expect that the setting activemq_prometheus_enabled: True would be enough to include the Prometheus metrics plugin. However, this did not work and this step of the playbook was skipped. I also needed to set amq_broker_enable: True. I find the setting amq_broker_enable a bit unclear; what is the reason it is needed for the Prometheus plugin?
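For reference, the combination that made the step run for me looks like this in the inventory vars (variable names are taken from this issue; treat the pairing as observed behavior, not documented API):

```yaml
vars:
  amq_broker_enable: True            # without this, the Prometheus step was skipped
  activemq_prometheus_enabled: True  # enables the Prometheus metrics plugin
```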
The guide Using custom encrypted passwords needs a tester and technical writer. Experiment with and analyze the molecule test here, and use it to write the guide in markdown here.
Hi, thanks for the fixes. I was trying the latest version on the main branch. I ran the playbook again and it got a little further, but it failed at another task (I removed the no_log settings for this task):
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
fatal: [192.168.2.211]: FAILED! => {"msg": "Failed to get information on remote file (/opt/amq/amq-broker/etc/artemis-users.properties): Permission denied"}
fatal: [192.168.2.212]: FAILED! => {"msg": "Failed to get information on remote file (/opt/amq/amq-broker/etc/artemis-users.properties): Permission denied"}
These are the permissions for the etc folder:
[root@amq1 etc]# ll
total 44
-rw-r--r--. 1 amq-broker amq-broker 966 Sep 2 07:38 artemis-roles.properties
-rw-r--r--. 1 amq-broker amq-broker 1166 Sep 2 07:38 artemis-users.properties
-rw-r--r--. 1 amq-broker amq-broker 3101 Sep 2 07:38 artemis.profile
-rw-r--r--. 1 amq-broker amq-broker 1521 Sep 2 07:38 bootstrap.xml
-rw-r--r--. 1 amq-broker amq-broker 12150 Sep 2 07:38 broker.xml
-rw-r--r--. 1 amq-broker amq-broker 1316 Sep 2 07:38 jolokia-access.xml
-rw-r--r--. 1 amq-broker amq-broker 3259 Sep 2 07:38 logging.properties
-rw-r--r--. 1 amq-broker amq-broker 1086 Sep 2 07:38 login.config
-rw-r--r--. 1 amq-broker amq-broker 2364 Sep 2 07:38 management.xml
I am running Ansible with the default Ansible user. I don't know if this is how it is meant to be?
Currently, the shared storage for an HA-shared storage setup is stored in /opt/amq/amq-broker/data/shared.
I believe it would be best practice to store the data in a separate folder from the AMQ program files (/opt/amq/amq-broker).
This would make two things easier:
The current options to enable SSL do not enable SSL for connectors and acceptors in broker.xml; at present, the SSL settings apply only to the Web UI. It would be preferable for the playbook to also apply SSL settings to acceptors and connectors. Would this be possible in the future?
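For illustration, an SSL-enabled acceptor in broker.xml typically looks like the following sketch; the acceptor name, port, keystore path, and password placeholder are hypothetical:

```xml
<!-- Sketch: acceptor with TLS enabled; keystore path and password are placeholders -->
<acceptors>
  <acceptor name="artemis-ssl">tcp://0.0.0.0:61617?sslEnabled=true;keyStorePath=/opt/amq/amq-broker/etc/keystore.p12;keyStorePassword=CHANGE_ME</acceptor>
</acceptors>
```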