suse / doc-slesforsap
Official SUSE Linux Enterprise Server for SAP Applications Documentation
Home Page: https://documentation.suse.com/sles-sap/
A new version of Trento Premium is available (v0.8.1).
The installation process for the Agent hasn't changed, but the installation process for the server must be adjusted to accommodate the new version:
Please replace:
HELM_EXPERIMENTAL_OCI=1 helm upgrade --install \
TRENTO_SERVER_HOSTNAME \
oci://registry.suse.com/trento/trento-server \
--version 0.2.5 \
--set trento-runner.image.tag=0.7.1 \
--set trento-web.image.tag=0.7.1 \
--set-file trento-runner.privateKey=PRIVATE_SSH_KEY
with
HELM_EXPERIMENTAL_OCI=1 helm upgrade --install \
trento-server \
oci://registry.suse.com/trento/trento-server \
--version 0.3.5 \
--set-file trento-runner.privateKey=PRIVATE_SSH_KEY
In art-sol-automation for 15 SP2, there are the following broken links:
All of these are probably under docs.suse.com/sbp/ now.
Trento Server installation must be updated as follows:
Step 2 (installing as user root)
Replace
curl -sfL https://get.k3s.io | sh
with
curl -sfL https://get.k3s.io | INSTALL_K3S_SKIP_SELINUX_RPM=true sh
Step 2 (installing as non-root user)
Replace
curl -sfL https://get.k3s.io | sh -s - --write-kubeconfig-mode 644
with
curl -sfL https://get.k3s.io | INSTALL_K3S_SKIP_SELINUX_RPM=true sh -s - --write-kubeconfig-mode 644
This change prevents users running SLES 15 SP3 from running into an issue when installing K3s due to the missing container-selinux package.
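The reason this works: an assignment placed before the `sh` at the end of the pipe is exported into the environment of the piped installer script. A minimal stand-in script shows the mechanism without downloading anything:

```shell
# A variable assignment before `sh` is visible to the piped script --
# this is how the k3s installer sees INSTALL_K3S_SKIP_SELINUX_RPM.
# Stand-in script (instead of the real installer) to demonstrate it:
script='echo "skip selinux rpm: ${INSTALL_K3S_SKIP_SELINUX_RPM:-false}"'
printf '%s\n' "$script" | INSTALL_K3S_SKIP_SELINUX_RPM=true sh
```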
The section title DRBD/NFS Automated Configuration in the automation documentation page is not really accurate. The drbd-formula only configures DRBD, not NFS. In order to use NFS, we need to do things on top of it.
The thing is that SUSE recommends using DRBD with HA to have a highly available NFS share. For that, we configure the HA cluster on top of DRBD to create the NFS exports. But this is not anything the drbd-formula does.
Maybe the title is confusing.
If we want to document how to have highly available NFS with DRBD, we will need to add more things (explain that we need to use drbd-formula and habootstrap-formula, and after that apply a specific configuration to the cluster to set it up as NFS).
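If we end up documenting HA NFS on top of DRBD, the cluster side would look roughly like the sketch below. This is only a hypothetical crmsh illustration, not output of any formula: the resource names, DRBD resource name, device, directory, export options and IP address are all placeholders to adapt to the environment.

```
# Hypothetical crmsh sketch: NFS export on top of a DRBD master/slave resource.
primitive rsc_drbd_nfs ocf:linbit:drbd \
  params drbd_resource=nfs \
  op monitor interval=15 role=Master
ms msl_drbd_nfs rsc_drbd_nfs \
  meta master-max=1 clone-max=2 notify=true
primitive rsc_fs_nfs ocf:heartbeat:Filesystem \
  params device=/dev/drbd0 directory=/srv/nfs fstype=xfs
primitive rsc_exportfs_nfs ocf:heartbeat:exportfs \
  params directory=/srv/nfs fsid=1 options=rw clientspec="*"
primitive rsc_ip_nfs ocf:heartbeat:IPaddr2 \
  params ip=192.168.1.20
group grp_nfs rsc_fs_nfs rsc_exportfs_nfs rsc_ip_nfs
colocation col_nfs inf: grp_nfs msl_drbd_nfs:Master
order ord_nfs inf: msl_drbd_nfs:promote grp_nfs:start
```

The point is that the filesystem mount, the export and the service IP run as a group wherever DRBD is promoted to Master, which is exactly the part the drbd-formula does not do.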
Hello,
In order to improve the documentation of the automation page, we should add a new section explaining how to use the formulas in a generic way. I have the following in mind.
Create a new section after all of the formulas are listed, something like How to use the formulas.
This section would have the following information:
The salt formulas can be used with two different approaches: salt master/minion, or salt minion-only execution.
For the master/minion approach, all of the following steps must be executed on the master machine. If the minion-only option is used, the steps must be executed on every minion where the formulas are going to be executed.
Install the formulas
Install the formulas that are going to be applied, using zypper.
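As a sketch, the installation step could look like the following. The package names are the formula packages SUSE ships for HANA, NetWeaver, DRBD and HA, but they should be treated as an assumption here (verify the exact names with `zypper search formula`); the command is guarded so the sketch is harmless on systems without zypper:

```shell
# Formula packages to install; verify the exact names with `zypper search formula`.
FORMULAS="saphanabootstrap-formula sapnwbootstrap-formula drbd-formula habootstrap-formula"

if command -v zypper >/dev/null 2>&1; then
    # Intentional word splitting of the package list.
    sudo zypper --non-interactive install $FORMULAS
else
    echo "zypper not found; run this on the salt master (or on each minion)"
fi
```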
Create the pillar files structure
The pillar files work as configuration files for the salt formulas; they are the input the user has to provide. In order to use the files, the following structure is the most appropriate one (this is only an example):
a. Create the /srv/pillar folder (this might already be in place).
b. Create a top.sls file in /srv/pillar with the following content. This will apply the hana, netweaver, drbd and ha formulas to the nodes specified by their hostname.
base:
  'hana01,hana02':
    - hana.hana
    - hana.cluster
  'netweaver01,netweaver02':
    - netweaver.netweaver
    - netweaver.cluster
  'drbd01,drbd02':
    - drbd.drbd
    - drbd.cluster
c. Create the folders /srv/pillar/hana, /srv/pillar/netweaver and /srv/pillar/drbd for the pillar files (notice that these names match the names used in the top.sls file).
d. Add the pillar files to the correct folder: the hana folder will have hana.sls and cluster.sls, the netweaver folder will have netweaver.sls and cluster.sls, and the drbd folder will have drbd.sls and cluster.sls.
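The folder and file layout from steps a. to d. can be created with a few shell commands. The sketch below uses a temporary root so it is safe to run anywhere; on a real salt master set ROOT to the empty string so the files land under /srv/pillar:

```shell
# Temporary root for a safe dry run; set ROOT="" on a real salt master.
ROOT="${ROOT:-$(mktemp -d)}"

# One folder per formula, matching the names used in the pillar top.sls.
for formula in hana netweaver drbd; do
    mkdir -p "$ROOT/srv/pillar/$formula"
done

# Empty pillar files to be filled with the formula configuration.
touch "$ROOT/srv/pillar/top.sls"
touch "$ROOT/srv/pillar/hana/hana.sls"           "$ROOT/srv/pillar/hana/cluster.sls"
touch "$ROOT/srv/pillar/netweaver/netweaver.sls" "$ROOT/srv/pillar/netweaver/cluster.sls"
touch "$ROOT/srv/pillar/drbd/drbd.sls"           "$ROOT/srv/pillar/drbd/cluster.sls"

find "$ROOT/srv/pillar" | sort
```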
Find here an example of the contents you can use for each of the formulas:
hana.sls
netweaver.sls
drbd.sls
cluster.sls
The content of the pillar files must be configured depending on the needed configuration of each of the formulas (notice that in this example, three different cluster.sls files are used, and each of them might be different, as the HA cluster requirements might be different too).
e. Create a top.sls file in the /srv/salt folder. It follows the same rules as the pillar top.sls file. Here is an example (in this case the folder part is removed, as we point to the formulas by their name):
base:
  'hana01,hana02':
    - hana
    - cluster
  'netweaver01,netweaver02':
    - netweaver
    - cluster
  'drbd01,drbd02':
    - drbd
    - cluster
Execute the formulas
a. For the master/minion approach run:
salt '*' state.highstate
b. For the minion-only approach run:
salt-call --local state.highstate
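Before running the highstate, it can help to check that the pillar data renders as expected. pillar.items is a standard salt execution module function; the sketch below is guarded so it degrades gracefully on machines where salt is not installed:

```shell
# Inspect the rendered pillar data before applying the highstate.
if command -v salt >/dev/null 2>&1; then
    MODE=master
    salt '*' pillar.items || true          # master/minion approach
elif command -v salt-call >/dev/null 2>&1; then
    MODE=minion
    salt-call --local pillar.items || true # minion-only approach
else
    MODE=none
    echo "salt not installed on this machine"
fi
```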
Edit (toms):
Related to bsc#1174530
The update sections in the Trento Premium documentation still reflect the update process for the community / open-source version. They must be updated and aligned with the Trento Premium installation process.
In particular:
export KUBECONFIG=/etc/rancher/k3s/k3s.yaml
HELM_EXPERIMENTAL_OCI=1 helm upgrade --install trento-server oci://registry.suse.com/trento/trento-server --version 0.3.5 --set-file trento-runner.privateKey=PRIVATE_SSH_KEY