Distribute files using large global vars

We use a lot of custom scripts for our checks.
We need to update/distribute these scripts often (roughly once a day).
The network has ~3000 computers.

The idea: set one or more global variables on the master containing a base64 string:

/* xz-compressed tarball, converted with base64 -w0 */
const MyTarball = "foobarbaz"
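For completeness, the payload for such a variable could be produced on the master roughly like this (the `checks/` directory is a made-up example of where the plugin scripts live):

```shell
# Demo setup: a hypothetical directory of check scripts
mkdir -p checks
echo 'echo OK' > checks/demo.sh
# Pack, xz-compress, and base64-encode on one line so the result
# can be pasted into the config as a string constant:
tar -cf - checks/ | xz -9 | base64 -w0 > mytarball.b64
```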

After a short time, on every computer we can run:

$ icinga2 variable get MyTarball | base64 -d | xz -d > tarball.tar
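One thing worth adding on the node side: a truncated or missing variable would silently overwrite the scripts currently in use. A minimal sketch of a safer retrieval step (assuming the `icinga2 variable get` CLI is how the variable is read on your nodes; adjust if your agent exposes it differently):

```shell
# fetch_tarball: decode the MyTarball variable into tarball.tar,
# but only after some sanity checks.
fetch_tarball() {
    payload=$(icinga2 variable get MyTarball) || return 1
    # Refuse to proceed if the variable is empty or missing.
    [ -n "$payload" ] || { echo "MyTarball is empty" >&2; return 1; }
    # Decode into a temp file first so a truncated payload cannot
    # clobber the scripts currently in use.
    tmp=$(mktemp)
    printf '%s' "$payload" | base64 -d | xz -d > "$tmp" || return 1
    tar -tf "$tmp" >/dev/null || return 1   # verify the archive before use
    mv "$tmp" tarball.tar
}
```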

This works at a small scale: we tested with 16 kilobytes (which decompresses to 160 kilobytes).
We would need ~500 kilobytes for the content of a variable.
Are we running into limits?
Is this a bad idea?
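On the size estimate: base64 encodes every 3 input bytes as 4 output characters, so a ~500 KB variable only carries about 500 * 3/4 = 375 KB of compressed payload. A quick sanity check:

```shell
# 375 KB of raw data base64-encodes to ~500 KB of string content:
head -c 375000 /dev/urandom | base64 -w0 | wc -c   # prints roughly 500000
```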

Yes, this is what SCCM or a package manager is for. This looks like a classic "I have a hammer, and now everything looks like a nail" story :wink:

You are right; some background:

We are somewhat restricted on many nodes:
no Internet access and/or no direct SSH access,
but TCP 5665 works. So this was an idea to make things easier.

What about internal Icinga 2 limits on variable space or variable size?
What about network traffic: how often are global static vars transmitted?

I’m in the process of setting up a mirror to get automatic updates without Internet access and full control over the deployment. Besides the security updates, one of the repositories contains our Icinga check scripts.
This way the machines pull the new versions, and it runs over a mature mechanism with all the power of .deb and .rpm: install, uninstall, version pinning, etc.
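As a sketch of what that looks like on the nodes (the package name and schedule are made up, not from a real setup), all they need is a cron entry pulling from the internal mirror:

```shell
# /etc/cron.d/update-checks -- hypothetical package name:
# pull the newest check-script package from the internal mirror daily
0 6 * * * root apt-get update -qq && apt-get install -y --only-upgrade my-icinga-checks
```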

I can’t help you with your questions about internal Icinga 2 limitations, though.