When run against a standard diagnostic bundle, it will re-archive the contents with scrubbed- prepended to the name. All files and directories will be enclosed within a new archive.
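A minimal sketch of that naming convention (the bundle name below is invented for illustration):

```shell
# The scrubbed copy of a bundle is written as a new archive with
# "scrubbed-" prepended to the original name (name is illustrative).
original="local-diagnostics-2024-01-01.zip"
scrubbed="scrubbed-${original}"
echo "$scrubbed"   # prints scrubbed-local-diagnostics-2024-01-01.zip
```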
Unzip the downloaded file into the directory you want to run from. This can be on the same host as the Elasticsearch, Kibana, or Logstash host you wish to interrogate, or on a remote server or workstation.
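A typical setup sequence, assuming a placeholder archive name (substitute the file you actually downloaded):

```shell
# Archive name and destination are placeholders for this sketch.
ARCHIVE="diagnostics-dist.zip"
DEST="$HOME/diagnostics"
if [ -f "$ARCHIVE" ]; then
  unzip -q "$ARCHIVE" -d "$DEST"
  echo "extracted to $DEST"
else
  echo "place $ARCHIVE in the current directory first"
fi
```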
Before you start, ensure that your server meets the minimum requirements for Elasticsearch: 4GB of RAM and 2 CPUs are recommended. Not meeting these requirements could lead to your instance being killed prematurely when the server runs out of memory.
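A quick way to sanity-check a Linux host against those recommendations (falls back to zero where /proc/meminfo is unavailable):

```shell
# Read total memory and CPU count; values default to 0/1 off Linux.
mem_kb=$(grep MemTotal /proc/meminfo 2>/dev/null | awk '{print $2}')
mem_gb=$(( ${mem_kb:-0} / 1024 / 1024 ))
cpus=$(nproc 2>/dev/null || echo 1)
echo "detected ${mem_gb}GB RAM, ${cpus} CPU(s)"
if [ "$mem_gb" -ge 4 ] && [ "$cpus" -ge 2 ]; then
  echo "meets the recommended minimums"
else
  echo "below the recommended 4GB RAM / 2 CPUs"
fi
```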
At that point you can interact with the diagnostic in the same way as you would if it were installed directly on the host. If you look in the /docker
You will need to provide credentials to establish an ssh session to the host containing the target Elasticsearch node, but it will collect the same artifacts as the local type.
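A sketch of a remote-type invocation; the host, user, and the exact spelling of the ssh credential flags are assumptions here, so confirm them against the utility's help output before running:

```shell
# Illustrative command only: flag spellings vary by version.
CMD="./diagnostics.sh --type remote --host 10.0.0.5 --remoteUser elastic_admin"
echo "$CMD"
```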
If errors occur when attempting to collect diagnostics from Elasticsearch nodes, Kibana, or Logstash processes running within Docker containers, consider running with --type set to api, logstash-api, or kibana-api to verify that the configuration is not causing issues with the system call or log collection modules in the diagnostic. This should allow the REST API subset to be successfully collected.
Running the api type to suppress system call and log collection, and explicitly configuring an output directory.
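For example (host, port, output path, and the output flag spelling are placeholders; the guard simply keeps the sketch harmless outside the diagnostic directory):

```shell
# api type: REST endpoints only, no system calls or log collection.
if [ -x ./diagnostics.sh ]; then
  ./diagnostics.sh --type api --host localhost --port 9200 -o /tmp/diag-output
  ran=yes
else
  echo "run this from the unzipped diagnostic distribution"
  ran=no
fi
```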
The hostname or IP address of the host in the proxy url. This should NOT be in the form of a URL containing http:// or https://.
You can also run it from within a Docker container (see instructions further down for creating an image).
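A containerized run might look like the sketch below; the image name "support-diagnostics" is a placeholder for whatever tag you build from those instructions, and the bind mount keeps the collected output on the host:

```shell
# Image name and mount paths are assumptions for this sketch.
if command -v docker >/dev/null 2>&1; then
  docker run --rm -v "$PWD/output":/diagnostic-output support-diagnostics \
    || echo "adjust the image name to the tag you built"
  docker_checked=yes
else
  echo "docker is not installed here"
  docker_checked=yes
fi
```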
This utility allows you to extract a subset of monitoring data for intervals of up to 12 hours at a time. It will package this into a zip file, much like the current diagnostic. After it is uploaded, a support engineer can import that data into their own monitoring cluster so it can be investigated outside of a screen share, and be easily viewed by other engineers and developers.
After it has checked for IP and MAC addresses, it will use any configured tokens. If you include a configuration file of supplied string tokens, any occurrence of a token will be replaced with a generated substitute.
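The token pass behaves much like a global substitution; a sketch with `sed`, where the token, the input line, and the generated substitute are all invented for the demo:

```shell
# Every occurrence of a supplied token is swapped for a substitute value.
token="prod-cluster-01"
substitute="string-token-0001"
echo "node on prod-cluster-01 restarted" | sed "s/${token}/${substitute}/g"
# prints: node on string-token-0001 restarted
```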
See the specific documentation for more details on those type options. It will also collect logs from the node on the targeted host unless it is in REST API only mode.
Generates an obfuscated prompt for the elasticsearch password. Passing a plain-text password for automated processes is possible but not recommended, given that it cannot be concealed from the history.
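The prompt behaves much like a silent `read`: the secret never appears on the command line, so it cannot land in shell history or process listings. A small demo of that idea, with input piped in purely for illustration:

```shell
# In interactive use this would be `read -s`; the demo pipes a fake value.
printf 'example-pass\n' | {
  read -r pw
  echo "read a ${#pw}-character password without exposing it"
}
```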
Make sure you have a valid Java installation and that the JAVA_HOME environment variable is pointing to it.
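A quick way to verify this before launching the utility:

```shell
# Confirm JAVA_HOME is set and actually contains bin/java.
if [ -n "$JAVA_HOME" ] && [ -x "$JAVA_HOME/bin/java" ]; then
  "$JAVA_HOME/bin/java" -version
  java_ok=yes
else
  echo "JAVA_HOME is unset or does not point to a Java installation" >&2
  java_ok=no
fi
```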