Compliant Kubernetes Service documentation has moved

Please note: You are not reading Kubernetes documentation. If you're looking for Compliant Kubernetes Service documentation, it has moved.

Extracting Messages from Elasticsearch

The official Elasticsearch guides provide the most detail on executing searches against Elasticsearch.

Using curl and jq, you can retrieve and filter syslog messages from Elasticsearch. The messages are stored as structured data in JSON format.

Elasticsearch can be queried directly by making a request to https://podhostname/__es.

The Logging dashboard requires authentication to access.

After the release of The Platform 2.1 on July 21, the only way to access your logging dashboard is with a session token from the Datica authentication server. Below is a sample script that automatically generates a session token and then lets you supply an Elasticsearch URL as a query:


#!/bin/bash

# Wrapper around curl that adds the request headers the platform requires.
corl () {
    nonce="X-Request-Nonce: $(python -c 'import base64, os; print base64.b64encode(os.urandom(32))')"
    ts="X-Request-Timestamp: $(python -c 'import time; print int(time.time())')"
    curl "$@" -H "${nonce}" -H "${ts}" -H 'Content-Type: application/json' -H 'Accept: application/json'
}

# Positional arguments: the Elasticsearch URL, then any extra curl arguments.
URL="$1"
ARG1="$2"
ARG2="$3"
ARG3="$4"

# Authenticate against the Datica authentication server (fill in your own
# identifier and password; the authentication endpoint URL goes on this line).
export RESPONSE=$(corl -XPOST -d '{"identifier": "", "password": "test123456"}')
ENCODEDTOKEN=$(python -c "import urllib; import os; import json; print urllib.quote(json.loads(os.environ['RESPONSE'])['sessionToken'])")

corl -H "Cookie: sessionToken=${ENCODEDTOKEN}" "${URL}" ${ARG1} ${ARG2} ${ARG3}

Be sure to make this script executable with chmod +x <script_name>. All of the examples below assume this script is saved as esquery.

Structuring Your Query

The index and type of the documents are specified in the path of the URI.

The index name is in the format logstash-YYYY.MM.DD, for example logstash-2015.11.09.

The type is "syslog".

To return all the records in Elasticsearch from 2015-11-09, the command would be:

./esquery https://podhostname/__es/logstash-2015.11.09/syslog/_search
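Because the index name embeds the date, the search URL for an arbitrary day is easy to build in a script. A minimal Python sketch, where the podhostname placeholder and the path layout are taken from the example above:

```python
from datetime import date

def search_url(day, host="podhostname"):
    # Daily index name in the logstash-YYYY.MM.DD format
    index = day.strftime("logstash-%Y.%m.%d")
    # Path layout: /__es/<index>/<type>/_search, with type "syslog"
    return "https://{0}/__es/{1}/syslog/_search".format(host, index)

print(search_url(date(2015, 11, 9)))
```

The resulting URL can be passed directly to esquery.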

The request returns a JSON document. You can pipe it through jq to show only the syslog message:

jq '.hits.hits[] | ._source | .syslog_message'

The full command would be:

./esquery https://podhostname/__es/logstash-2015.11.09/syslog/_search | jq '.hits.hits[] | ._source | .syslog_message'
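If jq is not available, the same filter can be expressed in a few lines of Python. The response below is a trimmed, illustrative example of the shape Elasticsearch returns; real responses carry additional metadata:

```python
import json

# Trimmed example of a _search response (illustrative, not a real capture)
response = '''
{"hits": {"hits": [
  {"_source": {"syslog_message": "Error: disk full"}},
  {"_source": {"syslog_message": "Service restarted"}}
]}}
'''

# Equivalent of: jq '.hits.hits[] | ._source | .syslog_message'
messages = [hit["_source"]["syslog_message"]
            for hit in json.loads(response)["hits"]["hits"]]
print(messages)
```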

Search parameters can be added by including a JSON document in the request body.

Make a file called es_params.json to store the parameters of the request:

{ "query" : { "match" : { "syslog_message" : "Error" } } }

Include the parameters in the request:

./esquery https://podhostname/__es/logstash-2015.11.09/syslog/_search -d @es_params.json | jq '.hits.hits[] | ._source | .syslog_message'
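Rather than writing es_params.json by hand, the query body can be generated with Python's json module. A small sketch that mirrors the parameter file shown above (the match_query helper is hypothetical, not part of the esquery script):

```python
import json

def match_query(field, value):
    # Same structure as the es_params.json file above
    return {"query": {"match": {field: value}}}

# Serialize for use as a request body, or write it out as es_params.json
body = json.dumps(match_query("syslog_message", "Error"))
print(body)
```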

The results of the request are paginated, and by default only 10 results are shown.

To change this, add a size query parameter to the URI. Be aware that requesting too many results at once will significantly increase the memory usage of Elasticsearch and negatively impact performance:

./esquery https://podhostname/__es/logstash-2015.11.09/syslog/_search?size=2 -d @es_params.json | jq '.hits.hits[] | ._source | .syslog_message'
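If you generate URLs in a script, appending the size parameter needs to account for whether the URL already carries a query string. A small hypothetical helper (not part of the esquery script):

```python
def with_size(url, size):
    # Use '&' if the URL already has a query string, '?' otherwise
    sep = "&" if "?" in url else "?"
    return "{0}{1}size={2}".format(url, sep, size)

print(with_size("https://podhostname/__es/logstash-2015.11.09/syslog/_search", 2))
```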