Compliant Kubernetes Service documentation has moved

Please note: you are not reading Kubernetes documentation. If you are looking for the Compliant Kubernetes Service documentation, it has moved.

Extracting Messages from Elasticsearch

The official Elasticsearch guides provide the most detail on executing searches against Elasticsearch.

Using curl and jq, you can retrieve and filter syslog messages from Elasticsearch. The messages are stored as structured JSON documents.

Elasticsearch can be queried directly by making a request to https://podhostname/__es.

Authentication

The Logging dashboard requires authentication to access.

After the release of The Platform 2.1 on July 21, the only way to access your logging dashboard is with a session token from the Datica authentication server. Below is a sample script that generates a session token and then lets you supply an Elasticsearch URL as a query:

#!/bin/bash

USER='<user>'
PASSWORD='<password>'

URL=$1
ARG1=$2
ARG2=$3
ARG3=$4

# curl wrapper that adds the nonce, timestamp, and JSON headers the API expects.
corl () {
    nonce="X-Request-Nonce: $(head -c 32 /dev/urandom | base64)"
    ts="X-Request-Timestamp: $(date +%s)"
    curl "$@" -H "${nonce}" -H "${ts}" -H 'Content-Type: application/json' -H 'Accept: application/json'
}

# Inspect the signin response; if MFA is required, request an email OTP.
check_mfa () {
    mfa_status=1
    if echo "${RESPONSE}" | grep -q 'mfa'; then
        if echo "${RESPONSE}" | grep -q 'email'; then
            RESPONSE=$(corl 'https://auth.datica.com/auth/signin?mfaType=email' -XPOST -d "{\"identifier\": \"${USER}\", \"password\": \"${PASSWORD}\"}")
            mfa_id=$(echo "${RESPONSE}" | grep '"mfaID":' | awk '{ print $2 }' | sed 's#[",]##g')
            echo
            read -p "Check your email for MFA OTP. Enter OTP here: " otp
            echo
            mfa_status=0
        else
            echo -e "\nTo use MFA, you will also need to enable email MFA.\nhttps://product.datica.com/account"
            exit 1
        fi
    fi
}

RESPONSE=$(corl https://auth.datica.com/auth/signin -XPOST -d "{\"identifier\": \"${USER}\", \"password\": \"${PASSWORD}\"}")
check_mfa
if [[ "${mfa_status}" == 0 ]]; then
    RESPONSE=$(corl "https://auth.datica.com/auth/signin/mfa/${mfa_id}" -XPOST -d "{\"identifier\": \"${USER}\", \"password\": \"${PASSWORD}\", \"otp\": \"${otp}\"}")
fi
ENCODEDTOKEN=$(echo "${RESPONSE}" | python -c "import sys, json; print(json.load(sys.stdin)['sessionToken'])")

corl -H "Cookie: sessionToken=${ENCODEDTOKEN}" "${URL}" ${ARG1} ${ARG2} ${ARG3}

Be sure to make this script executable with chmod +x <script_name>. All the examples below assume this script is saved as esquery.

MFA Users

Regardless of your default MFA method, you must have email MFA enabled. The script will prompt you for an OTP (One Time Password). You will receive your OTP via email.

Structuring Your Query

The index and type of the documents are specified in the path of the URI.

The index name is in the format logstash-YYYY.MM.DD, for example logstash-2015.11.09.

The type is “syslog”
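When scripting daily queries, the index name for the current day can be derived with date (a small sketch; podhostname is the same placeholder used throughout):

```shell
# Build today's index name (UTC) in the logstash-YYYY.MM.DD format.
INDEX="logstash-$(date -u +%Y.%m.%d)"

# The resulting search path, using the placeholder hostname from the examples.
echo "https://podhostname/__es/${INDEX}/syslog/_search"
```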

To return all the records in Elasticsearch from 2015-11-09, the command would be:

./esquery https://podhostname/__es/logstash-2015.11.09/syslog/_search

The request will return a JSON document. You can pipe the results through jq to filter the results and only show the syslog message:

jq '.hits.hits[] | ._source | .syslog_message'

The full command would be:

./esquery https://podhostname/__es/logstash-2015.11.09/syslog/_search | jq '.hits.hits[] | ._source | .syslog_message'
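To see what the jq filter does in isolation, you can pipe a hand-built response through it. The two hits below are made up for illustration; a real response carries many more fields per document:

```shell
# A hypothetical Elasticsearch response holding two hits.
response='{"hits":{"hits":[{"_source":{"syslog_message":"app started"}},{"_source":{"syslog_message":"Error: connection refused"}}]}}'

# -r prints the raw string value of each syslog_message, one per line.
echo "${response}" | jq -r '.hits.hits[] | ._source | .syslog_message'
```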

Search parameters can be added to a search by including a json document in the request:

Make a file called es_params.json to store the parameters of the request:

{ "query" : { "match" : { "syslog_message" : "Error" } } }

Include the parameters in the request:

./esquery https://podhostname/__es/logstash-2015.11.09/syslog/_search -d @es_params.json | jq '.hits.hits[] | ._source | .syslog_message'
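The parameters file can also be created from the shell and sanity-checked with jq before it is sent; jq's -e flag exits non-zero if the input is not valid JSON:

```shell
# Write the match query to es_params.json.
cat > es_params.json <<'EOF'
{ "query" : { "match" : { "syslog_message" : "Error" } } }
EOF

# jq -e . fails if the file is not valid JSON.
jq -e . es_params.json > /dev/null && echo "es_params.json is valid JSON"
```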

The results from the request are paginated, and by default only 10 results are returned. To return more, add a size query parameter to the URI. Be aware that requesting too many results at once will significantly increase the memory usage of Elasticsearch and negatively impact performance.

./esquery 'https://podhostname/__es/logstash-2015.11.09/syslog/_search?size=2' -d @es_params.json | jq '.hits.hits[] | ._source | .syslog_message'
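Elasticsearch also accepts a from offset alongside size, so you can walk through a large result set page by page instead of raising size. The loop below is only a sketch that prints the commands it would run; the page size and offsets are illustrative:

```shell
SIZE=50
# Print the command for the first three pages: offsets 0, 50, and 100.
for FROM in 0 50 100; do
  echo "./esquery 'https://podhostname/__es/logstash-2015.11.09/syslog/_search?size=${SIZE}&from=${FROM}' -d @es_params.json"
done
```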