Analyse WSO2 Identity Server event logs with ELK (Part 1)

Avarjana Panditha
3 min readJun 6, 2022

WSO2 Identity Server (IS) provides various ways to access its events through its event publishers. Apart from the existing WSO2 IS Analytics solution, users can create their own analytics solutions using the event publishers offered by WSO2 IS. I am going to guide you through the process of configuring WSO2 IS to publish its events to a locally installed Elastic Stack.

I have discussed the HTTP publisher based approach in this article (ELK v7.X). However, an HTTP based approach has certain drawbacks, such as losing data during downtime. The approach discussed in this article is to ship the wso2carbon log files to ELK with Filebeat and visualise them.

You need to have an up-and-running ELK v8.X stack to follow this tutorial series.

I will continue this as a series of three articles covering the following.

  • Part 1: Configuring WSO2 IS and Filebeat to collect and ship logs to Logstash.
  • Part 2: Logstash filters to further process and reshape the logs and send to Elasticsearch indices.
  • Part 3: Visualising data with Kibana.

Configuring WSO2 Identity Server (6.X.X) to publish login and session events to wso2carbon.log file

Get the latest IS pack from WSO2 IS releases and extract it. There is a new configuration to enable data publishing for ELK analytics. Open <IS_HOME>/repository/conf/deployment.toml and add the following configuration.

[analytics.elk]
enable=true

This config enables the event publishers for authentication and session events via the wso2carbon.log file located at <IS_HOME>/repository/logs.

Run the pack as usual and log in to MyAccount (https://localhost:9443/myaccount) using admin credentials. Now check the terminal to verify that two authentication events and a session event are printed there.

[xxxx-xx-xx xx:xx:xx,xxx] [xxxxxx-xxx-xxx-xxx-xxxxx]  INFO {org.wso2.carbon.event.output.adapter.logger.LoggerEventAdapter} - Unique ID: auth
[xxxx-xx-xx xx:xx:xx,xxx] [xxxxxx-xxx-xxx-xxx-xxxxx]  INFO {org.wso2.carbon.event.output.adapter.logger.LoggerEventAdapter} - Unique ID: auth
[xxxx-xx-xx xx:xx:xx,xxx] [xxxxxx-xxx-xxx-xxx-xxxxx]  INFO {org.wso2.carbon.event.output.adapter.logger.LoggerEventAdapter} - Unique ID: session
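As a quick sanity check, the Unique ID tag at the end of each line tells you which event publisher fired. A minimal Python sketch of pulling it out (the log line below is illustrative, not copied from a real run):

```python
import re

# Illustrative carbon log line; real lines carry actual timestamps and IDs.
line = ("[2022-06-06 10:15:30,123] [a1b2c3-d4e5] INFO "
        "{org.wso2.carbon.event.output.adapter.logger.LoggerEventAdapter} "
        "- Unique ID: auth")

# The publisher name after "Unique ID:" identifies the event type.
match = re.search(r"Unique ID:\s*(\w+)", line)
print(match.group(1))  # -> auth
```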

Congratulations! You have completed the first step toward the great bond of WSO2 IS and ELK :)

Configuring Filebeat to capture the wso2carbon.log files

Once the installation of Filebeat is complete, replace the content of the filebeat.yml file with the following.

filebeat.inputs:
- type: filestream
  enabled: true
  parsers:
    - multiline:
        type: pattern
        pattern: '^[[:space:]]Event:'
        negate: false
        match: after
  include_lines: ['Event:']
  paths:
    - <IS_HOME>/repository/logs/wso2carbon*.log
filebeat.registry.path: /var/lib/filebeat/registry
output.logstash:
  hosts: ["localhost:5044"]

The first three lines of this configuration are pretty straightforward, so I will explain from there onwards.

parsers:
  - multiline:
      type: pattern
      pattern: '^[[:space:]]Event:'
      negate: false
      match: after
include_lines: ['Event:']

WSO2 IS event log entries span multiple lines. The first line contains the log date and time along with the Unique ID tag, which we use to identify which event publisher triggered the log. The second line starts with a space and the keyword Event:, marking the start of the event data, which follows in JSON format.

[xxxx-xx-xx xx:xx:xx,xxx] [xxxxxx-xxx-xxx-xxx-xxxxx]  INFO {org.wso2.carbon.event.output.adapter.logger.LoggerEventAdapter} - Unique ID: auth
Event: {<JSON_EVENT_DATA>}

The multiline parser is used to capture this log content as a single event. Of everything in the carbon log file, only these auth and session event entries are relevant to ELK. Therefore, the include_lines option is used to capture those lines only.
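The grouping logic can be sketched in a few lines of Python. This is only an illustration of what the multiline and include_lines settings do, not Filebeat's actual implementation, and the sample log content is made up:

```python
import re

# Two illustrative carbon log entries, each spanning two lines.
raw = """[2022-06-06 10:15:30,123] [a1b2c3] INFO {LoggerEventAdapter} - Unique ID: auth
 Event: {"eventType": "authentication"}
[2022-06-06 10:15:31,456] [d4e5f6] INFO {LoggerEventAdapter} - Unique ID: session
 Event: {"eventType": "session"}"""

# Continuation lines match '^[[:space:]]Event:' (negate: false, match: after),
# so each one is appended to the line before it, forming one event.
events = []
for line in raw.splitlines():
    if re.match(r"^\s+Event:", line):
        events[-1] += "\n" + line
    else:
        events.append(line)

# include_lines then keeps only events containing the keyword "Event:".
events = [e for e in events if "Event:" in e]
print(len(events))  # -> 2
```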

paths:
- <IS_HOME>/repository/logs/wso2carbon*.log

Carbon log files are backed up daily with the format wso2carbon-{DATE}.log. Therefore, the path for Filebeat is given a wildcard value so both the live file and the backups are picked up.
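For illustration, here is how that wildcard matches both the live file and a dated backup while skipping other logs in the directory (the file names below are hypothetical):

```python
import fnmatch

# Hypothetical contents of <IS_HOME>/repository/logs.
files = ["wso2carbon.log", "wso2carbon-10-06-2022.log", "wso2-audit.log"]

# Same glob pattern as in the Filebeat paths setting.
matched = [f for f in files if fnmatch.fnmatch(f, "wso2carbon*.log")]
print(matched)  # -> ['wso2carbon.log', 'wso2carbon-10-06-2022.log']
```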

Now you’re ready to ship WSO2 Identity Server logs to Logstash, where further processing will take place. See you in part 2 of this series soon.
