Dockerized ELK Stack (Windows 11 + WSL2 + Carbon Black Cloud Data From S3) [2025]
Introduction
Today, I’ll guide you through setting up the ELK stack on your machine using Docker.
This tutorial assumes you already have an AWS S3 bucket containing Carbon Black data.
For confidentiality reasons, some information has been redacted, and you’ll need to provide your own values where applicable.
Step 1: Install WSL2 (Ubuntu 20.04 Distro)
Follow the official Windows WSL Installation Docs (you may need to Update WSL2 Kernel too).
# Some useful commands
# Check WSL versions
> wsl -l -v
# Set WSL Version
> wsl --set-version Ubuntu-20.04 2
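If you haven't installed the Ubuntu 20.04 distro yet, recent WSL builds let you install it and set the default version straight from an elevated PowerShell prompt (a minimal sketch, assuming your WSL build supports the --install flag):
# Install the Ubuntu 20.04 distro
> wsl --install -d Ubuntu-20.04
# Make WSL2 the default version for new distros
> wsl --set-default-version 2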
Step 2: Add Packages to WSL2 (“jq”, “git”, and “curl”)
# NOTE: If you hit errors relating to the "jq" package, you can move on at your
# own risk, but this may lead to problems down the road.
$ sudo apt-get update
$ sudo apt-get install jq git curl
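A quick way to confirm the packages landed correctly before moving on is to check their versions:
# Sanity check: each command should print a version string
$ jq --version
$ git --version
$ curl --version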
Step 3: Install Docker
Install Docker inside WSL2 by following the Linux Docker Engine Installation Docs.
Don’t forget to start the docker daemon!
# Start Docker Daemon
sudo service docker start
# Sanity check while you're at it
sudo docker run hello-world
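Optionally, if you'd rather not prefix every docker command with sudo, you can add your user to the docker group (a general Docker convenience step, not specific to this tutorial; you'll need a new shell for it to take effect):
# [Optional] Allow running docker without sudo
$ sudo usermod -aG docker $USER
# Start a new shell (or run "newgrp docker"), then verify
$ docker run hello-world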
Step 4: Setup the Dockerized ELK Stack
We will use this existing GitHub repository, modify the .env file, and spin up dockerized instances of Elasticsearch, Kibana, and a Fleet Server.
# 1. Clone the repo
$ git clone https://github.com/peasead/elastic-container.git elastic-container
$ cd elastic-container
# 2. Manually modify the following environment variables in the .env file:
#
# # Set both passwords to something other than the default "changeme"
# ELASTIC_PASSWORD="changeme"
# KIBANA_PASSWORD="changeme"
#
# # I can only verify this version
# STACK_VERSION="8.12.2"
#
# 3. [Optional] Destroy previous instances
$ sudo ./elastic-container.sh destroy
# 4. Run the ELK stack
$ sudo ./elastic-container.sh start
Here is the example output after running “sudo ./elastic-container.sh start” and then listing the docker containers:
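If you want to list the containers yourself, docker ps works; the exact container names depend on the repository's compose file, but you should see Elasticsearch, Kibana, and Fleet Server containers with an "Up" status:
# List running containers with their images and status
$ sudo docker ps --format "table {{.Names}}\t{{.Image}}\t{{.Status}}"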
Step 5: Log into Kibana
- Open a browser
- Go to https://localhost:5601/ (notice the “https”)
- The username is “elastic” and the password is the ELASTIC_PASSWORD from the .env file modified in “Step 4: Setup the Dockerized ELK Stack”
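You can also sanity-check the stack from WSL2 with curl before opening the browser. This assumes Elasticsearch is exposed on port 9200 (as the repository's compose file does by default); the -k flag is needed because the stack uses self-signed certificates:
# Quick health check against Elasticsearch (replace <ELASTIC_PASSWORD> with your .env value)
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/_cluster/health?pretty"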
Step 6: Setup the Carbon Black Cloud Integration
(1) Go to Management > Integrations and click on VMware Carbon Black Cloud:
(2) Click on Add VMware Carbon Black Cloud:
(3) Configure the integration settings:
- Disable Collect Carbon Black Cloud logs via API using HTTP JSON [Legacy]
- Disable Collect Carbon Black Cloud logs via API using CEL [Beta]
- ENABLE Collect Carbon Black Cloud logs via AWS S3 or AWS SQS
(4) Click on “Change Defaults” for Collect Carbon Black Cloud logs via AWS S3 or AWS SQS to see more configuration options:
- Enable Collect logs via S3 Bucket
- Set [S3] Bucket ARN to [REDACTED]
- Set Access Key ID to [REDACTED]
- Set Secret Access Key to [REDACTED]
(5) For Collect endpoint events from Carbon Black Cloud, update the [S3] Bucket Prefix to [REDACTED]
(6) For Collect watchlist hits from Carbon Black Cloud, update the [S3] Bucket Prefix to [REDACTED]
(7) Click Save and continue
- Remember the [New agent policy name]
Step 7: Setup Elastic Agent
(1) Go to Management > Fleet
(2) Click Add agent (a side drawer should appear on the right side of your screen)
(3) For the “What type of host are you adding?” section, select the [New agent policy name] you remembered from “Step 6: Setup the Carbon Black Cloud Integration”.
Note: I renamed my agent policy to “VMware CBC AP”
(4) For the “Enroll in Fleet?” section, select “Enroll in Fleet (recommended)”
(5) For the “Install Elastic Agent on your host” section
First, copy the enrollment token from the commands provided by Kibana.
Then, run the following commands in WSL2 (being sure to replace <enrollment token> with the token you copied from Kibana):
# Download the elastic-agent
$ curl -L -O https://artifacts.elastic.co/downloads/beats/elastic-agent/elastic-agent-8.12.2-linux-x86_64.tar.gz
# Extract the elastic agent
$ tar xzvf elastic-agent-8.12.2-linux-x86_64.tar.gz
# Install and enroll the agent (replace the --url value with your own Fleet Server address)
$ cd elastic-agent-8.12.2-linux-x86_64
$ sudo ./elastic-agent install --url=https://172.26.181.66:8220 --enrollment-token=<enrollment token> -f --insecure
(6) Now wait for the remaining sections “Confirm agent enrollment” and “Confirm incoming data” to show green check marks (see the quick status check below if they don't).
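If the check marks don't turn green, you can inspect the agent from WSL2; the install step puts the elastic-agent binary on your PATH:
# Check the agent's health and its connection to Fleet
$ sudo elastic-agent status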
Step 8: Verify Carbon Black Cloud Data
(1) Go to Analytics > Discover
(2) Apply the following filters to see the Carbon Black Cloud data
- Filter by “logs-” with a wildcard suffix (“logs-*”).
- Change the time window Start date to Apr 1, 2024
- Change the time window End date to May 31, 2024
Note: This step is data dependent; your filters will vary (the curl check below is another quick way to confirm data landed)
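If Discover comes up empty, you can also confirm documents are arriving via the Elasticsearch count API from WSL2 (again assuming port 9200 is exposed; replace <ELASTIC_PASSWORD> with your own value):
# Count documents across the logs-* data streams
$ curl -k -u elastic:<ELASTIC_PASSWORD> "https://localhost:9200/logs-*/_count?pretty"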
Next Steps: Kibana Local Plugin Development
Stay tuned for the next tutorial, where I will show you how to use this local dockerized ELK stack for Kibana plugin development!