Run the latest version of the ELK (Elasticsearch, Logstash, Kibana) stack with Docker and Docker Compose.
**Note**: This version has [X-Pack support](https://www.elastic.co/products/x-pack).
It will give you the ability to analyze any data set by using the searching/aggregation capabilities of Elasticsearch and the visualization power of Kibana.
You need to increase `max_map_count` on your Docker host:
```bash
$ sudo sysctl -w vm.max_map_count=262144
```
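To make this setting persistent across reboots, you can also write it to a sysctl configuration file (a sketch; the file name under `/etc/sysctl.d/` is arbitrary):

```bash
# persist the setting and reload all sysctl configuration files
$ echo 'vm.max_map_count=262144' | sudo tee /etc/sysctl.d/99-elasticsearch.conf
$ sudo sysctl --system
```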
## SELinux
On distributions which have SELinux enabled out-of-the-box, you will need to either re-context the files or set SELinux to Permissive mode in order for docker-elk to start properly.
For example, on Red Hat and CentOS, the following will apply the proper context:
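A minimal sketch, assuming the repository was cloned into a `docker-elk/` directory (adjust the path to match your checkout):

```bash
# recursively apply a context that the Docker daemon is allowed to read
$ chcon -R system_u:object_r:admin_home_t:s0 docker-elk/
```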
## Windows

When cloning this repo on Windows with line ending conversion enabled (git option `core.autocrlf` set to `true`), the script `kibana/entrypoint.sh` will malfunction because of a corrupted shebang line, which must be terminated by `LF` only, not `CR+LF`:
```bash
...
Creating dockerelk_kibana_1
Attaching to dockerelk_elasticsearch_1, dockerelk_logstash_1, dockerelk_kibana_1
: No such file or directory/usr/bin/env: bash
```
So you have to either:
* disable line ending conversion *before* cloning the repository by setting `core.autocrlf` to `false` (`git config core.autocrlf false`), or
* convert the line endings in script `kibana/entrypoint.sh` from `CR+LF` to `LF` (e.g. using Notepad++), as sketched below.
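For reference, both options from the command line (a sketch; `dos2unix` is an assumption, any tool that rewrites `CR+LF` as `LF` will do):

```bash
# option 1: disable line ending conversion globally before cloning
$ git config --global core.autocrlf false

# option 2: fix the already-cloned script in place
$ dos2unix kibana/entrypoint.sh
```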
...

Now that the stack is running, you'll want to inject logs into it. The shipped Logstash configuration allows you to send content via TCP:
```bash
$ nc localhost 5000 < /path/to/logfile.log
```
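To quickly check that the documents made it into Elasticsearch, you can hit the count API on the exposed port 9200 (a sketch, assuming the default X-Pack credentials below and the standard `logstash-*` index naming):

```bash
# count the documents indexed by Logstash
$ curl -u elastic:changeme 'http://localhost:9200/logstash-*/_count?pretty'
```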
And then access the Kibana UI by hitting [http://localhost:5601](http://localhost:5601) with a web browser and use the following default credentials to log in:

* user: *elastic*
* password: *changeme*
*NOTE*: You'll need to inject data into Logstash before being able to create a Logstash index in Kibana. Then all you should have to do is hit the *Create* button.
*NOTE*: In order to use Sense, you'll need to query the IP address associated with your *network device* instead of localhost.
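For example, a quick way to look up that address on a Linux host (a sketch; the interface name `eth0` is an assumption, substitute your own):

```bash
# show the IPv4 address assigned to the network device
$ ip addr show eth0 | grep 'inet '
```

You would then point Sense at `http://<that address>:9200` rather than `http://localhost:9200`.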