
Logging with Elasticsearch & Enterprise Library

25 Feb 2019 · CPOL · 2 min read
In this tutorial, we will see an example of JSON-format logging with the Microsoft Enterprise Library: writing the logs to file, sending them to Elasticsearch with Filebeat, and using Kibana to view them.


Elasticsearch is one of the best open source search engines available today. Its strengths as a NoSQL document database also make it a great tool for application logging.

We will learn today how to write our logs to a rolling file, send the logs to Elasticsearch with Filebeat, and view our logs in a beautiful way with Kibana.


Prerequisites

  • Logging with the Microsoft Enterprise Library (we will use it for this example, but the same can be done with other libraries, such as NLog)
  • Elasticsearch cluster installed and ready
  • Filebeat + Kibana installed (+optional: elasticsearch 'head' chrome extension)


Steps

  1. Add a new listener to entlib.config
  2. Create a custom formatter to write JSON logs
  3. Configure Filebeat to send the logs to Elasticsearch
  4. View the logs in Kibana

1. Add New Listener to entlib.config

Create a new listener of type RollingFlatFileTraceListener.

Name it Json TraceListener.

Use the new formatter Json Text Formatter.

Make sure the header and footer are empty.


<add fileName="D:\Logs\JsonLogs\rolling.log" footer=""
     formatter="Json Text Formatter" header=""
     rollFileExistsBehavior="Increment" rollInterval="Day" rollSizeKB="50000"
     listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.RollingFlatFileTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, 
     Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
     traceOutputOptions="None" filter="All" 
     type="Microsoft.Practices.EnterpriseLibrary.Logging.TraceListeners.RollingFlatFileTraceListener, 
     Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, 
     PublicKeyToken=31bf3856ad364e35"
     name="Json TraceListener" />

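With the listener in place, entries are written through the standard logging block API. A minimal sketch of how an application exercises it (the message, category, and extended-property names here are illustrative, not from the original article):

```csharp
// Assumes the Enterprise Library Logging Application Block (v5.0) is
// referenced and entlib.config contains the "Json TraceListener" above.
using Microsoft.Practices.EnterpriseLibrary.Logging;

public static class LoggingExample
{
    public static void Main()
    {
        var entry = new LogEntry
        {
            Message = "Order processed",  // illustrative value
            Categories = { "General" },   // must be routed to the JSON listener
            Severity = System.Diagnostics.TraceEventType.Information
        };
        // Extended properties flow into the {dictionary(...)} part of the template.
        entry.ExtendedProperties.Add("OrderId", "42");

        Logger.Write(entry);
    }
}
```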

2. Create a Custom Formatter to Write JSON Logs

By default, Filebeat ships each line of a file as a separate event, splitting on the newline character (\n), and can decode each line as a JSON object.

We therefore want each log entry written on a single line, with whitespace characters cleaned out of every row as it is written.

We will do this by creating a new formatter, which includes a new method for cleaning white spaces:

<add template="{&#34;Timestamp&#34;:&#34;{timestamp(MM/dd/yyyy HH:mm:ss.fff)}&#34;, 
 &#34;Category&#34;:&#34;{category}&#34;, &#34;Machine&#34;:&#34;{machine}&#34;, 
 &#34;Process Id&#34;:&#34;{processId}&#34; {dictionary(, &#34;{key}&#34;: &#34;{value}&#34;)} }"
  type="Infrastructure.Logger.Formatters.CustomTextFormatter, Infrastructure.Logger"
  name="Json Text Formatter" />
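Assuming the formatter also strips line breaks (next step), each entry then lands in the rolling file as a single-line JSON object per row; for example (values are illustrative):

```json
{"Timestamp":"02/25/2019 14:03:21.457", "Category":"General", "Machine":"WEBSRV01", "Process Id":"4812", "OrderId": "42" }
```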

CustomTextFormatter inherits from Microsoft.Practices.EnterpriseLibrary.Logging.Formatters.TextFormatter and implements the formatForJsonValue method:

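The original implementation appears only as a screenshot, so here is a minimal sketch of what such a formatter could look like; the regex-based whitespace cleanup and the exact signature of formatForJsonValue are assumptions, not the article's original code:

```csharp
using System.Text.RegularExpressions;
using Microsoft.Practices.EnterpriseLibrary.Logging;
using Microsoft.Practices.EnterpriseLibrary.Logging.Formatters;

public class CustomTextFormatter : TextFormatter
{
    public CustomTextFormatter(string template) : base(template) { }

    public override string Format(LogEntry log)
    {
        // Render the configured template, then collapse the result to a
        // single line so Filebeat sees exactly one JSON object per \n.
        return formatForJsonValue(base.Format(log));
    }

    // Replace line breaks and runs of whitespace with a single space.
    private static string formatForJsonValue(string value)
    {
        return Regex.Replace(value, @"\s+", " ").Trim();
    }
}
```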

For more instructions about custom formatter, check out this link.

3. Configure Filebeat to Send the Logs to Elasticsearch

In filebeat.yml, add the path and the output as follows:

filebeat.prospectors:
- input_type: log
  paths:
    - E:\temp\logs\*.log
  json.keys_under_root: true
  json.add_error_key: true

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  index: "bingo-logs-%{[beat.version]}-%{+yyyy.MM.dd}"


After running Filebeat, we will be able to see the logs indexed in Elasticsearch.

4. View the Logs in Kibana

Go to Kibana → Management → Index Patterns → Create Index Pattern, and add an index pattern matching the index name from your filebeat.yml configuration (e.g., bingo-logs-*).


Go to Discover → select your new index.

Create your own custom view (save your view for future use).


History

  • 25th February, 2019: Initial version


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

Written By
Israel Israel
