Comprehensive Guide to .NET Logging with Serilog and Elasticsearch
Introduction
Logging is a critical part of any production application. In the .NET ecosystem, there are several logging frameworks available, but Serilog has emerged as one of the most popular due to its flexibility, powerful structured logging capabilities, and extensive sink support. This guide will walk you through implementing a robust logging strategy for .NET 7+ applications using Serilog with Elasticsearch, complete with monitoring via Kibana.
What You'll Learn
- Setting up the necessary NuGet packages for logging
- Configuring Serilog with Elasticsearch
- Implementing structured logging in a .NET application
- Using Docker to run a sample application with Elasticsearch and Kibana
- Monitoring and analyzing logs using Kibana dashboards
Prerequisites
- .NET 7 SDK or later
- Docker and Docker Compose
- Basic knowledge of .NET and C#
NuGet Packages
First, let's add the necessary NuGet packages to your project:
dotnet add package Serilog
dotnet add package Serilog.AspNetCore
dotnet add package Serilog.Enrichers.Environment
dotnet add package Serilog.Enrichers.Thread
dotnet add package Serilog.Settings.Configuration
dotnet add package Serilog.Sinks.Console
dotnet add package Serilog.Sinks.Elasticsearch
Each package serves a specific purpose:
- Serilog: The core library
- Serilog.AspNetCore: Integration with ASP.NET Core
- Serilog.Enrichers.Environment: Adds environment information to logs
- Serilog.Enrichers.Thread: Adds thread information to logs
- Serilog.Settings.Configuration: Allows configuring Serilog from appsettings.json
- Serilog.Sinks.Console: Outputs logs to the console
- Serilog.Sinks.Elasticsearch: Sends logs to Elasticsearch
Setting Up a Sample Application
Let's create a simple ASP.NET Core Web API project:
dotnet new webapi -n LoggingDemo
cd LoggingDemo
Configuration
Update your appsettings.json file to include Serilog configuration:
{
"Serilog": {
"MinimumLevel": {
"Default": "Information",
"Override": {
"Microsoft": "Warning",
"System": "Warning"
}
},
"Enrich": ["FromLogContext", "WithMachineName", "WithThreadId"],
"Properties": {
"Application": "LoggingDemo"
}
},
"ElasticConfiguration": {
"Uri": "http://elasticsearch:9200"
},
"AllowedHosts": "*"
}
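Because Program.cs below also loads an environment-specific appsettings.{Environment}.json, log levels can be tuned per environment. As a minimal, illustrative sketch (the file name and values are assumptions, not part of the original setup), an appsettings.Development.json could lower the default level:
{
  "Serilog": {
    "MinimumLevel": {
      "Default": "Debug"
    }
  }
}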
Program.cs Setup
Configure Serilog in your Program.cs:
using Serilog;
using Serilog.Sinks.Elasticsearch;
var builder = WebApplication.CreateBuilder(args);
// Configure Serilog
var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
var configuration = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
    .AddJsonFile($"appsettings.{environment}.json", optional: true)
    // Lets the ElasticConfiguration__Uri variable set in docker-compose override appsettings.json.
    .AddEnvironmentVariables()
    .Build();
Log.Logger = new LoggerConfiguration()
.Enrich.FromLogContext()
.Enrich.WithMachineName()
.Enrich.WithThreadId()
.WriteTo.Console()
.WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(configuration["ElasticConfiguration:Uri"]))
{
AutoRegisterTemplate = true,
IndexFormat = $"handover-{environment?.ToLower()}-{DateTime.UtcNow:yyyy-MM}"
})
.ReadFrom.Configuration(configuration)
.CreateLogger();
builder.Host.UseSerilog();
// Register the Serilog logger with DI as well, so the Serilog.ILogger injected by the sample controller below can be resolved.
builder.Services.AddSingleton(Log.Logger);
// Add services to the container.
builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
try
{
Log.Information("Starting web application");
app.Run();
}
catch (Exception ex)
{
Log.Fatal(ex, "Application terminated unexpectedly");
}
finally
{
Log.CloseAndFlush();
}
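Optionally, Serilog.AspNetCore can also emit one concise summary event per HTTP request. This is a sketch of an optional addition, not part of the original pipeline; it would go in Program.cs with the other middleware registrations:
// Emits one event per request, e.g. "HTTP GET /Logging/info responded 200 in 12.3456 ms".
app.UseSerilogRequestLogging();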
Creating a Sample Controller
Create a controller with various logging examples:
using Microsoft.AspNetCore.Mvc;
using ILogger = Serilog.ILogger;
namespace LoggingDemo.Controllers;
[ApiController]
[Route("[controller]")]
public class LoggingController : ControllerBase
{
private readonly ILogger _logger;
public LoggingController(ILogger logger)
{
_logger = logger;
}
[HttpGet("info")]
public IActionResult LogInfo()
{
_logger.Information("This is an information message");
return Ok("Information logged");
}
[HttpGet("warning")]
public IActionResult LogWarning()
{
_logger.Warning("This is a warning message");
return Ok("Warning logged");
}
[HttpGet("error")]
public IActionResult LogError()
{
_logger.Error("This is an error message");
return Ok("Error logged");
}
[HttpGet("structured")]
public IActionResult LogStructured()
{
_logger.Information("Request processed for {User} with ID {UserId}", "John", 123);
return Ok("Structured log created");
}
[HttpGet("exception")]
public IActionResult LogException()
{
try
{
throw new Exception("Sample exception");
}
catch (Exception ex)
{
_logger.Error(ex, "An exception occurred");
return StatusCode(500, "Exception logged");
}
}
}
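Because builder.Host.UseSerilog() also routes Microsoft.Extensions.Logging through Serilog, you can just as well inject the generic ILogger&lt;T&gt; instead of Serilog's own interface. A brief sketch (the controller name and message are hypothetical):
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("[controller]")]
public class OrdersController : ControllerBase
{
    // ILogger<T> adds the type name as the SourceContext property on every event.
    private readonly ILogger<OrdersController> _logger;

    public OrdersController(ILogger<OrdersController> logger) => _logger = logger;

    [HttpGet]
    public IActionResult Get()
    {
        _logger.LogInformation("Fetching orders for {UserId}", 123);
        return Ok();
    }
}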
Docker Setup
Create a docker-compose.yml file in the root of your project:
version: "3.8"
services:
elasticsearch:
image: docker.elastic.co/elasticsearch/elasticsearch:8.8.0
environment:
- discovery.type=single-node
- xpack.security.enabled=false
- "ES_JAVA_OPTS=-Xms512m -Xmx512m"
ports:
- "9200:9200"
volumes:
- elasticsearch-data:/usr/share/elasticsearch/data
networks:
- logging-network
kibana:
image: docker.elastic.co/kibana/kibana:8.8.0
ports:
- "5601:5601"
depends_on:
- elasticsearch
networks:
- logging-network
loggingdemo:
build:
context: .
dockerfile: Dockerfile
environment:
- ASPNETCORE_ENVIRONMENT=Development
- ElasticConfiguration__Uri=http://elasticsearch:9200
ports:
- "8080:80"
depends_on:
- elasticsearch
networks:
- logging-network
networks:
logging-network:
driver: bridge
volumes:
elasticsearch-data:
    driver: local
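Note that depends_on only orders container start-up; it does not wait for Elasticsearch to be ready, so events logged before Elasticsearch is reachable may be dropped. If you want a stricter start-up order, one option is a health check. The fragment below is a sketch of an optional amendment to the existing services, assuming curl is available inside the Elasticsearch image (it is in recent official images):
services:
  elasticsearch:
    healthcheck:
      # Simple readiness probe against the HTTP API.
      test: ["CMD-SHELL", "curl -fs http://localhost:9200 || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 12
  loggingdemo:
    depends_on:
      elasticsearch:
        condition: service_healthy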
Create a Dockerfile for your .NET application:
FROM mcr.microsoft.com/dotnet/aspnet:7.0 AS base
WORKDIR /app
EXPOSE 80
FROM mcr.microsoft.com/dotnet/sdk:7.0 AS build
WORKDIR /src
COPY ["LoggingDemo.csproj", "./"]
RUN dotnet restore "./LoggingDemo.csproj"
COPY . .
WORKDIR "/src/."
RUN dotnet build "LoggingDemo.csproj" -c Release -o /app/build
FROM build AS publish
RUN dotnet publish "LoggingDemo.csproj" -c Release -o /app/publish
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LoggingDemo.dll"]Running the Application
Running the Application
To run the application with Elasticsearch and Kibana:
docker-compose up -d
This will start:
- Elasticsearch on port 9200
- Kibana on port 5601
- Your .NET application on port 8080 (a quick smoke test with curl follows below)
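Once the containers are up, you can generate some log traffic before opening Kibana. The routes come from the LoggingController above ([controller] resolves to /Logging):
curl http://localhost:8080/logging/info
curl http://localhost:8080/logging/structured
curl http://localhost:8080/logging/error
curl http://localhost:8080/logging/exception
You can also follow the console sink's output with docker-compose logs -f loggingdemo.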
Setting Up Kibana
- Open Kibana at http://localhost:5601
- Go to "Stack Management" > "Data Views" (called "Index Patterns" in older Kibana versions)
- Create a new data view with the index pattern loggingdemo-*
- Select @timestamp as the time field
- Go to "Discover" to see your logs
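In Discover you can narrow results with a KQL query, for example to show only error events. The exact field names depend on the Elasticsearch sink's default formatter, so adjust them to whatever you see in your documents:
level : "Error" and fields.Application : "LoggingDemo"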
Creating a Dashboard
- In Kibana, go to "Dashboard"
- Create a new dashboard
- Add visualizations:
- Line chart showing log counts over time
- Pie chart showing log levels distribution
- Data table showing errors and exceptions
- Filters for specific application parts
Best Practices
Log Levels
Use the appropriate log level for different scenarios (a short Serilog sketch follows the list):
- Verbose (called Trace in Microsoft.Extensions.Logging): Detailed debugging information
- Debug: Useful debugging information
- Information: General application flow
- Warning: Non-critical issues
- Error: Errors that affect functionality
- Fatal: Critical errors that cause application shutdown
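In Serilog these levels map directly to methods on the logger; the variables below are placeholders:
_logger.Verbose("Entering cache lookup for {Key}", key);        // rarely enabled outside local debugging
_logger.Debug("Cache miss for {Key}", key);
_logger.Information("Order {OrderId} created", orderId);
_logger.Warning("Retrying payment for order {OrderId}", orderId);
_logger.Error(ex, "Payment failed for order {OrderId}", orderId);
_logger.Fatal(ex, "Cannot connect to the database, shutting down");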
Structured Logging
Always use structured logging for better searchability:
// Instead of this:
_logger.Information($"User {userName} logged in");
// Do this:
_logger.Information("User {UserName} logged in", userName);Use Log Context
Use Log Context
Enrich your logs with context information:
using (LogContext.PushProperty("RequestId", HttpContext.TraceIdentifier))
{
_logger.Information("Processing request");
}
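To attach a property to every event raised during a request rather than inside a single block, the same call can be wrapped around the pipeline in Program.cs. A minimal sketch using inline middleware:
// Requires: using Serilog.Context;
app.Use(async (context, next) =>
{
    using (LogContext.PushProperty("RequestId", context.TraceIdentifier))
    {
        await next();
    }
});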
Log Exception Details
Always include the exception object when logging errors:
try
{
// Code that might throw
}
catch (Exception ex)
{
_logger.Error(ex, "Error occurred while processing");
}
Advanced Monitoring with Kibana
Alert Setup
- Go to "Stack Management" > "Rules and Connectors"
- Create a new rule for error threshold alerting
- Configure actions to send notifications
Log Analysis
Use Kibana's Machine Learning features to detect anomalies in your logs:
- Go to "Machine Learning"
- Create a new job to analyze your log patterns
- Set up anomaly detection for error rates or response times
Conclusion
You now have a complete .NET logging solution with:
- Structured logging using Serilog
- Centralized log storage with Elasticsearch
- Visualizations and monitoring with Kibana
- Docker containerization for easy deployment
This setup provides a solid foundation for application monitoring and troubleshooting in production environments. As your application grows, you can extend this logging infrastructure with additional sinks, enrichers, and monitoring tools to meet your specific needs.