slog: Datadog handler

A Datadog handler for the Go slog library.
🚀 Install
go get github.com/samber/slog-datadog/v2
Compatibility: go >= 1.21
No breaking changes will be made to exported APIs before v3.0.0.
💡 Usage
GoDoc: https://pkg.go.dev/github.com/samber/slog-datadog/v2
Handler options
type Option struct {
    // log level (default: debug)
    Level slog.Leveler

    // datadog endpoint
    Client  *datadog.APIClient
    Context context.Context
    Timeout time.Duration // default: 10s

    // batching (default: disabled)
    Batching      bool
    BatchDuration time.Duration // default: 5s

    // source parameters
    Service    string
    Source     string
    Hostname   string
    GlobalTags map[string]string

    // optional: customize Datadog message builder
    Converter Converter
    // optional: custom marshaler
    Marshaler func(v any) ([]byte, error)

    // optional: fetch attributes from context
    AttrFromContext []func(ctx context.Context) []slog.Attr

    // optional: see slog.HandlerOptions
    AddSource   bool
    ReplaceAttr func(groups []string, a slog.Attr) slog.Attr
}
Attributes will be injected into the log payload.
Other global parameters:
slogdatadog.SourceKey = "source"
slogdatadog.ErrorKeys = []string{"error", "err"}
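The sketch below illustrates how the optional fields and the global parameters fit together. It is an illustration rather than a prescribed setup: the requestIDKey type and requestIDFromContext helper are hypothetical, and only Level, Client and Context are required for a working handler.

import (
    "context"
    "encoding/json"
    "log/slog"
    "time"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    slogdatadog "github.com/samber/slog-datadog/v2"
)

// requestIDKey is a hypothetical context key, used only in this sketch.
type requestIDKey struct{}

// requestIDFromContext is a hypothetical helper: it copies a request id
// stored in the context into the log record as a slog attribute.
func requestIDFromContext(ctx context.Context) []slog.Attr {
    if id, ok := ctx.Value(requestIDKey{}).(string); ok {
        return []slog.Attr{slog.String("request_id", id)}
    }
    return nil
}

func newLogger(apiClient *datadog.APIClient, ctx context.Context) *slog.Logger {
    // Override the global parameters before building the handler.
    slogdatadog.SourceKey = "source"
    slogdatadog.ErrorKeys = []string{"error", "err"}

    handler := slogdatadog.Option{
        Level:      slog.LevelInfo,
        Client:     apiClient,
        Context:    ctx,
        Timeout:    10 * time.Second,
        Service:    "api",
        Hostname:   "1.2.3.4",
        GlobalTags: map[string]string{"team": "platform"},

        // optional: custom marshaler for the final payload
        Marshaler: json.Marshal,

        // optional: pull attributes out of the request context
        AttrFromContext: []func(ctx context.Context) []slog.Attr{
            requestIDFromContext,
        },

        // optional: rewrite attributes, as with slog.HandlerOptions
        ReplaceAttr: func(groups []string, a slog.Attr) slog.Attr {
            if a.Key == "password" {
                return slog.Attr{} // drop sensitive fields
            }
            return a
        },
    }.NewDatadogHandler()

    return slog.New(handler)
}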
Example
import (
    "context"
    "fmt"
    "log/slog"
    "time"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    slogdatadog "github.com/samber/slog-datadog/v2"
)
func newDatadogClient(endpoint string, apiKey string) (*datadog.APIClient, context.Context) {
    ctx := datadog.NewDefaultContext(context.Background())
    ctx = context.WithValue(
        ctx,
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{"apiKeyAuth": {Key: apiKey}},
    )
    ctx = context.WithValue(
        ctx,
        datadog.ContextServerVariables,
        map[string]string{"site": endpoint},
    )

    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    return apiClient, ctx
}
func main() {
    host := "1.2.3.4"
    service := "api"
    endpoint := slogdatadog.DatadogHostEU
    apiKey := "xxx"

    apiClient, ctx := newDatadogClient(endpoint, apiKey)

    logger := slog.New(slogdatadog.Option{Level: slog.LevelDebug, Client: apiClient, Context: ctx, Timeout: 5 * time.Second, Hostname: host, Service: service}.NewDatadogHandler())

    logger = logger.
        With("environment", "dev").
        With("release", "v1.0.0")

    // log error
    logger.
        With("category", "sql").
        With("query.statement", "SELECT COUNT(*) FROM users;").
        With("query.duration", 1*time.Second).
        With("error", fmt.Errorf("could not count users")).
        Error("caramba!")

    // log user signup
    logger.
        With(
            slog.Group("user",
                slog.String("id", "user-123"),
                slog.Time("created_at", time.Now()),
            ),
        ).
        Info("user registration")
}
Batching
To improve performance and reduce network overhead, you can enable batching to send multiple log entries in a single request.
When batching is enabled, logs are buffered and sent either:
- When the batch duration elapses (configurable via BatchDuration, default: 5 seconds)
- When the buffer reaches MaxBatchSize (if configured)
- When Stop() is called to gracefully shut down
Important: Always call Stop() before your application exits to ensure all buffered logs are flushed and goroutines are properly cleaned up.
import (
    "context"
    "fmt"
    "log/slog"
    "time"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    slogdatadog "github.com/samber/slog-datadog/v2"
)
func main() {
    host := "1.2.3.4"
    service := "api"
    endpoint := slogdatadog.DatadogHostEU
    apiKey := "xxx"

    apiClient, ctx := newDatadogClient(endpoint, apiKey)

    handler := slogdatadog.Option{
        Level:         slog.LevelDebug,
        Client:        apiClient,
        Context:       ctx,
        Timeout:       5 * time.Second,
        Hostname:      host,
        Service:       service,
        Batching:      true,             // Enable batching
        BatchDuration: 10 * time.Second, // Send logs every 10 seconds (default: 5s)
    }.NewDatadogHandler()

    // Stop the batching goroutine and flush any remaining logs w/ a 5s timeout
    defer func() {
        stopCtx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()
        handler.Stop(stopCtx)
    }()

    logger := slog.New(handler)

    logger = logger.
        With("environment", "dev").
        With("release", "v1.0.0")

    // log error
    logger.
        With("category", "sql").
        With("query.statement", "SELECT COUNT(*) FROM users;").
        With("query.duration", 1*time.Second).
        With("error", fmt.Errorf("could not count users")).
        Error("caramba!")

    // log user signup
    logger.
        With(
            slog.Group("user",
                slog.String("id", "user-123"),
                slog.Time("created_at", time.Now()),
            ),
        ).
        Info("user registration")
}
When batching is enabled, logs are buffered in memory and sent to Datadog periodically based on BatchDuration. This significantly reduces the number of API calls and improves throughput for high-volume logging scenarios.
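Because Stop() is the only guaranteed flush point, it is worth wiring it into your application's shutdown path. The sketch below is an illustration rather than a prescribed pattern: it reuses the newDatadogClient helper from the example above, assumes the handler exposes the Stop(ctx) method shown there, and uses os/signal to flush the buffer on SIGINT/SIGTERM.

import (
    "context"
    "log/slog"
    "os"
    "os/signal"
    "syscall"
    "time"

    slogdatadog "github.com/samber/slog-datadog/v2"
)

func main() {
    apiClient, ctx := newDatadogClient(slogdatadog.DatadogHostEU, "xxx")

    handler := slogdatadog.Option{
        Client:   apiClient,
        Context:  ctx,
        Batching: true,
    }.NewDatadogHandler()

    slog.SetDefault(slog.New(handler))

    // Wait for SIGINT/SIGTERM, then flush buffered logs before exiting.
    sig := make(chan os.Signal, 1)
    signal.Notify(sig, syscall.SIGINT, syscall.SIGTERM)
    <-sig

    stopCtx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    defer cancel()
    handler.Stop(stopCtx) // flush remaining logs and stop the batching goroutine
}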
Tracing
Import the samber/slog-otel library.
import (
    "context"
    "log/slog"

    slogdatadog "github.com/samber/slog-datadog/v2"
    slogotel "github.com/samber/slog-otel"
    "go.opentelemetry.io/otel/sdk/trace"
)
func main() {
    tp := trace.NewTracerProvider(
        trace.WithSampler(trace.AlwaysSample()),
    )
    tracer := tp.Tracer("hello/world")

    ctx, span := tracer.Start(context.Background(), "foo")
    defer span.End()

    span.AddEvent("bar")

    logger := slog.New(
        slogdatadog.Option{
            // ...
            AttrFromContext: []func(ctx context.Context) []slog.Attr{
                slogotel.ExtractOtelAttrFromContext([]string{"tracing"}, "trace_id", "span_id"),
            },
        }.NewDatadogHandler(),
    )

    logger.ErrorContext(ctx, "a message")
}
🤝 Contributing
Don't hesitate ;)
# Install some dev dependencies
make tools
# Run tests
make test
# or
make watch-test
👤 Contributors

💫 Show your support
Give a ⭐️ if this project helped you!

📝 License
Copyright © 2023 Samuel Berthe.
This project is MIT licensed.