Logs

Understand Dapr logging

Dapr produces structured logs to stdout, either as plain text or JSON formatted. By default, all Dapr processes (runtime and system services) write to the console in plain text. To enable JSON-formatted logs, add the --log-as-json command flag when running Dapr processes.
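
For example, in self-hosted mode the flag can be passed straight to the Dapr runtime binary; the app ID and port below are placeholders rather than values from this guide:

daprd --app-id myapp --app-port 3000 --log-as-json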

If you want to use a search engine such as Elasticsearch or Azure Monitor to search the logs, it is recommended to use JSON-formatted logs, which the log collector and search engine can parse using their built-in JSON parsers.

Log schema

Dapr produces logs based on the following schema.

Field    | Description                       | Example
time     | ISO8601 Timestamp                 | 2011-10-05T14:48:00.000Z
level    | Log Level (info/warn/debug/error) | info
type     | Log Type                          | log
msg      | Log Message                       | hello dapr!
scope    | Logging Scope                     | dapr.runtime
instance | Container Name                    | dapr-pod-xxxxx
app_id   | Dapr App ID                       | dapr-app
ver      | Dapr Runtime Version              | 0.5.0

Plain text and JSON formatted logs

  • Plain text log examples
time="2020-03-11T17:08:48.303776-07:00" level=info msg="starting Dapr Runtime -- version 0.5.0-rc.3 -- commit v0.3.0-rc.0-155-g5dfcf2e" instance=dapr-pod-xxxx scope=dapr.runtime type=log ver=0.5.0-rc.3
time="2020-03-11T17:08:48.303913-07:00" level=info msg="log level set to: info" instance=dapr-pod-xxxx scope=dapr.runtime type=log ver=0.5.0-rc.3
  • JSON formatted log examples
{"instance":"dapr-pod-xxxx","level":"info","msg":"starting Dapr Runtime -- version 0.5.0-rc.3 -- commit v0.3.0-rc.0-155-g5dfcf2e","scope":"dapr.runtime","time":"2020-03-11T17:09:45.788005Z","type":"log","ver":"0.5.0-rc.3"}
{"instance":"dapr-pod-xxxx","level":"info","msg":"log level set to: info","scope":"dapr.runtime","time":"2020-03-11T17:09:45.788075Z","type":"log","ver":"0.5.0-rc.3"}

Configuring plain text or JSON formatted logs

Dapr supports both plain text and JSON formatted logs. The default format is plain text. If you want to use plain text with a search engine, you do not need to change any configuration options.

To use JSON-formatted logs, you need to add additional configuration when you install Dapr and deploy your app. The recommendation is to use JSON-formatted logs because most log collectors and search engines can parse JSON more easily with built-in parsers.

Configuring log format in Kubernetes

The following steps describe how to configure JSON formatted logs for Kubernetes.

Install Dapr to your cluster using the Helm chart

You can enable JSON formatted logs for Dapr system services by adding the --set global.logAsJson=true option to the Helm command.

helm install dapr dapr/dapr --namespace dapr-system --set global.logAsJson=true
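
To check that the control plane services are now emitting JSON, you can tail one of them; the label selector below assumes the default labels set by the Dapr Helm chart:

kubectl logs -l app=dapr-operator -n dapr-system --tail=5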

Enable JSON-formatted logs for Dapr sidecars

You can enable JSON-formatted logs in Dapr sidecars activated by the Dapr sidecar-injector service by adding the dapr.io/log-as-json: "true" annotation to the deployment.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: pythonapp
  namespace: default
  labels:
    app: python
spec:
  replicas: 1
  selector:
    matchLabels:
      app: python
  template:
    metadata:
      labels:
        app: python
      annotations:
        dapr.io/enabled: "true"
        dapr.io/app-id: "pythonapp"
        dapr.io/log-as-json: "true"
...
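
Once the deployment is applied, the sidecar's JSON logs can be read from the daprd container; the pod name below is a placeholder:

kubectl logs <pythonapp-pod-name> -c daprd -n default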

API Logging

API logging enables you to see the API calls from your application to the Dapr sidecar, which helps when debugging issues. You can combine Dapr API logging with Dapr log events. See configure and view Dapr Logs and configure and view Dapr API Logs for more information.
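
In Kubernetes, API logging is controlled per application; a minimal sketch of the sidecar annotations, assuming the dapr.io/enable-api-logging annotation described in the API logging how-to:

      annotations:
        dapr.io/enabled: "true"
        dapr.io/app-id: "pythonapp"
        dapr.io/enable-api-logging: "true"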

Log collectors

If you run Dapr in a Kubernetes cluster, Fluentd is a popular container log collector. You can use Fluentd with a JSON parser plugin to parse Dapr JSON formatted logs. This how-to shows how to configure Fluentd in your cluster.
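
As a rough sketch of the parsing step (the match pattern and key name are assumptions; the Fluentd how-to contains the full configuration), a filter section like the following applies Fluentd's built-in JSON parser to the collected log field:

<filter **>
  @type parser
  key_name log
  <parse>
    @type json
  </parse>
</filter>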

If you are using the Azure Kubernetes Service, you can use the default OMS Agent to collect logs with Azure Monitor without needing to install Fluentd.

Search engines

If you use Fluentd, we recommend using Elasticsearch and Kibana. This how-to shows how to set up Elasticsearch and Kibana in your Kubernetes cluster.
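
Because JSON-formatted logs are indexed field by field, Kibana queries can target the schema fields directly; for example, a KQL search such as the following (the app ID is a placeholder):

scope:dapr.runtime and level:error and app_id:pythonapp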

If you are using Azure Kubernetes Service, you can use Azure Monitor for containers without installing any additional monitoring tools. Also read How to enable Azure Monitor for containers.
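
Once logs arrive in the Log Analytics workspace, a query along these lines surfaces Dapr entries (ContainerLog and its columns are the standard Container insights table; adjust the filter to your needs):

ContainerLog
| where LogEntry contains "dapr.runtime"
| project TimeGenerated, LogEntry
| order by TimeGenerated desc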

References