How to read logs from Kubernetes worker nodes using kubectl
"I run k8s on a cloud provider so I don't care about worker nodes' logs and I'm so happy of that"
Some days later..
Writing on google "How to access k8s worker node's logs in a SaaS solution ..."
Problem
You have probably faced problems with Kubernetes worker nodes at least once while using a managed service (such as AWS EKS, Azure AKS, and so on).
Take, for example, a private Kubernetes cluster configuration (aka air-gapped k8s) whose worker nodes cannot configure themselves properly because of a missing firewall rule.
In many of these cases, precisely because we are on a managed service, it is not easy to get access to the worker nodes' logs.
Accessing those logs could save a lot of time, taking you straight to the error.
Solution
Kubernetes 1.27 introduced a new alpha feature called Node log query that allows viewing the logs of services running on a node.
It basically lets you query the logs that usually sit under /var/log on any worker node using kubectl.
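Because the feature is alpha in 1.27, it has to be switched on explicitly: the NodeLogQuery feature gate must be enabled on the node, and the kubelet configuration options enableSystemLogHandler and enableSystemLogQuery must both be set to true. A minimal sketch of the relevant KubeletConfiguration fragment (the file location and the rest of the config depend on your distribution):

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Enable the alpha Node log query feature on this node
featureGates:
  NodeLogQuery: true
enableSystemLogHandler: true
enableSystemLogQuery: true

On a managed service, how you get this configuration onto the kubelet depends on the provider's node bootstrap mechanism.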
How to use it
# Fetch kubelet logs from a node named node-1.example
kubectl get --raw "/api/v1/nodes/node-1.example/proxy/logs/?query=kubelet"
# Fetch kubelet logs from a node named node-1.example that contain the word "error"
kubectl get --raw "/api/v1/nodes/node-1.example/proxy/logs/?query=kubelet&pattern=error"
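Besides pattern, the query endpoint accepts a few other parameters, such as tailLines, sinceTime, untilTime and boot. For example, to avoid pulling the whole log:

# Fetch only the last 100 lines of kubelet logs from node-1.example
kubectl get --raw "/api/v1/nodes/node-1.example/proxy/logs/?query=kubelet&tailLines=100"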
kubectl get --raw "/api/v1/nodes/<insert-node-name-here>/proxy/logs/?query=/<insert-log-file-name-here>"