How To Fix - Error: You must be logged in to the server (Unauthorized) in Kubernetes?



In this post, we will see how to fix the error "You must be logged in to the server (Unauthorized)" in Kubernetes.


Error: You must be logged in to the server (Unauthorized) in Kubernetes

First things first, do a basic check to verify which IAM identity the AWS CLI is using -


aws sts get-caller-identity
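
The output shows the identity the AWS CLI is currently using. It looks like the below (values masked) -

{
    "UserId": "AIDAXXXXXXXXXXXXXXXXX",
    "Account": "xxxxxxxxxxx",
    "Arn": "arn:aws:iam::xxxxxxxxxxx:user/user1"
}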

If the identity is not the expected one, configure the AWS CLI directly with the correct access key and secret key.
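
One quick way is the interactive aws configure command (the region and output format below are just examples) -

aws configure
# AWS Access Key ID [None]: <your access key>
# AWS Secret Access Key [None]: <your secret key>
# Default region name [None]: us-east-1
# Default output format [None]: json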

Solution 1:

Access the cluster with the same IAM identity that created it. Within ~/.aws/credentials, the profile used when running kubectl must match the IAM identity that was used to create the cluster.
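
For example, if the cluster was created under a profile named eks-admin (the profile name here is only an assumption), the same profile should be active when kubectl runs -

# ~/.aws/credentials
[eks-admin]
aws_access_key_id = xxxxxxxxxxxxxxxxxxxx
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxx

# make kubectl's AWS calls use this profile
export AWS_PROFILE=eks-admin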

Solution 2:

Edit the aws-auth ConfigMap to add the IAM user or role to the EKS cluster. Use the below command -


kubectl edit -n kube-system configmap/aws-auth
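
This opens the ConfigMap in an editor. A mapUsers entry similar to the below sketch can be added under data (the account id is a placeholder - replace it with yours) -

data:
  mapUsers: |
    - userarn: arn:aws:iam::xxxxxxxxxxx:user/user1
      username: user1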

After saving the ConfigMap, create a role binding on the Kubernetes cluster for the same user mapped in the ConfigMap. Do so using the below -


kubectl create clusterrolebinding ops-user-cluster-admin-binding --clusterrole=cluster-admin --user=user1

This grants the cluster-admin ClusterRole to the user named user1 across the entire cluster.
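
To verify the binding took effect, the user can be impersonated (assuming your current identity is allowed to impersonate) -

kubectl auth can-i get pods --as user1
# expected output: yes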

Solution 3:

  • Check that the IAM user details (of the user who created the cluster) are set properly in the AWS CLI. Use the below -

aws sts get-caller-identity

  • Update the kubeconfig file

aws eks --region region-code update-kubeconfig --name cluster1

  • Verify the config file
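
One way to do this (assuming the default kubeconfig at ~/.kube/config) -

kubectl config view --minify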
 

  • Run a kubectl command to test access

kubectl get svc

Solution 4:

  • Set up the role in the kubeconfig file (done via the update-kubeconfig command with --role-arn in the later step below).
 

  • Add the permission that lets the IAM user assume the role -

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::xxxxxxxxxxx:role/eks-role"
    }
  ]
}
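
Assuming the statement above is saved in a file named assume-eks-role.json (both the file name and policy name below are hypothetical), it can be attached to the user as an inline policy -

aws iam put-user-policy --user-name user1 --policy-name assume-eks-role --policy-document file://assume-eks-role.json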

  • Modify the role's trust relationship to allow user1 to assume the role -

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::xxxxxxxxxxx:user/user1"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

  • Confirm the IAM user credentials are set properly -

aws sts get-caller-identity

  • Update the kubeconfig file

aws eks --region region-code update-kubeconfig --name cluster_name --role-arn arn:aws:iam::xxxxxxxxxxx:role/eks-role

  • Verify the config file
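
A quick check that the role is now referenced in the kubeconfig (assuming the default path) -

grep eks-role ~/.kube/config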
 

  • Run a kubectl command to test access

kubectl get svc

Solution 5:

 

  • Check if you are using expired keys. If the temporary credentials have expired, export a fresh, valid set -

export AWS_ACCESS_KEY_ID="***************"
export AWS_SECRET_ACCESS_KEY="*************"
export AWS_SESSION_TOKEN="************************"
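
If you rely on temporary credentials, a fresh set can be fetched via STS, for example (the session duration below is just an example) -

aws sts get-session-token --duration-seconds 3600

Then re-run aws sts get-caller-identity to confirm the new credentials work.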

Hope this helps.  
