ArgoCD on a GKE Cluster with Access to an EKS Cluster
With ArgoCD running on a GKE cluster and needing to access/add an EKS cluster, the following worked for me:
After adding the cluster to ArgoCD as normal, take note of the user ARN used in the cluster's configuration, which can be found in the users section of the kubectl configuration for the cluster.
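Adding the cluster is typically done with the argocd CLI; a minimal sketch, assuming the kubectl context for the EKS cluster is named eks-cluster (the context name here is only illustrative):

argocd cluster add eks-cluster

The kubectl configuration for the cluster looked something like this: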
apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: <ca-cert>
    server: https://....eks.amazonaws.com
  name: https://....eks.amazonaws.com
contexts:
- context:
    cluster: https://....eks.amazonaws.com
    namespace: argocd
    user: https://....eks.amazonaws.com
  name: https://....eks.amazonaws.com
current-context: https://....eks.amazonaws.com
kind: Config
preferences: {}
users:
- name: https://....eks.amazonaws.com
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - aws
      - --cluster-name
      - <cluster-name>
      - --role-arn
      - arn:aws:eks:<aws-region>:<account-id>:cluster/<cluster-name> # this is the user ARN to take note of
      command: argocd-k8s-auth
      env: null
      interactiveMode: Never
      provideClusterInfo: false
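To pull that ARN out of the configuration directly, something like the following should work (assuming the EKS cluster is the current kubectl context; the jsonpath simply prints the exec arguments shown above):

kubectl config view --minify -o jsonpath='{.users[0].user.exec.args}'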
Then add the aws-access-key-id and aws-secret-access-key for that user to both the argocd server and application controller services.
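Both values need to be available to those pods as a Kubernetes Secret; a minimal sketch of creating one with kubectl, assuming the Secret name aws-cluster-user and the key names used in the example below:

kubectl -n argocd create secret generic aws-cluster-user \
  --from-literal=aws-access-key-id=<aws-access-key-id> \
  --from-literal=aws-secret-access-key=<aws-secret-access-key>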
For example, in my case I'm making use of Helm to install/set up ArgoCD, so the values.yaml includes the following configuration changes:
...
controller:
  ...
  env:
    - name: AWS_ACCESS_KEY_ID
      valueFrom:
        secretKeyRef:
          name: aws-cluster-user
          key: aws-access-key-id
    - name: AWS_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: aws-cluster-user
          key: aws-secret-access-key
...
server:
  ...
  env:
    - name: AWS_ACCESS_KEY_ID
      valueFrom:
        secretKeyRef:
          name: aws-cluster-user
          key: aws-access-key-id
    - name: AWS_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: aws-cluster-user
          key: aws-secret-access-key
...
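Applying the change is then just a normal chart upgrade; a minimal sketch, assuming the chart comes from the argo-helm repository (aliased here as argo) and is installed into the argocd namespace under the release name argocd:

helm upgrade --install argocd argo/argo-cd -n argocd -f values.yaml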
See the ArgoCD FAQs for some troubleshooting help.