Multiple AWS Accounts

Leverage Multiple AWS Accounts for Kubernetes Cluster Deployments

Objective

You can leverage multiple AWS accounts in your organization to meet specific business purposes, reflect your organizational structure, or implement a multi-tenancy strategy. Specific scenarios include:

  • Implementing isolation between environment tiers such as development, testing, acceptance, and production.

  • Implementing separation of concerns between management clusters and workload clusters.

  • Reducing the impact of security events and incidents.

For additional benefits of using multiple AWS accounts, refer to AWS's guidance on organizing your environment using multiple accounts.

This document describes how to use the D2iQ Kubernetes Platform (DKP) to deploy a management cluster and multiple workload clusters across multiple AWS accounts.

Assumptions

This guide assumes you have some understanding of Cluster API concepts and basic DKP provisioning workflows on AWS.

Cluster API Concepts

AWS Install Options

Glossary

  • Management cluster - The cluster that runs in AWS and is used to create target clusters in different AWS accounts.

  • Target account - The account where the target cluster is created.

  • Source account - The AWS account where the CAPA controllers for the management cluster run.

Prerequisites

Before you begin deploying DKP on AWS, configure the prerequisites for the environment you use, either non-air-gapped or air-gapped.

Prerequisites for Install

Deploy DKP on AWS

  1. Deploy a management cluster in your AWS source account.
    AWS: create Kubernetes AWS cluster

  2. Configure a trust relationship between the source and target accounts so the management cluster can create managed clusters in the target account:

Step 1:

DKP leverages the Cluster API Provider for AWS (CAPA) to provision Kubernetes clusters declaratively. You declare the desired state of the cluster in a cluster configuration YAML file, which is generated using:

CODE
dkp create cluster aws --cluster-name=${CLUSTER_NAME} \
  --dry-run \
  --output=yaml \
  > ${CLUSTER_NAME}.yaml
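
For example, with a hypothetical cluster name of aws-workload-1, you might set the variable before generating the manifest. The final kubectl step below is a sketch of the generic Cluster API flow, run only after the trust configuration and the manifest edits in Step 2 are complete.

CODE
# Hypothetical cluster name, used only for illustration
export CLUSTER_NAME=aws-workload-1

# Generate the cluster manifest without creating any resources
dkp create cluster aws --cluster-name=${CLUSTER_NAME} \
  --dry-run \
  --output=yaml \
  > ${CLUSTER_NAME}.yaml

# After editing the manifest (Step 2, item 4), apply it to the
# management cluster to create the cluster in the target account
kubectl apply -f ${CLUSTER_NAME}.yaml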

Step 2:

Configure a trust relationship between the source and target accounts.

Follow all the prerequisite steps in both the source and target accounts.

  1. Create all policies and roles in the management and workload accounts.

    a. The prerequisite IAM policies for DKP are documented here: Configure AWS IAM policies.

  2. Establish a trust relationship in the workload account for the management account.

    a. Go to your target (workload) account.
    b. Search for the role control-plane.cluster-api-provider-aws.sigs.k8s.io.
    c. Navigate to the Trust Relationship tab and select Edit Trust Relationship.
    d. Add the following relationship:

    CODE
    {
      "Effect": "Allow",
      "Principal": {
         "AWS": "arn:aws:iam::${mgmt-aws-account}:role/control-plane.cluster-api-provider-aws.sigs.k8s.io"
      },
      "Action": "sts:AssumeRole"
    }
  3. Give the role in the source (management cluster) account permission to call the sts:AssumeRole API. (Steps 2 and 3 can also be scripted with the AWS CLI; see the sketch after this list.)

    a. Log in to the source AWS account and attach the following inline policy to the control-plane.cluster-api-provider-aws.sigs.k8s.io role:

    CODE
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": "sts:AssumeRole",
          "Resource": [
            "arn:aws:iam::${workload-aws-account}:role/control-plane.cluster-api-provider-aws.sigs.k8s.io"
          ]
        }
      ]
    }
  4. Modify the management cluster configuration file and update the AWSCluster object with the following details:

    CODE
    apiVersion: infrastructure.cluster.x-k8s.io/v1alpha3
    kind: AWSCluster
    metadata:
    spec:
      identityRef:
        kind: AWSClusterRoleIdentity
        name: cross-account-role
    …
    …
    ---
    apiVersion: infrastructure.cluster.x-k8s.io/v1alpha3
    kind: AWSClusterRoleIdentity
    metadata:
      name: cross-account-role
    spec:
      allowedNamespaces: {}
      roleARN: "arn:aws:iam::${workload-aws-account}:role/control-plane.cluster-api-provider-aws.sigs.k8s.io"
      sourceIdentityRef:
        kind: AWSClusterControllerIdentity
        name: default
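
The console steps in items 2 and 3 can also be scripted. The following is a minimal AWS CLI sketch, not a DKP-prescribed procedure: the environment variables MGMT_AWS_ACCOUNT and WORKLOAD_AWS_ACCOUNT and the CLI profiles mgmt and workload are illustrative placeholders. Note that update-assume-role-policy replaces the role's entire trust policy, so merge the new statement with any existing ones first.

CODE
# Illustrative placeholders; substitute your own account IDs and profiles.
ROLE=control-plane.cluster-api-provider-aws.sigs.k8s.io

# Item 2: in the target (workload) account, trust the management account's role.
# NOTE: this replaces the whole trust policy; include existing statements too.
aws iam update-assume-role-policy --profile workload \
  --role-name ${ROLE} \
  --policy-document "{
    \"Version\": \"2012-10-17\",
    \"Statement\": [{
      \"Effect\": \"Allow\",
      \"Principal\": {\"AWS\": \"arn:aws:iam::${MGMT_AWS_ACCOUNT}:role/${ROLE}\"},
      \"Action\": \"sts:AssumeRole\"
    }]
  }"

# Item 3: in the source (management) account, allow the role to assume the
# workload account's role.
aws iam put-role-policy --profile mgmt \
  --role-name ${ROLE} \
  --policy-name assume-workload-capa-role \
  --policy-document "{
    \"Version\": \"2012-10-17\",
    \"Statement\": [{
      \"Effect\": \"Allow\",
      \"Action\": \"sts:AssumeRole\",
      \"Resource\": \"arn:aws:iam::${WORKLOAD_AWS_ACCOUNT}:role/${ROLE}\"
    }]
  }"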

After performing the above steps, your management cluster is configured to create new managed clusters in the target AWS workload account.
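
To verify the configuration from the management cluster, you can check that the AWSClusterRoleIdentity resource exists and watch the CAPA controller logs for assume-role errors while a cluster is being created in the target account. The namespace and deployment names below are the upstream CAPA defaults; adjust them if your DKP installation differs.

CODE
# Confirm the cross-account identity is present on the management cluster
kubectl get awsclusterroleidentity cross-account-role -o yaml

# Follow the CAPA controller logs for AssumeRole or authorization errors
kubectl logs -n capa-system deployment/capa-controller-manager -f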

Next Steps:

AWS Install Options
