Prerequisites
The instructions in this section assume that you will run the commands to create and configure your AWS EKS cluster from a host that has been set up with the appropriate tools.
AWS CLI tools
Install AWS CLI 2.7.1 or above for full compatibility with the different tools and versions mentioned below. Do not use the apt, yum, or snap versions of the AWS CLI, which install 1.x. You must use the AWS CLI installer, as shown below:
$ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
$ unzip awscliv2.zip
$ sudo ./aws/install
$ sudo yum install git procps
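As a quick optional check, verify that the 2.x installer is the version found on your PATH:
$ aws --version
# The output should start with "aws-cli/2."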
Once you have the AWS CLI installed, you must configure it with your AWS credentials. When it asks for the “Default output format”, enter json:
$ aws configure
AWS Access Key ID [None]: [AWS Access Key ID]
AWS Secret Access Key [None]: [AWS Secret Access Key]
Default region name [None]: us-east-2
Default output format [None]: json
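Optionally, you can confirm that the configured credentials work by asking AWS to identify the caller; the account and ARN in the output should match your own:
$ aws sts get-caller-identity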
jq
Install the jq tool by following the instructions here: Download jq
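As one example, on a Debian or Ubuntu host (adapt the package manager to your distribution), jq can be installed and verified as follows:
$ sudo apt-get install -y jq
$ jq --version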
EKS supported versions
As of the Anjuna Nitro Runtime v1.39, Anjuna supports EKS versions 1.23 through 1.28.
The following table matches the currently supported EKS versions to the Anjuna Nitro Runtime versions:
| Supported EKS version | Anjuna Nitro Runtime versions |
| --- | --- |
| 1.28 | v1.39 - v1.39 |
| 1.27 | v1.37 - v1.39 |
| 1.26 | v1.35 - v1.39 |
| 1.25 | v1.33 - v1.39 |
| 1.24 | v1.31 - v1.39 |
| 1.23 | v1.27 - v1.39 |
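If you already have an EKS cluster and want to confirm that its version falls within this range, you can query it with the AWS CLI (replace <cluster-name> and the region with your own values):
$ aws eks describe-cluster --name <cluster-name> --region us-east-2 --query "cluster.version" --output text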
EKS resource requirements
The recommended resource allocation for Anjuna Nitro on EKS per enclave instance is:
- Trusted vCPUs (Enclave): At least 2
- Trusted memory (Enclave): At least 1 Gi
- Untrusted vCPUs (Launcher): 0.5 per trusted vCPU
- Untrusted memory (Launcher): 1 Gi per enclave
This means a minimum of 3 vCPUs (2 trusted, 1 untrusted) and 2 Gi of memory (1 Gi trusted and 1 Gi untrusted) are recommended per enclave.
For example, if you intend to run three enclaves on a single Node, two with 2 vCPUs and 2GB memory each, and another with 4 vCPUs and 4GB memory, then you should create a Node with at least 12 vCPUs (8 enclave vCPUs, 4 additional for the host) and 11GB memory (8GB for the enclaves, and 3GB for the host).
The Pods can also be deployed across multiple Nodes, as long as each Node has enough resources for its respective enclave(s).
The resource allocation for this example is summarized in the following table:
| | Number of vCPUs | Memory (GB) | Number of Launcher vCPUs | Launcher Memory (GB) |
| --- | --- | --- | --- | --- |
| Enclave 1 | 2 | 2 | 1 | 1 |
| Enclave 2 | 2 | 2 | 1 | 1 |
| Enclave 3 | 4 | 4 | 2 | 1 |
AWS Nitro Enclaves support up to four enclaves per EKS Node.
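Once your cluster exists and kubectl (installed below) is configured for it, one way to confirm that each Node has enough allocatable vCPUs and memory for its enclaves is:
$ kubectl get nodes -o custom-columns=NAME:.metadata.name,CPU:.status.allocatable.cpu,MEMORY:.status.allocatable.memory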
kubectl
The official instructions for installing kubectl are here: https://kubernetes.io/docs/tasks/tools/install-kubectl/
Ideally, you should install a version of kubectl that exactly matches the EKS version you wish to use, but using a version up to one minor version lower is supported. Using a kubectl version newer than your EKS version is not supported. For more details, see this page:
https://kubernetes.io/releases/version-skew-policy/#kube-controller-manager-kube-scheduler-and-cloud-controller-manager
The following commands will set up your Linux host with kubectl v1.28:
$ curl -LO "https://dl.k8s.io/release/v1.28.0/bin/linux/amd64/kubectl"
$ sudo install -o root -g root -m 0755 kubectl /usr/local/bin/kubectl
# If /usr/local/bin is not in your PATH, and bash is your shell, you can add it like this:
$ export PATH=$PATH:/usr/local/bin
$ echo 'export PATH=$PATH:/usr/local/bin' >> ~/.bashrc
# Verify that kubectl reports the version you installed
$ kubectl version --client
Install Terraform
Install Terraform: https://learn.hashicorp.com/tutorials/terraform/install-cli
The Terraform configuration for the Anjuna Nitro K8s Toolset has been tested on Terraform v1.3.1, but it should work on v1.1 or higher.
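Once installed, you can confirm that your Terraform version meets the v1.1 minimum:
$ terraform version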
Install Helm
Before you install Helm, it is important to know that each version of Helm supports different versions of EKS as noted in the Supported Version Skew section of Helm’s documentation.
Anjuna supports Helm versions 3.7.x through 3.9.x, but only for the EKS versions shown in the EKS supported versions section above.
Install Helm using the instructions on the Installing Helm page of Helm’s documentation.
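After installing Helm, you can confirm that the version is within the supported 3.7.x through 3.9.x range:
$ helm version --short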