Is AWS CDK better than Terraform?

Kvs Vishnu Kumar
9 min read · Feb 13, 2024


In this article, I am going to explain the advantages of using AWS CDK over Terraform for maintaining your cloud infrastructure along with code snippets for better understanding.

AWS CDK vs Terraform

I work as an AWS Cloud Engineer, responsible for managing the cloud infrastructure for my project within the AWS environment. In this capacity, I have utilized both Terraform and AWS CDK. Drawing from my experience and insights, I will now share my perspective on their respective merits.

But before delving into that, let’s understand IaC.

What is IaC (Infrastructure as Code)?

IaC or Infrastructure as Code is a software engineering practice where the Cloud Infrastructure is provisioned and managed through code, rather than through manual processes or interactive configuration tools like AWS Console.

Simply put, we write code to deploy the infrastructure rather than creating it manually via the console. This is standard industry practice; these days, it is rare to find an organization that doesn't implement IaC for its cloud infrastructure.

The main reasons for IaC are reusability, consistency, automation, and scalability.

Now let's understand Terraform and AWS CDK

Terraform

Terraform is an IaC tool developed by HashiCorp. With Terraform, infrastructure configurations are defined in code using a domain-specific language called HashiCorp Configuration Language (HCL), allowing for easy versioning, collaboration, and automation. Terraform then manages the entire lifecycle of these resources, from provisioning and updating to destruction, based on the defined configuration files.

A simple example of a Terraform script that creates an S3 bucket:

# main.tf

provider "aws" {
  region = "us-east-1" # Set your desired AWS region
}

resource "aws_s3_bucket" "my_bucket" {
  bucket = "my-unique-bucket-name" # Set your desired bucket name (must be globally unique)
}

AWS CDK

The AWS CDK (Cloud Development Kit) is an open-source project developed by AWS that provides developers with a higher-level abstraction for provisioning cloud resources.

With AWS CDK, developers can write infrastructure code using familiar programming languages such as TypeScript, Python, Java, Go, etc.

A simple example of how to create an S3 bucket using AWS CDK with TypeScript:

import * as cdk from 'aws-cdk-lib';
import { Stack, StackProps } from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { Construct } from 'constructs';

export class MyS3BucketStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Create an S3 bucket
    new s3.Bucket(this, 'MyBucket', {
      bucketName: 'my-unique-bucket-name', // Replace with your desired bucket name
    });
  }
}

// App creation
const app = new cdk.App();
new MyS3BucketStack(app, 'MyS3BucketStack');

Now let's get deeper into the Terraform vs AWS CDK debate.

Terraform vs AWS CDK

Let's go point by point and understand the differences between these tools. I will use some examples to illustrate a few of these points, so please stick around till the end to understand the comparison better.

Support

Terraform is cloud agnostic, meaning it supports multiple cloud providers such as AWS, Azure, GCP, and Alibaba Cloud.

AWS CDK is built by the AWS team specifically for AWS Cloud.

So, if you are working with different cloud providers, Terraform is the best choice. Some cloud providers have their own IaC tools, such as Bicep for Azure, but they are still in the early stages of adoption.

Language

As said earlier, Terraform uses HCL, a domain-specific language made by HashiCorp specifically for Terraform. HCL is a configuration language similar to JSON and is fairly easy to learn.

AWS CDK supports general-purpose programming languages such as TypeScript, Python, Java, etc., so there is no need to learn a new language. You can choose the programming language you are familiar with and start writing IaC.

Programming languages are robust and flexible. You can use programming paradigms such as OOP and functional programming in your IaC, which has many advantages over a DSL (domain-specific language).
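As a hedged illustration of this point, here is a plain-TypeScript sketch (the `BucketSpec` interface and `bucketFor` function are invented for this example, with simple objects standing in for CDK constructs) showing how an ordinary function plus a loop can stamp out per-environment resource definitions without copy-paste:

```typescript
// Sketch: plain TypeScript objects standing in for CDK constructs,
// to show how loops and functions remove repetition in IaC definitions.
interface BucketSpec {
  name: string;
  versioned: boolean;
}

// One function produces a consistent spec per environment.
function bucketFor(env: string): BucketSpec {
  return {
    name: `my-app-${env}-assets`,
    versioned: env === "prod", // only prod keeps object history
  };
}

const specs = ["dev", "staging", "prod"].map(bucketFor);
console.log(specs.map(s => s.name).join(","));
// → my-app-dev-assets,my-app-staging-assets,my-app-prod-assets
```

In real CDK code, the loop body would call `new s3.Bucket(this, ...)` with these values; expressing such conditionals and loops is exactly what HCL's `count` and `for_each` approximate with more ceremony.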

State management

Terraform uses the terraform.tfstate file to store the state of deployed resources. This file keeps track of the infrastructure's current state, including deployed resources and their configurations. Typically, this state file is stored centrally, often in an S3 bucket, which facilitates collaboration among various teams.

In my own experience, I’ve found that managing Terraform state can be quite messy. It’s pretty common to run into state drifts within your project, and resolving them tends to eat up a lot of time.
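For context, remote state is usually configured with a backend block; a minimal sketch, assuming an S3 bucket and a DynamoDB lock table that you have created beforehand (names are placeholders):

```hcl
# Sketch: store state centrally in S3 with DynamoDB locking.
# Bucket and table names are placeholders; create them before `terraform init`.
terraform {
  backend "s3" {
    bucket         = "my-terraform-state-bucket"
    key            = "my-project/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-locks" # enables state locking
    encrypt        = true
  }
}
```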

AWS CDK feels stateless from the developer's side: your code is synthesized into CloudFormation templates, and CloudFormation tracks the deployed state on the AWS side, so there is no state file for you to manage yourself. You rely on version control systems like Git for managing your code. Each time you deploy, you pull the most recent changes, address any conflicts that arise, and proceed with the deployment. This approach is tidy and ensures consistency.

Versioning

We’re all familiar with AWS continually expanding its service offerings. Terraform ships as a standalone binary, so picking up new features means downloading and installing the latest version each time, which makes keeping infrastructure code up to date a challenging task. Many projects in our organization still rely on outdated versions, causing maintenance headaches.
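On the Terraform side, the usual mitigation is to pin versions in configuration so upgrades are at least explicit and reviewable; a sketch (the version constraints are illustrative):

```hcl
terraform {
  required_version = ">= 1.5.0" # illustrative CLI constraint

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0" # illustrative; allow minor/patch updates within v5
    }
  }
}
```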

By contrast, AWS CDK, being maintained by AWS itself, offers seamless updates: you simply bump a dependency. This ensures smoother transitions and fewer compatibility issues over time.

Connectivity

To understand this, let's just dive into an example directly.

For instance, consider a scenario where you need to trigger a Lambda function whenever an object is created in an S3 bucket. You can achieve this by combining S3 event notifications with that Lambda.

Let's take a look at the Terraform code below.

provider "aws" {
  region = "us-east-1"
}

resource "aws_s3_bucket" "my_bucket" {
  bucket = "my-bucket"
}

resource "aws_iam_role" "lambda_exec_role" {
  name = "lambda_exec_role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          Service = "lambda.amazonaws.com"
        }
      },
    ]
  })
}

resource "aws_lambda_function" "my_function" {
  filename      = "lambda_function_payload.zip"
  function_name = "my-function"
  role          = aws_iam_role.lambda_exec_role.arn
  handler       = "index.handler"
  runtime       = "nodejs18.x"
}

resource "aws_lambda_permission" "s3_invoke_permission" {
  statement_id  = "AllowExecutionFromS3Bucket"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.my_function.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.my_bucket.arn
}

resource "aws_s3_bucket_notification" "s3_notification" {
  bucket = aws_s3_bucket.my_bucket.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.my_function.arn
    events              = ["s3:ObjectCreated:*"]
  }

  depends_on = [aws_lambda_permission.s3_invoke_permission]
}

In Terraform, the concept revolves around treating everything as a resource. So, initially, you define your resources, such as the Lambda and the S3 bucket, which is straightforward. Then, to establish a connection between the Lambda function and S3 objects, you first need to grant permissions. This is achieved using the aws_lambda_permission resource. Once permissions are granted, you require yet another resource to link the Lambda function with S3 notifications: aws_s3_bucket_notification.

Now, let's see the CDK code to achieve this.

import { aws_lambda as lambda } from 'aws-cdk-lib';
import { aws_lambda_nodejs as lambda_nodejs } from 'aws-cdk-lib';
import { aws_lambda_event_sources as event_sources } from 'aws-cdk-lib';
import { aws_s3 as s3 } from 'aws-cdk-lib';
import { App, Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';

export class MyStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Create an S3 bucket
    const bucket = new s3.Bucket(this, 'MyBucket');

    // Create a Lambda function
    const fn = new lambda_nodejs.NodejsFunction(this, 'MyFunction', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'handler',
      entry: 'lambda/index.ts',
    });

    // Add an event source so the S3 bucket triggers the Lambda function
    fn.addEventSource(new event_sources.S3EventSource(bucket, {
      events: [s3.EventType.OBJECT_CREATED],
      filters: [{ prefix: 'uploads/' }], // Adjust the prefix as needed
    }));
  }
}

const app = new App();
new MyStack(app, 'MyStack');

In the above code, first we create an S3 bucket, then the Lambda function. As you can see, we can attach an S3 event source directly on the Lambda function object. This is neat, and it shows the power of programming.

The above code makes a lot more sense than using a resource block for everything in Terraform.

Security

When discussing security within AWS, IAM (Identity and Access Management) takes center stage. IAM roles and policies are crucial components utilized for managing user access and facilitating secure communication between services.

In the Terraform code, we used the aws_lambda_permission resource block to grant S3 the permissions necessary to invoke the Lambda. But in the CDK code, you can see we didn't do any such thing. That's because CDK sets up the IAM permissions under the hood, and this is more secure because CDK grants only the necessary permissions.

Let's take another example where we have a Lambda function that adds data to a DynamoDB table. As infra developers, we don't care how the Lambda works internally. We just need to create the Lambda and the DynamoDB table, and give the Lambda permission to write data to DynamoDB.

Let's look at the CDK code first.

import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';

export class MyStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // Create the DynamoDB table
    const table = new dynamodb.Table(this, 'MyTable', {
      partitionKey: { name: 'id', type: dynamodb.AttributeType.STRING },
      billingMode: dynamodb.BillingMode.PAY_PER_REQUEST,
    });

    // Create the Lambda function
    const myLambda = new lambda.Function(this, 'MyLambda', {
      code: lambda.Code.fromAsset('path/to/your/lambda/code'), // Adjust the path as necessary
      handler: 'index.handler',
      runtime: lambda.Runtime.NODEJS_18_X,
    });

    // Grant the Lambda function read/write access to the DynamoDB table
    table.grantReadWriteData(myLambda);
  }
}

const app = new cdk.App();
new MyStack(app, 'MyStack');

As you can see, this is concise and clean. We created the DynamoDB table and the Lambda, then simply called the table's grantReadWriteData method and passed the Lambda to it. This takes care of all the necessary IAM permissions.

Now, let’s see the same code in Terraform

provider "aws" {
  region = "us-east-1"
}

resource "aws_dynamodb_table" "my_table" {
  name         = "MyTable"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}

resource "aws_iam_role" "lambda_execution_role" {
  name = "lambda_execution_role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          Service = "lambda.amazonaws.com"
        }
      },
    ]
  })
}

resource "aws_iam_policy" "ddb_access_policy" {
  name        = "ddb_access_policy"
  description = "A policy that allows lambda to access DynamoDB table"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = [
          "dynamodb:PutItem",
          "dynamodb:UpdateItem",
          "dynamodb:GetItem",
          "dynamodb:Scan",
          "dynamodb:Query",
          "dynamodb:DeleteItem"
        ]
        Effect   = "Allow"
        Resource = aws_dynamodb_table.my_table.arn
      },
    ]
  })
}

resource "aws_iam_role_policy_attachment" "ddb_access_attachment" {
  role       = aws_iam_role.lambda_execution_role.name
  policy_arn = aws_iam_policy.ddb_access_policy.arn
}

resource "aws_lambda_function" "my_lambda" {
  function_name = "MyLambdaFunction"

  filename         = "path/to/your/deployment/package.zip"
  source_code_hash = filebase64sha256("path/to/your/deployment/package.zip")
  handler          = "index.handler"
  runtime          = "nodejs18.x"
  role             = aws_iam_role.lambda_execution_role.arn
}

You can see how tiresome this is. First we create the DynamoDB table, then a Lambda role. Then we create a policy specifying the necessary permissions and attach it to the role. At last, we create the Lambda and use this role.

This code is lengthy, and we are manually specifying permissions in the policy document, which can also cause access issues if we don't specify the correct permissions.

Integration

AWS CDK has strong integration with the AWS SDK. To understand this better, let's take Lambda functions as an example.

We know that AWS Lambda only supports Linux, and only two CPU architectures: x86_64 (AMD) and Arm64. AWS also supports various runtimes like Python, Node.js, Go, etc.

Now, if your Lambda is Arm-based with a Python runtime and a developer is working on a Windows machine, they cannot bundle the code on their OS, as the resulting artifacts will not be compatible with the Lambda runtime. This can be solved by using a pipeline with an Arm-based Linux image for building the code.

In CDK, you can easily deploy the code to Lambda. AWS CDK offers different constructs for different runtimes:

@aws-cdk/aws-lambda-python-alpha » PythonFunction
aws-cdk-lib » aws_lambda_nodejs » NodejsFunction
@aws-cdk/aws-lambda-go-alpha » GoFunction

If you use these packages, you can point to the Lambda source code and you don't even have to bundle the code yourself. You can also choose your own bundling options: esbuild for Node.js and Docker for other runtimes are good choices.

// (Method inside a Stack class; imports shown for context.)
import { join } from "path";
import { AssetHashType, Duration } from "aws-cdk-lib";
import * as lambda from "aws-cdk-lib/aws-lambda";
import { PythonFunction } from "@aws-cdk/aws-lambda-python-alpha";

private createGetData(): PythonFunction {
  const fn = new PythonFunction(this, "GetData", {
    entry: join(__dirname, "..", "src", "get-data"),
    runtime: lambda.Runtime.PYTHON_3_11,
    bundling: {
      assetExcludes: [".venv"],
      assetHashType: AssetHashType.SOURCE,
    },
    timeout: Duration.minutes(2),
    index: "lambda_handler.py",
    handler: "lambda_handler",
    memorySize: 128,
  });
  return fn;
}

In the above code, I just point to my Lambda code location in the entry parameter. As the runtime is Python, I only need two files: lambda_handler.py and requirements.txt.

This cannot be done in Terraform, which has no built-in bundling: you typically have to package the code yourself (or in external build tooling) before pointing Terraform at the artifact.
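The closest Terraform workaround I know of is the archive provider, which zips source files but, unlike PythonFunction, does not install requirements.txt dependencies; a sketch with illustrative paths (the IAM role is assumed to be defined elsewhere):

```hcl
# Zip the Lambda source at plan time; paths are illustrative.
data "archive_file" "get_data_zip" {
  type        = "zip"
  source_dir  = "${path.module}/src/get-data"
  output_path = "${path.module}/build/get-data.zip"
}

resource "aws_lambda_function" "get_data" {
  function_name    = "get-data"
  filename         = data.archive_file.get_data_zip.output_path
  source_code_hash = data.archive_file.get_data_zip.output_base64sha256
  handler          = "lambda_handler.lambda_handler"
  runtime          = "python3.11"
  role             = aws_iam_role.lambda_exec_role.arn # assumed defined elsewhere
}
```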

Note: This is very useful if you are doing full-stack development in AWS, that is, if you are working with the AWS SDK while working on AWS infra creation in parallel.

Final Note

In my view, AWS CDK surpasses Terraform based on the points discussed. However, I recognize this as a subjective opinion that may vary depending on the specific needs of your project and organization. It’s essential to evaluate and choose the appropriate Infrastructure as Code (IaC) tool based on your unique requirements and circumstances.

My Project

I built a project called Candletower (www.candletower.com). It’s a website that provides stock market insights based on candlestick pattern analysis. If you invest in, or are getting into, the stock market, I highly recommend going through this website. No ads, no login, completely free. Let me know your thoughts.

Finally, thanks for reading.
