Introduction
If you’ve worked with AWS CodeBuild, you know the buildspec file is where everything comes together. It’s a YAML file that tells CodeBuild what to do at each step. Get it right and your pipeline runs smoothly. Get it wrong and you’ll spend hours digging through logs.
This guide covers the basics and some patterns I’ve found useful. The examples focus on Node.js, Python, and Go since those are what I work with most.
Basic Structure
Quick reference for what goes into a buildspec:
- Location: Root directory of your source (or specify a different path in CodeBuild settings)
- Version: Use 0.2 (runs all commands in the same shell instance)
- env: Environment variables, Parameter Store refs, Secrets Manager refs
- phases: Commands that run at each stage
  - install: Set up dependencies and runtimes
  - pre_build: Prep work before building
  - build: Main build commands
  - post_build: Cleanup, push artifacts, notifications
- artifacts: What files to output
- reports: Test report configuration
- cache: Paths to cache between builds
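Putting those sections together, a minimal skeleton might look like the sketch below. The report group name and file paths are placeholders; the reports block assumes your test runner emits JUnit XML.

```yaml
version: 0.2

env:
  variables:
    NODE_ENV: "production"

phases:
  install:
    runtime-versions:
      nodejs: 20
    commands:
      - npm ci
  build:
    commands:
      - npm test
      - npm run build

reports:
  unit-tests:              # report group name (placeholder)
    files:
      - 'junit.xml'        # assumes the test runner writes JUnit XML here
    file-format: JUNITXML

artifacts:
  files:
    - 'dist/**/*'

cache:
  paths:
    - 'node_modules/**/*'
```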
Core Components
Version
Always use 0.2. It runs all commands in the same shell, so environment variables and working-directory changes persist between commands; the legacy 0.1 spec started a fresh shell for every command.
version: 0.2
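One way to see the difference: under 0.2, a variable exported by one command is still set in the next (under 0.1 it would have vanished with its shell). The variable name here is just for illustration.

```yaml
phases:
  build:
    commands:
      - export BUILD_STAMP=$(date +%s)   # set in one command...
      - echo "stamp is $BUILD_STAMP"     # ...still available in the next
```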
Environment Variables
Three ways to define them:
env:
  variables:
    NODE_ENV: "production"
  parameter-store:
    DB_HOST: "/app/prod/db-host"
  secrets-manager:
    DB_PASSWORD: "prod/db:password"
Phases
Each phase runs in sequence. If a phase fails, later phases are skipped, with one exception: post_build still runs even when build fails. You can override the default per phase with the on-failure setting.
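As a sketch, a phase can be marked to continue on failure with buildspec 0.2's on-failure setting, which is handy for non-blocking steps like lint:

```yaml
phases:
  pre_build:
    on-failure: CONTINUE   # don't abort the build if lint fails
    commands:
      - npm run lint
  build:
    commands:
      - npm run build
```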
Artifacts
Defines build output:
artifacts:
  files:
    - 'dist/**/*'
  name: build-$CODEBUILD_BUILD_NUMBER
Cache
Speeds up builds by reusing dependencies:
cache:
  paths:
    - 'node_modules/**/*'
    - '/root/.npm/**/*'
Critical Fix: ECR Login Command
If you’re pushing Docker images to ECR, you need to know about this. The old login command was removed in AWS CLI v2 and will break your builds.
Old Command (Deprecated)
# DEPRECATED - removed in AWS CLI v2
# DO NOT USE - keeping for reference only
- $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION)
On CodeBuild standard images 5.0 and later, which ship AWS CLI v2, this subcommand no longer exists, so the build fails at the login step with an invalid-command error.
Current Command (Use This)
- aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
The new command pipes the password directly to docker login, which is more secure since credentials don’t appear in shell history or logs.
Why This Matters
- CodeBuild image 4.0 = AWS CLI v1 (old command works)
- CodeBuild image 5.0+ = AWS CLI v2 (only new command works)
If your builds suddenly fail after an image upgrade, this is probably why.
Practical Examples
Example 1: Node.js Application
Basic Node.js build with npm:
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 20
    commands:
      - echo "Installing dependencies"
      - npm ci
  pre_build:
    commands:
      - echo "Running lint"
      - npm run lint
  build:
    commands:
      - echo "Running tests"
      - npm test
      - echo "Building"
      - npm run build
  post_build:
    commands:
      - echo "Build complete"

artifacts:
  files:
    - 'dist/**/*'
    - 'package.json'
  base-directory: '.'

cache:
  paths:
    - 'node_modules/**/*'
Example 2: Node.js with ECR Push
Building a Docker image and pushing to ECR:
version: 0.2

env:
  variables:
    IMAGE_REPO_NAME: "my-node-app"
    IMAGE_TAG: "latest"

phases:
  install:
    runtime-versions:
      nodejs: 20
      docker: 20
  pre_build:
    commands:
      - echo "Logging in to ECR"
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
      - COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
      - IMAGE_TAG=${COMMIT_HASH:=latest}
  build:
    commands:
      - echo "Building Docker image"
      - docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
      - docker tag $IMAGE_REPO_NAME:$IMAGE_TAG $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
  post_build:
    commands:
      - echo "Pushing to ECR"
      - docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
      - printf '[{"name":"app","imageUri":"%s"}]' $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG > imagedefinitions.json

artifacts:
  files:
    - imagedefinitions.json
Example 3: Python Script/Lambda Package
Python without frameworks, packaging for Lambda or general use:
version: 0.2

phases:
  install:
    runtime-versions:
      python: 3.12
    commands:
      - echo "Installing dependencies"
      - pip install -r requirements.txt -t ./package
  pre_build:
    commands:
      - echo "Running linter"
      - pip install flake8
      - flake8 src/
  build:
    commands:
      - echo "Running tests"
      - pip install pytest
      - pytest tests/
      - echo "Creating deployment package"
      - cd package && zip -r ../function.zip . && cd ..
      - zip -gr function.zip src/
  post_build:
    commands:
      - echo "Package created"

artifacts:
  files:
    - function.zip

cache:
  paths:
    - '/root/.cache/pip/**/*'
Example 4: Go Application
Basic Go build:
version: 0.2

env:
  variables:
    GO111MODULE: "on"

phases:
  install:
    runtime-versions:
      golang: 1.21
  pre_build:
    commands:
      - echo "Downloading dependencies"
      - go mod download
      - echo "Running vet"
      - go vet ./...
  build:
    commands:
      - echo "Running tests"
      - go test ./...
      - echo "Building binary"
      - CGO_ENABLED=0 GOOS=linux go build -o app ./cmd/main.go
  post_build:
    commands:
      - echo "Build complete"

artifacts:
  files:
    - app

cache:
  paths:
    - '/go/pkg/mod/**/*'
Example 5: EKS Deployment
Building and deploying to EKS:
version: 0.2

env:
  variables:
    EKS_CLUSTER_NAME: "my-cluster"
    IMAGE_REPO_NAME: "my-app"

phases:
  install:
    runtime-versions:
      docker: 20
    commands:
      - echo "Installing kubectl"
      - curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl"
      - chmod +x kubectl
      - mv kubectl /usr/local/bin/
  pre_build:
    commands:
      - echo "Logging in to ECR"
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
      - COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
      - IMAGE_TAG=${COMMIT_HASH:=latest}
      - echo "Building image"
      - docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
      - docker tag $IMAGE_REPO_NAME:$IMAGE_TAG $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
      - docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
  build:
    commands:
      - echo "Configuring kubectl"
      - aws eks update-kubeconfig --name $EKS_CLUSTER_NAME --region $AWS_DEFAULT_REGION
      - echo "Updating deployment"
      - kubectl set image deployment/my-app app=$AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
      - kubectl rollout status deployment/my-app --timeout=300s
  post_build:
    commands:
      - echo "Deployment complete"
Example 6: Conditional Builds
Different behavior based on branch:
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 20
    commands:
      - npm ci
  build:
    commands:
      - |
        if [ "$CODEBUILD_WEBHOOK_TRIGGER" = "branch/main" ]; then
          echo "Main branch - running full build"
          npm run test:all
          npm run build:prod
        else
          echo "Feature branch - quick build"
          npm run test
          npm run build
        fi
  post_build:
    commands:
      - echo "Done"

artifacts:
  files:
    - 'dist/**/*'
Best Practices
Use npm ci Instead of npm install
npm ci is faster and more reliable for CI environments. It installs exact versions from package-lock.json and deletes node_modules first.
Cache Dependencies
This can cut build times significantly:
cache:
  paths:
    - 'node_modules/**/*'       # Node
    - '/root/.npm/**/*'         # npm global cache
    - '/root/.cache/pip/**/*'   # Python
    - '/go/pkg/mod/**/*'        # Go modules
Handle Secrets Properly
Never hardcode credentials:
env:
  secrets-manager:
    API_KEY: "prod/api:key"
  parameter-store:
    DB_HOST: "/app/db-host"
Check Build Status in post_build
The post_build phase runs even if build fails. Check the status:
post_build:
  commands:
    - |
      if [ "$CODEBUILD_BUILD_SUCCEEDING" = "1" ]; then
        echo "Build succeeded, pushing artifacts"
        # push to S3, notify Slack, etc.
      else
        echo "Build failed"
        # send alert
      fi
Use Specific Runtime Versions
Don’t rely on defaults. Specify versions explicitly:
install:
  runtime-versions:
    nodejs: 20
    python: 3.12
Troubleshooting
ECR Login Fails
Check which AWS CLI version your image uses:
pre_build:
  commands:
    - aws --version
If it’s v2, use the get-login-password command.
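If a buildspec has to run on both image generations, you can branch on the CLI major version. The helper below is a hypothetical sketch that just parses an `aws --version` string; in a buildspec you would call it as `aws_cli_major "$(aws --version 2>&1)"` (v1 prints its version to stderr) and pick the login command accordingly.

```shell
# Hypothetical helper: extract the major version from an `aws --version`
# string, e.g. "aws-cli/2.15.30 Python/3.11.6 Linux" -> 2
aws_cli_major() {
  echo "$1" | sed -n 's|^aws-cli/\([0-9][0-9]*\)\..*|\1|p'
}

aws_cli_major "aws-cli/2.15.30 Python/3.11.6 Linux/5.10 exe/x86_64"   # prints 2
aws_cli_major "aws-cli/1.29.0 Python/3.8.10 Linux/5.4"                # prints 1
```

If the result is 2, use get-login-password piped into docker login; otherwise fall back to the old get-login command.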
Build Works Locally But Fails in CodeBuild
Common causes:
- Different Node/Python version
- Missing environment variables
- File permissions
- Dependencies not in package-lock.json
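A quick way to narrow these down is to print versions and the environment at the start of the build. This is a sketch; swap the version commands for whatever runtimes your project uses.

```yaml
pre_build:
  commands:
    - node --version || true    # or python --version / go version
    - npm --version || true
    - env | sort                # see which variables are actually set
    - ls -la                    # confirm the working-directory contents
```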
Artifacts Not Found
Make sure your paths are correct. Use ls -la to debug:
post_build:
  commands:
    - ls -la
    - ls -la dist/