Bash Scripting Meets AWS: A DevOps Love Story
This comprehensive guide explores the synergy between Bash scripting and AWS. Learn about real-world applications, cautionary tales, and even some mind-blowing possibilities when you harness these tools.
Hey there, tech enthusiast! Ever wonder why peanut butter and jelly taste so good together? They’re just two spreads, but something magical happens when they join forces on a slice of bread. Today, we’re talking about another dynamic duo that may not be as tasty but is equally amazing: Bash scripting and AWS (Amazon Web Services).
If you’ve landed on this article, chances are you have some interest in tech. Maybe you’ve dabbled in a bit of code, or maybe you’ve heard AWS being thrown around in conversations and you’re like, “What the heck is that?” Either way, you’re in for a treat.
So, what’s Bash scripting? Think of it like writing a to-do list for your computer. “Hey computer, first do this, then do that. Thanks, you’re awesome!” But what if that to-do list is massive and needs to be done repeatedly, securely, and quickly? Enter AWS, the superhero butler who not only takes your list but also performs all the tasks with flair and precision.
Sounds cool, right? Stick around because by the end of this article, you’re going to see why Bash scripting and AWS are indeed the PB&J of the tech world.
Introduction
In the world of Bash scripting and AWS (Amazon Web Services), the synergy between these two powerful tools can transform your DevOps workflows. Whether you’re automating server checks, managing scalable resources, or deploying applications, understanding how Bash and AWS complement each other is essential for modern cloud computing and DevOps practices.
Why Bash and AWS?
- Automation: Streamline repetitive tasks by automating them with Bash scripts and leveraging AWS services.
- Scalability: Easily scale your applications and infrastructure using AWS while managing configurations with Bash.
- Cost-Efficiency: Optimize costs by automating resource management and scaling based on demand.
- Flexibility: Combine the simplicity of Bash with the robustness of AWS to handle a wide range of DevOps tasks.
Understanding Bash Scripting
What Is Bash Scripting?
In the simplest terms, a Bash script is a text file containing a series of commands for a computer running a Unix-based operating system (like Linux or macOS) to execute. It’s like writing down a to-do list for your computer:
echo "Hello, World!"
Save this in a file called hello-world.sh, give it executable permission with chmod +x hello-world.sh, and run it by typing ./hello-world.sh in the terminal. Voila! Your first Bash script.
Why Should I Care?
We live in a world that runs on automation. Just as washing clothes by hand or turning on the TV without a remote is cumbersome, manually executing repetitive tasks in your workflows can be inefficient and error-prone. Bash scripts act as the unseen heroes that handle heavy lifting—automating backups, managing files, processing data, and more.
Let’s See It in Action
Here’s a simple Bash script that automates the creation of a backup directory and copies files into it:
#!/bin/bash
# Create a backup directory with the current date
backup_dir="/backup/$(date +%F)"
mkdir -p "$backup_dir"
# Copy all .txt files to the backup directory
cp /home/user/documents/*.txt "$backup_dir"
echo "Backup completed successfully."
Explanation:
- Shebang (#!/bin/bash): Specifies that the script should run in the Bash shell.
- Variables: Define paths and filenames dynamically.
- Commands: mkdir -p creates directories, cp copies files.
- Echo: Provides feedback upon completion.
This script automates the backup process, ensuring your important .txt files are safely stored without manual intervention.
Say Hello to AWS (Amazon Web Services)
What is AWS, Anyway?
Imagine you have a massive shopping list, like for a Thanksgiving feast. Would you rather do all the shopping, prepping, and cooking yourself, or get some help? If you said “help,” then AWS is the gourmet chef, chauffeur, and personal shopper you’ve been dreaming of.
In tech terms, AWS is a comprehensive cloud computing platform that offers a broad set of services, including computing power, storage, databases, machine learning, and more. You can rent these resources without owning or maintaining any hardware, allowing you to focus on building and deploying applications.
AWS Services 101
You might have heard of some AWS services like EC2 for computing, S3 for storage, and Lambda for serverless computing. Here’s a quick overview:
- EC2 (Elastic Compute Cloud): Virtual servers in the cloud for running applications.
- S3 (Simple Storage Service): Scalable object storage for data backup, archival, and analytics.
- Lambda: Run code without provisioning or managing servers; ideal for event-driven applications.
- RDS (Relational Database Service): Managed relational databases like MySQL, PostgreSQL, and Oracle.
- IAM (Identity and Access Management): Manage user access and permissions securely.
AWS offers a whole buffet of services, ensuring there’s something for every need.
The Bridge Between AWS and Bash
Remember our Bash to-do list? With AWS, you can execute these lists (or scripts) at scale. Need to back up a ton of data every night? Write a Bash script and let an EC2 instance handle the heavy lifting. Want to run complex analyses? AWS Lambda can execute your scripts without worrying about server management. Furthermore, AWS provides the AWS CLI (Command Line Interface), which allows you to manage AWS services directly from your Bash scripts, making the integration seamless.
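For example, here is a hedged sketch of a Bash script that asks the CLI for every EC2 instance in one region; the region name and output style are just illustrative choices:
#!/bin/bash
# List the ID and state of every EC2 instance in one region (region is an example)
aws ec2 describe-instances \
  --region us-east-1 \
  --query "Reservations[].Instances[].[InstanceId,State.Name]" \
  --output table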
The Connection—Why They’re BFFs
You’ve got the bread (Bash scripting), and you’ve got the peanut butter and jelly (AWS). Now, how do we make this sandwich work? Let’s slice right in, shall we?
Automation: Two Heads Are Better Than One
Bash gives you the ability to automate tasks, while AWS takes that automation and scales it up. You’re no longer a lone coder; you’re an orchestra conductor leading a symphony of cloud services. For example, you can write a Bash script that deploys your application to multiple AWS regions, ensuring high availability and resilience.
Money Matters: Being Cost-Effective
We’re all in it to make our lives easier, but if we can save some cash while doing it, that’s the cherry on top. AWS’s pay-as-you-go model combined with the automation capabilities of Bash scripting can be a total game-changer. Running scripts during off-peak hours or using reserved instances can save you a ton of money. Bash + AWS = Budget Bliss.
Real-World Example: Backing Up to S3
Imagine you have a bunch of important files you can’t afford to lose. You could write a Bash script to back them up to an external drive, but what if you could back them up to a virtually infinite and secure cloud storage? Here’s a simple Bash script snippet to copy files to an S3 bucket:
#!/bin/bash
# Define variables
SOURCE_DIR="/home/user/documents"
DEST_BUCKET="s3://my-awesome-bucket/backup-$(date +%F)"
# Sync files to S3
aws s3 sync "$SOURCE_DIR" "$DEST_BUCKET"
echo "Backup to S3 completed successfully."
Explanation:
- aws s3 sync: Synchronizes files between your local directory and the S3 bucket.
- Dynamic Bucket Naming: Includes the current date for organized backups.
- Automation: Schedule this script with cron jobs to run automatically every night.
Output:
Backup to S3 completed successfully.
This one-liner can save your day, and you can schedule it to run automatically with cron jobs. AWS + Bash = Peace of Mind.
Scale, Scale, Scale!
AWS services can auto-scale based on the load. Imagine your Bash script is set up to process a file. What if you suddenly have a million files to process? With AWS services like EC2 Auto Scaling or Lambda, your script can scale from a local hero to a global superstar seamlessly.
By now, you should be getting the hang of why Bash scripting and AWS go together like peas in a pod. Or peanut butter and jelly, to stick with our initial metaphor. They complement each other’s strengths and make up for each other’s weaknesses. It’s a symbiotic relationship, like Batman and Robin, but with less spandex and more code.
Rolling Up Our Sleeves—Making Bash and AWS Work Together
By now, you’re probably pumped about the possibilities of Bash scripting and AWS. But let’s move from the “what” and “why” to the “how.” Ready to roll up your sleeves? Let’s dig in.
Prerequisites: What You’ll Need
Before we start cooking up our Bash-AWS combo, you’ll need a couple of things:
- An AWS Account: If you don’t have one, you can sign up at https://aws.amazon.com/.
- AWS CLI Installed: This is your command-line tool for AWS. Installation instructions are in the AWS CLI documentation (see Sources and Links below).
- Basic Bash Knowledge: If you’ve been following along, you should be good here.
Step 1: Setting Up Your AWS CLI
Once you’ve got the AWS CLI installed, you’ll need to configure it. Open up your terminal and type:
aws configure
You’ll be prompted to enter your AWS Access Key, Secret Key, default region, and output format. Don’t worry; it’s like setting up a new phone—just follow the steps.
Example Configuration:
AWS Access Key ID [None]: YOUR_ACCESS_KEY
AWS Secret Access Key [None]: YOUR_SECRET_KEY
Default region name [None]: us-east-1
Default output format [None]: json
Note: For enhanced security, consider using IAM Roles instead of hardcoding your credentials, especially when running scripts on AWS resources like EC2 instances.
Step 2: Write Your Bash Script
Here’s where your Bash skills come into play. Let’s write a simple script to list all your S3 buckets. Create a file called list-s3-buckets.sh and put in the following:
#!/bin/bash
# List all S3 buckets
aws s3 ls
Don’t forget to make it executable:
chmod +x list-s3-buckets.sh
Step 3: Combine and Conquer
Run your script by typing ./list-s3-buckets.sh in your terminal. Boom! You just combined Bash and AWS to list all your S3 buckets. It’s like making a sandwich, but instead of eating it, you’re organizing your cloud resources.
Sample Output:
2024-04-30 07:00:00 my-awesome-bucket
2024-04-30 07:05:22 another-bucket
Pro Tips: Going Beyond
Cron Jobs: Use cron jobs to schedule your Bash scripts. For example, you could automatically back up files to S3 every night.
Example Cron Entry:
0 2 * * * /home/user/scripts/backup-to-s3.sh >> /home/user/logs/backup.log 2>&1
AWS SDKs: For more advanced functionalities within your Bash scripts, consider using AWS SDKs (e.g., Boto3 for Python) alongside Bash to handle complex tasks.
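Example (a rough sketch of that idea; the bucket name is a placeholder, and it assumes Python 3 and Boto3 are installed on the machine running the script):
#!/bin/bash
# Count the objects in a bucket by handing the work to Boto3 from inside a Bash script
BUCKET="my-awesome-bucket"
python3 - "$BUCKET" <<'EOF'
import sys
import boto3

bucket = sys.argv[1]
s3 = boto3.resource("s3")
count = sum(1 for _ in s3.Bucket(bucket).objects.all())
print(f"{bucket} contains {count} objects")
EOF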
Logging: Implement logging within your scripts to track their execution and catch errors early.
Example Logging:
#!/bin/bash
LOGFILE="/var/log/s3_backup.log"
{
  echo "Backup started at $(date)"
  aws s3 sync /home/user/documents "s3://my-awesome-bucket/backup-$(date +%F)"
  echo "Backup completed at $(date)"
} >> "$LOGFILE" 2>&1
Some Real-World Use Cases
By now, you’re probably itching to know how all this theory applies in the real world. Well, itch no more, my friend. Let’s look at some real-world case studies where Bash and AWS join forces.
Case Study 1: Automating Server Health Checks
Imagine you’re running an online store. Your server’s health is crucial, especially during high-traffic events like Black Friday. A simple Bash script can be set to run on an AWS EC2 instance, checking server health metrics like CPU usage or disk space. If anything goes off the rails, the script could automatically send an alert or even trigger auto-scaling.
Sample Script:
#!/bin/bash
# Check CPU usage
CPU_USAGE=$(top -bn1 | grep "Cpu(s)" | awk '{print $2 + $4}')
# Threshold for CPU usage
THRESHOLD=80
if (( $(echo "$CPU_USAGE > $THRESHOLD" | bc -l) )); then
echo "High CPU usage detected: $CPU_USAGE%" | mail -s "Server Alert" admin@example.com
# Trigger AWS Auto Scaling or other actions
fi
Explanation:
- CPU_USAGE: Captures the current CPU usage by parsing the output of the top command.
- THRESHOLD: Sets the CPU usage threshold that triggers alerts or scaling.
- Condition: If CPU usage exceeds the threshold, an email alert is sent, and additional AWS actions can be triggered.
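If you’d rather route the alert through AWS instead of a local mail command, a hedged variation of the same check could publish to an SNS topic; the topic ARN below is a placeholder you would replace with your own:
#!/bin/bash
# Publish a high-CPU alert to an SNS topic instead of sending local mail (ARN is an example)
CPU_USAGE=$(top -bn1 | grep "Cpu(s)" | awk '{print $2 + $4}')
THRESHOLD=80
TOPIC_ARN="arn:aws:sns:us-east-1:123456789012:server-alerts"

if (( $(echo "$CPU_USAGE > $THRESHOLD" | bc -l) )); then
  aws sns publish \
    --topic-arn "$TOPIC_ARN" \
    --subject "Server Alert" \
    --message "High CPU usage detected: ${CPU_USAGE}%"
fi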
Case Study 2: Auto-Scaling Resources
Let’s say you run a blog that suddenly goes viral because a celebrity tweeted about it. A Bash script could use AWS CLI commands to check traffic. When the numbers hit a certain threshold, AWS Auto Scaling kicks in, increasing your EC2 instances to meet demand. You’re a hit, and your site stays up!
Sample Script:
#!/bin/bash
# Fetch current traffic metrics
TRAFFIC=$(curl -s https://api.yourblog.com/traffic | jq '.current_visitors')
# Threshold for scaling
THRESHOLD=1000
if [ "$TRAFFIC" -gt "$THRESHOLD" ]; then
aws autoscaling set-desired-capacity --auto-scaling-group-name my-asg --desired-capacity 5
echo "Auto-scaling triggered due to high traffic: $TRAFFIC visitors."
fi
Explanation:
- TRAFFIC: Retrieves current visitor count from your blog’s API.
- THRESHOLD: Sets the visitor threshold to trigger scaling.
- Condition: If traffic exceeds the threshold, AWS Auto Scaling adjusts the number of EC2 instances accordingly.
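One refinement worth considering, sketched below with placeholder names and sizes, is to read the group’s current desired capacity first so the script only scales up when it actually needs to:
#!/bin/bash
# Only scale up if the Auto Scaling group is below the target size (names and sizes are examples)
ASG_NAME="my-asg"
TARGET_CAPACITY=5

CURRENT=$(aws autoscaling describe-auto-scaling-groups \
  --auto-scaling-group-names "$ASG_NAME" \
  --query "AutoScalingGroups[0].DesiredCapacity" \
  --output text)

if [ "$CURRENT" -lt "$TARGET_CAPACITY" ]; then
  aws autoscaling set-desired-capacity \
    --auto-scaling-group-name "$ASG_NAME" \
    --desired-capacity "$TARGET_CAPACITY"
  echo "Scaled $ASG_NAME from $CURRENT to $TARGET_CAPACITY instances."
fi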
Case Study 3: Automated Deployments
DevOps folks, this one’s for you. Imagine you’ve just finished coding a new feature for your app. With a Bash script, you can automate the entire deployment process on AWS, from spinning up new instances to handling database migrations and finally, rolling out the update. Less manual work, less room for error.
Sample Deployment Script:
#!/bin/bash
# Pull the latest code
git pull origin main
# Build the application
./build.sh
# Deploy to AWS Elastic Beanstalk
eb deploy my-environment
echo "Deployment completed successfully."
Explanation:
- git pull: Updates the codebase with the latest changes.
- ./build.sh: Executes your build process.
- eb deploy: Deploys the application to the specified Elastic Beanstalk environment.
- Echo: Provides confirmation of successful deployment.
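In practice you’d also want the script to stop at the first failure rather than deploy a broken build. Here is a hedged sketch of the same flow with basic error handling added; the environment name is still just an example:
#!/bin/bash
# Abort on any failed command, unset variable, or failed pipeline stage
set -euo pipefail

echo "Pulling latest code..."
git pull origin main

echo "Building application..."
./build.sh

echo "Deploying to AWS Elastic Beanstalk..."
eb deploy my-environment

echo "Deployment completed successfully."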
Things to Watch Out For
While Bash scripting and AWS are a powerful combination, there are some gotchas and pitfalls to watch out for.
Potential Pitfalls: “Don’t Be the Person Who Accidentally Deletes an Entire S3 Bucket.”
Mistakes happen, but when working with powerful tools like Bash and AWS, they can be costly. One wrong command can literally delete an entire S3 bucket. Always double-check your scripts and consider implementing safeguards.
Best Practices:
Use AWS CLI Commands Correctly: Ensure that you are using the correct AWS CLI commands (aws s3 rm for deleting S3 objects) instead of local commands like rm.
Example of Correct AWS CLI Usage:
# Correctly delete only .txt files from the S3 bucket
aws s3 rm s3://my-awesome-bucket/ --recursive --exclude "*" --include "*.txt"
Dry Runs: Utilize the --dryrun flag to simulate the command without making actual changes.
Example:
aws s3 rm s3://my-awesome-bucket/ --recursive --exclude "*" --include "*.txt" --dryrun
Confirmation Prompts: Add confirmation steps before executing destructive commands.
Example:
read -p "Are you sure you want to delete all .txt files in the bucket? (y/n): " CONFIRM if [ "$CONFIRM" = "y" ]; then aws s3 rm s3://my-awesome-bucket/ --recursive --exclude "*" --include "*.txt" echo "Bucket contents deleted." else echo "Deletion aborted." fi
Use IAM Policies: Restrict permissions to only what’s necessary. For example, grant delete permissions only to specific resources or actions.
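To make that concrete, here is a rough sketch of creating a narrowly scoped policy from a Bash script; the policy name, bucket name, and allowed actions are all illustrative assumptions:
#!/bin/bash
# Create an IAM policy that only allows deleting objects in one specific bucket
cat > backup-delete-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-awesome-bucket/*"
    }
  ]
}
EOF

aws iam create-policy \
  --policy-name backup-delete-only \
  --policy-document file://backup-delete-policy.json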
Security Considerations: Multi-Factor Authentication and IAM Roles
Security is paramount. Always secure your AWS account with Multi-Factor Authentication (MFA). When writing your Bash scripts, use AWS Identity and Access Management (IAM) roles to assign permissions. Never hardcode sensitive information like AWS keys in your scripts—use environment variables or AWS IAM roles instead.
Example of Using IAM Roles: Attach an IAM role to your EC2 instance with the necessary permissions. This way, your scripts can access AWS services securely without embedding credentials.
#!/bin/bash
# List all S3 buckets using IAM role permissions
aws s3 ls
Additional Tips:
Rotate Credentials Regularly: If you must use access keys, rotate them periodically.
Use Environment Variables: Store sensitive data in environment variables rather than in scripts.
Example:
#!/bin/bash
# Export AWS credentials
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"
# Use AWS CLI commands
aws s3 ls
Encrypt Sensitive Data: Use AWS KMS (Key Management Service) to encrypt sensitive information.
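As a minimal sketch of that idea (the key alias and file names are placeholders, and it assumes AWS CLI v2), you can encrypt and later decrypt a small secret directly from Bash:
#!/bin/bash
# Encrypt a small secret with a KMS key, then decrypt it again (key alias is an example)
aws kms encrypt \
  --key-id alias/my-backup-key \
  --plaintext fileb://secret.txt \
  --query CiphertextBlob --output text | base64 -d > secret.txt.enc

aws kms decrypt \
  --ciphertext-blob fileb://secret.txt.enc \
  --query Plaintext --output text | base64 -d > secret-decrypted.txt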
Interesting Facts About Bash and AWS You Probably Didn’t Know
Who doesn’t love a good trivia session? Especially when it comes to tech, knowing some lesser-known facts can not only make you the star at nerd parties but also help you appreciate the depth and richness of the tools you’re using. So, let’s drop some Bash and AWS knowledge bombs!
Fact 1: Bash’s Grandpa is 50 Years Old!
Believe it or not, Bash is a descendant of the original Unix Shell, which was developed way back in the ’70s. That’s right, the scripting language you’re using has a lineage that’s over 50 years old. Talk about standing the test of time!
Fact 2: AWS Spans More Than 100 Availability Zones
As of 2024, AWS operates more than 100 Availability Zones across over 30 geographic regions worldwide, with plans to add more. This vast infrastructure allows you to deploy your applications globally with low latency and high availability.
Fact 3: Bash Can Generate Random Numbers
Ever need to generate a random number? Bash has got you covered with the $RANDOM variable, which expands to a pseudo-random integer between 0 and 32767. Try running echo $RANDOM in your terminal and see what you get!
echo $RANDOM
Fact 4: AWS Was Launched in a Truck
In a way, yes! The idea of AWS was pitched in an Amazon offsite meeting in 2003. Jeff Bezos reportedly approved the project by writing the Amazon press release first—a backward process to figure out what they wanted to achieve. The offsite meeting happened in a box truck with a whiteboard installed inside it!
Fact 5: You Can Run Bash Scripts on AWS Lambda
AWS Lambda is typically known for running backend code in languages like Python or Node.js, but did you know you can run Bash scripts as well? By using a custom runtime or a Lambda Layer, your trusty Bash scripts can live in the serverless world too!
Example Using a Lambda Layer: Create a Lambda Layer that includes the Bash runtime and your scripts, then configure your Lambda function to use this layer.
#!/bin/bash
# Example of running a Bash script in AWS Lambda using a custom runtime
echo "Hello from Bash on AWS Lambda!"
Bash-AWS Horror Stories: Lessons from the Dark Side
Sometimes, learning what not to do can be as valuable as learning what to do. So sit back, relax, and let’s dive into some real-life Bash and AWS horror stories.
Horror Story 1: The “$200,000 Mistake”
A developer wanted to automate the deletion of old, unused AWS EC2 instances. They wrote a Bash script that used AWS CLI commands to find and terminate these instances. Sounds great in theory, right? Wrong. The script had a bug, and it terminated all instances, including production servers. Result? Several hours of downtime and an estimated cost of $200,000 in lost revenue and recovery efforts.
Lesson Learned:
- Thorough Testing: Always test scripts in a staging environment before running them in production.
- Implement Safeguards: Use filters and confirmation steps to prevent mass deletions.
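A safer version of that cleanup script might look something like this hedged sketch: it only targets stopped instances carrying a specific tag, shows what it found, and asks before terminating anything (the tag key and value are assumptions):
#!/bin/bash
# Terminate only instances explicitly tagged for cleanup, and confirm first
INSTANCE_IDS=$(aws ec2 describe-instances \
  --filters "Name=tag:Environment,Values=scratch" "Name=instance-state-name,Values=stopped" \
  --query "Reservations[].Instances[].InstanceId" \
  --output text)

if [ -z "$INSTANCE_IDS" ]; then
  echo "No matching instances found. Nothing to do."
  exit 0
fi

echo "About to terminate: $INSTANCE_IDS"
read -p "Proceed? (y/n): " CONFIRM
if [ "$CONFIRM" = "y" ]; then
  # Intentionally unquoted so each instance ID is passed as its own argument
  aws ec2 terminate-instances --instance-ids $INSTANCE_IDS
else
  echo "Termination aborted."
fi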
Horror Story 2: “Dr. Strangelove: or How I Learned to Stop Worrying and Love the AWS CLI Command”
A system admin was responsible for daily backups on AWS. They used a Bash script to copy crucial data to an S3 bucket. One fine day, they decided to modify the script to delete local files that were successfully copied to S3. You can probably guess what happened next. Due to a typo in the aws s3 rm command, the script deleted all objects in the specified S3 bucket instead of just the intended ones.
Incorrect Command Example:
# Intended to delete only .txt files, but typo causes all files to be deleted
aws s3 rm s3://my-awesome-bucket/ --recursive
Corrected Command Example:
# Correctly delete only .txt files
aws s3 rm s3://my-awesome-bucket/ --recursive --exclude "*" --include "*.txt"
Result? Critical data was lost from the S3 bucket, leading to significant downtime and data recovery challenges.
Lesson Learned:
Command Verification: Always double-check AWS CLI commands, especially those that perform destructive actions.
Use Dry Runs: Utilize the --dryrun flag to simulate the command without making actual changes.
Example:
aws s3 rm s3://my-awesome-bucket/ --recursive --exclude "*" --include "*.txt" --dryrun
Implement Logging: Keep logs of script executions to track actions performed.
Least Privilege Principle: Restrict IAM permissions to allow only necessary actions, minimizing the risk of accidental deletions.
Horror Story 3: “Open Sesame! Or Maybe Not.”
In an attempt to automate the security settings for a new AWS environment, an admin wrote a Bash script that was supposed to tighten up security group settings. However, due to an oversight, the script did the exact opposite: it opened up all inbound and outbound ports, basically inviting the entire internet to a free-for-all. Not exactly the cybersecurity stance you want to have.
Lesson Learned:
- Review Security Configurations: Always review scripts that modify security settings.
- Use Least Privilege Principle: Ensure scripts grant only necessary permissions.
Lessons to Learn
- Double, Triple Check: Always, always double and triple-check your scripts. Test them in a controlled, non-production environment.
- Peer Review: Get someone else to look over your code. A fresh pair of eyes can catch mistakes you’ve become blind to.
- Be Cautious with Destructive Commands: Commands that delete or modify resources should be handled with kid gloves. Have confirmation steps or even manual interventions for these.
Conclusion
Bash scripting and AWS are a powerful combination that can significantly enhance your DevOps workflows. From automating simple tasks to managing complex, scalable infrastructures, mastering this dynamic duo can propel your productivity and efficiency to new heights. However, with great power comes great responsibility. Always prioritize security, thorough testing, and best practices to harness their full potential safely and effectively.
Embrace the synergy of Bash and AWS, and watch as your DevOps tasks become more streamlined, automated, and scalable. Whether you’re a seasoned developer or just starting your DevOps journey, understanding and leveraging these tools will undoubtedly make your tech life a whole lot easier.
FAQ
Q: How do I extract just the filename from a full path in Bash?
A: Use basename /path/to/your/file. This will give you just the filename without the path.
Q: Can I remove a file extension using Bash commands?
A: Yes, you can use basename with a suffix option, like basename /path/to/file.txt .txt, which will return file without the .txt extension. Alternatively, Bash parameter expansion handles this with ${filename%.*}.
Q: What if I need to handle filenames with multiple extensions, like .tar.gz?
A: You can use basename or parameter expansion for simple cases, but for complex manipulations, awk or sed might be more effective. For example, echo "archive.tar.gz" | awk -F'.' '{print $1}' will return archive.
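Example (assuming a variable named filename): the first expansion strips only the last extension, while the second strips everything after the first dot.
filename="archive.tar.gz"
echo "${filename%.*}"     # archive.tar (removes only the last extension)
echo "${filename%%.*}"    # archive     (removes everything from the first dot)
echo "$filename" | awk -F'.' '{print $1}'   # archive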
Q: How do I extract directories from a path without including the filename?
A: The dirname command will strip the filename from a path and return only the directory part. For instance, dirname /path/to/your/file.txt will return /path/to/your.
Q: Are there performance considerations when choosing between these methods?
A: Yes, using Bash’s built-in parameter expansion can be more efficient than spawning new processes for basename or dirname. However, for complex text manipulations, awk and sed might perform better despite the overhead.
Q: How can I handle filenames with spaces or special characters in my scripts?
A: Always quote your variables (e.g., "$file") and use IFS= read -r in loops to handle filenames with spaces or special characters correctly.
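Example (a short sketch of that pattern, assuming an illustrative directory of .txt files and combining it with find -print0 for null-delimited names):
#!/bin/bash
# Safely loop over filenames that may contain spaces or special characters
find /home/user/documents -name "*.txt" -print0 |
while IFS= read -r -d '' file; do
  echo "Processing: $file"
done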
Q: Can I use these commands to process directories as well as files?
A: Yes, basename and dirname can be used with directory paths just as they are with file paths to extract the last directory name or the parent directory path.
Q: How do I secure my AWS credentials in Bash scripts?
A: Use IAM roles assigned to your AWS resources instead of hardcoding credentials. If you must use credentials, store them in environment variables or AWS credentials files with proper permissions.
Suggestions for Further Exploration
- Explore Script Libraries: Look into existing Bash libraries and scripts shared by the community. Many common tasks have been solved efficiently by others. Websites like GitHub and GitLab host numerous repositories with valuable scripts.
- Combine Tools: Learn to combine tools like find, awk, sed, and Bash scripting to handle more complex scenarios that involve file and directory manipulations. Understanding how to create pipelines can vastly improve your scripting efficiency.
- Profile Scripts: Use tools like time and performance profiling in your scripts to understand the impact of different commands and techniques on script performance. This is especially useful for optimizing scripts that process large numbers of files.
- Advanced Bash Features: Delve into more advanced Bash features such as arrays, associative arrays, and functions to create more powerful and flexible scripts. Learning about error handling, traps, and debugging techniques can also enhance your scripting skills.
- Version Control Integration: Integrate your Bash scripts with version control systems like Git to manage changes and collaborate with others effectively. This practice ensures that your scripts are maintainable and trackable over time.
- Security Considerations: Learn about secure scripting practices to prevent vulnerabilities such as code injection, especially when dealing with user inputs or processing untrusted data.
- Cross-Platform Scripting: Explore how Bash scripting can be adapted for cross-platform environments, including Windows with tools like Cygwin or Windows Subsystem for Linux (WSL).
- Integration with Other Languages: Understand how Bash scripting can interact with other programming languages like Python or Ruby to leverage their strengths alongside Bash’s simplicity for certain tasks.
- Automated Testing for Scripts: Implement automated testing for your Bash scripts using frameworks like Bats to ensure reliability and catch bugs early in the development process.
- Stay Updated with Best Practices: The tech landscape evolves rapidly. Regularly follow reputable sources, blogs, and forums to stay updated with the latest best practices and advancements in Bash scripting and Unix/Linux file management.
Sources and Links
For practical examples and more detailed explanations:
- GNU Coreutils: https://www.gnu.org/software/coreutils/manual/
- AWK Manual: https://www.gnu.org/software/gawk/manual/
- SED Manual: https://www.gnu.org/software/sed/manual/
- Bash Guide: https://mywiki.wooledge.org/BashGuide
- Stack Overflow - Bash Questions: https://stackoverflow.com/questions/tagged/bash
- ShellCheck - Bash Linter: https://www.shellcheck.net/
- Advanced Bash-Scripting Guide: https://tldp.org/LDP/abs/html/
- GitHub Bash Repositories: https://github.com/search?q=bash+scripting
- Bash Best Practices: https://github.com/Idnan/bash-guide
- AWS Documentation: https://docs.aws.amazon.com/
- AWS CLI Documentation: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html
- AWS IAM Best Practices: https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html
- AWS Lambda Custom Runtimes: https://docs.aws.amazon.com/lambda/latest/dg/runtimes-custom.html
- Bats - Bash Automated Testing System: https://github.com/bats-core/bats-core
- Cygwin: https://www.cygwin.com/
- Windows Subsystem for Linux (WSL): https://docs.microsoft.com/en-us/windows/wsl/about
- Boto3 - AWS SDK for Python: https://boto3.amazonaws.com/v1/documentation/api/latest/index.html