Posts

Deploy Your React App to AWS S3 and CloudFront

If you've ever wanted to automate the deployment of your React web app to AWS S3 and serve it securely via Amazon CloudFront, this detailed guide is for you. In this post, we'll walk through a Python deployment script that builds your React project, uploads it to S3, configures the bucket, sets up CloudFront with an Origin Access Control (OAC), and verifies your app is live on a secure HTTPS URL, all using Boto3, AWS's Python SDK.

Overview of the Deployment Workflow

This Python script automates the entire deployment pipeline in 7 major steps:

1. Build your React app using npm run build.
2. Create or reuse a private S3 bucket.
3. Create or reuse a CloudFront Origin Access Control (OAC).
4. Create or reuse a CloudFront distribution.
5. Attach a bucket policy that grants access only to CloudFront.
6. Upload your React build to S3 and invalidate the CloudFront cache.
7. Verify...

Building an AWS C++ S3 Program on Windows with vcpkg

This guide shows how to set up a working C++ project on Windows that uses the AWS SDK to list S3 buckets. We will use vcpkg as the package manager to handle the AWS SDK installation and its dependencies.

Step 1: Install vcpkg

If you haven't installed vcpkg yet, download it from the official repository and bootstrap it:

git clone https://github.com/microsoft/vcpkg.git C:\Users\YourName\CDK\vcpkg
cd C:\Users\YourName\CDK\vcpkg
.\bootstrap-vcpkg.bat

Note: On Windows, you must run bootstrap-vcpkg.bat in the Command Prompt.

---

Step 2: Install the AWS SDK for C++ using vcpkg

Install the core and S3 components for 64-bit Windows:

vcpkg install aws-sdk-cpp[core,s3]:x64-windows

This installs the required libraries and headers under installed\x64-windows\. vcpkg will handle dependencies such as aws-c-common, aws-crt-cpp, and others.

---

Step 3: Set up your C++ project

Create a project folder, e.g., C:\...

Automating .NET ECS Deployment with Boto3 and EventBridge

Deploying containerized applications on AWS ECS can be a tedious, multi-step process. Between setting up IAM roles, creating repositories, configuring clusters, and scheduling tasks, there are a lot of moving parts. To streamline this, I built a Python deployment script using Boto3 that automates the entire process, from Docker build to scheduled execution with Amazon EventBridge.

Overview of the Workflow

The script covers the following:

- Ensures IAM roles exist with the right policies
- Creates or resets an Amazon ECR repository
- Builds and pushes a Docker image
- Creates a CloudWatch Logs group for container logging
- Creates an ECS cluster (if missing)
- Registers a new ECS Fargate task definition
- Deregisters old task definitions
- Stops old running ECS tasks
- Creates or updates an EventBridge schedule to trigger ECS tasks

IAM Roles and Permissions

The first step in ECS deployments is to make ...

Asynchronous DynamoDB Queries with Java Using GSIs and the AWS SDK v2 Enhanced Async Client

When building modern Java applications that interact with AWS DynamoDB, you often need to query data by attributes other than the primary key. This is where Global Secondary Indexes (GSIs) come into play. In this post, we'll break down:

- Why GSIs are important
- How to use the Enhanced Async DynamoDB Client in Java
- How to query a GSI asynchronously using Reactive Streams
- How to process results efficiently using CompletableFuture

1. Understanding Global Secondary Indexes (GSIs)

A Global Secondary Index (GSI) allows you to query a DynamoDB table using an alternative partition key and optional sort key. Unlike the table's primary key:

- A GSI can have different partition and sort keys.
- It provides fast queries on attributes you don't want to scan for.
- Reads from a GSI are eventually consistent; unlike the base table (or a Local Secondary Index), a GSI does not support strongly consistent reads.

Why use a GSI here? In our scenario, we have a table of form templates with this schema...

Automate Your API Gateway Setup with Boto3: Rendering HTML from Lambda

Creating RESTful endpoints using the AWS Management Console can quickly become frustrating, especially when dealing with API Gateway. If you've ever felt that the Console experience is too slow for repeatable tasks, you're not alone. In this post, we'll walk through a Python script using the boto3 library that automates the creation of an AWS API Gateway REST API backed by a Lambda function, one that returns HTML rather than JSON. This makes it ideal for rendering mobile-friendly forms or web content directly from serverless logic.

Why Use Python Over the Console?

- Repeatability: Run your script anytime without clicking through dozens of console menus.
- Version Control: Store and track API configurations in Git.
- Speed: Automate CORS setup, deployment stages, and permissions in seconds.
- Precision: Avoid human error by defining everything in code.

Use Case

This script will:

- Create a REST API in API Gateway.
- Add a /form route with a GET method.
- Integrat...

Automating AWS API Gateway and Lambda Integration with Boto3 in Python

Setting up AWS API Gateway endpoints and integrating them with Lambda functions can be a repetitive and error-prone task when done manually through the AWS Management Console. Automating this process with Python's Boto3 library not only saves time but also ensures consistency across deployments.

Prerequisites

Before diving into the automation script, ensure you have the following:

- AWS CLI configured: Set up with the necessary credentials and default region.
- Python 3.x installed, along with the Boto3 library. If not installed, you can do so using: pip install boto3
- Existing Lambda function: Ensure you have a Lambda function created. For this example, we'll refer to it as xxxx.

The Automation Script

The Python script performs the following actions:

- Creates a REST API named SDKStatsAPI.
- Adds a /stats resource to the API.
- Configures a GET method on the /stats resource, integrating it with the specified Lambda function.
- Enables CORS by...

Building and Deploying a Fargate Container that runs Python and performs CloudWatch Logging

This guide walks you through building and deploying an AWS Fargate container that runs a Python script. The container will be tagged as the latest version, pushed to Amazon Elastic Container Registry (ECR), and configured to log messages to CloudWatch Logs. The process includes setting up the Dockerfile, creating the Python script, building and pushing the Docker image, and configuring AWS services to ensure your container runs smoothly on Fargate. By the end of this tutorial, you will have a running Fargate service that logs information to CloudWatch Logs, allowing you to monitor the status and output of your containerized application in real time.

Note: In this guide, replace aws_account_id with your AWS account number and region with your AWS region.

Step-by-Step Guide Overview

Setting up the Dockerfile: Define a Dockerfile that uses Amazon Linux 2023 as a base image, installs Python, sets up the environment, and configures the CloudWatch logging agent. Wri...
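The Python script inside the container can be as simple as logging to stdout; once the task's log configuration (or the CloudWatch agent the guide sets up) is in place, those lines show up in the log group. A minimal sketch, with an illustrative logger name and loop:

```python
import logging
import sys
import time

# Log to stdout: on Fargate, the configured log driver forwards
# everything written here to CloudWatch Logs.
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
log = logging.getLogger("fargate-demo")

def main(iterations: int = 3, delay: float = 1.0) -> int:
    """Emit periodic heartbeat messages; returns how many were logged."""
    for i in range(iterations):
        log.info("heartbeat %d", i)
        time.sleep(delay)
    return iterations

if __name__ == "__main__":
    main()
```

Keeping the script's output on stdout (rather than a file inside the container) is what makes the logs visible in CloudWatch without any extra plumbing in the application code.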