Managing Objects on S3 with Client-Side Encryption and Automation

Introduction:
We often need to transfer sensitive documents to the cloud securely and without manual intervention. A common requirement is to automatically detect file changes, apply client-side encryption, and upload only the modified files on a schedule, such as every midnight. This approach keeps sensitive data secure, avoids unnecessary uploads, and saves bandwidth and storage. In this blog we will walk through encrypting an object on the client side and moving it to an S3 bucket.
Architecture Diagram

To complete this project we will use the AWS CLI, a Bash script for the automation, and a cron job to trigger the script every midnight.
Procedure:
Step 1: Create a bucket and enable versioning
#create the bucket
aws s3 mb s3://max-demo-abi7
#enable versioning on the bucket
aws s3api put-bucket-versioning --bucket max-demo-abi7 --versioning-configuration Status=Enabled
- Go to the AWS Management Console, search for S3, and confirm that versioning is enabled under the bucket's Properties tab.


Step 2: Create a file script.sh
This script does the following:
- Watches a file (file1.txt) for changes.
- If modified: encrypts it with GPG, uploads the encrypted file to an S3 bucket, and updates a timestamp log.
- If not modified: skips processing.
- If the file doesn't exist: logs that info.
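The core of the change-detection step is comparing the file's modification time (mtime) against a previously recorded timestamp. The idea can be sketched in isolation with a throwaway directory; the paths and file names below are illustrative placeholders, not the ones from the full script (`stat -c %Y` assumes GNU coreutils, as in the script):

```shell
#!/bin/bash
# Minimal sketch of the change-detection logic:
# compare the file's current mtime against a previously recorded one.
# WORKDIR and the file names are illustrative placeholders.
WORKDIR=$(mktemp -d)
FILE="$WORKDIR/file1.txt"
STAMP="$WORKDIR/last_modified.txt"

echo "hello" > "$FILE"

CURRENT=$(stat -c %Y "$FILE")        # mtime in seconds since the epoch
LAST=$(cat "$STAMP" 2>/dev/null)     # previously recorded mtime, if any
LAST=${LAST:-0}                      # default to 0 when missing or empty

if [ "$CURRENT" -gt "$LAST" ]; then
    echo "modified"
    echo "$CURRENT" > "$STAMP"       # record the new mtime
else
    echo "not modified"
fi
```

On the first run the stamp file does not exist, so the file counts as modified; a second run without touching the file takes the "not modified" branch.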
Now update the variables in the script, including the passphrase, to match your environment:
#!/bin/bash
# Set DIRECTORY_PATH, FILE_NAME and S3_BUCKET variables
DIRECTORY_PATH="/home/abi/Desktop/max-demo/"
cd "$DIRECTORY_PATH" || { echo "Failed to change directory to $DIRECTORY_PATH"; exit 1; }
FILE_NAME="file1.txt"
ENCRYPTED_FILE="${FILE_NAME}.gpg"
S3_BUCKET="s3://max-demo-abi7"
LAST_MODIFIED_FILE="$DIRECTORY_PATH/last_modified.txt"

# Check if the file exists
if [ -f "$FILE_NAME" ]; then
    # Get the current last modified time of the file (seconds since epoch)
    CURRENT_MODIFIED_TIME=$(stat -c %Y "$FILE_NAME")

    # Read the previously recorded timestamp, defaulting to 0 if missing or empty
    if [ -f "$LAST_MODIFIED_FILE" ]; then
        LAST_MODIFIED_TIME=$(cat "$LAST_MODIFIED_FILE")
        if [ -z "$LAST_MODIFIED_TIME" ]; then
            LAST_MODIFIED_TIME=0
        fi
    else
        LAST_MODIFIED_TIME=0
    fi

    # Compare timestamps to decide if the file was modified
    if [ "$CURRENT_MODIFIED_TIME" -gt "$LAST_MODIFIED_TIME" ]; then
        echo "$(date): File $FILE_NAME has been modified or is new. Encrypting..."
        # Encrypt the file using GPG (symmetric encryption)
        if gpg --batch --yes --symmetric --cipher-algo AES256 --passphrase "max@Passw0rd" -o "$ENCRYPTED_FILE" "$FILE_NAME"; then
            echo "$(date): Encryption successful. Uploading to S3..."
            # Upload the encrypted file to S3
            if aws s3 cp "$ENCRYPTED_FILE" "$S3_BUCKET/"; then
                echo "$(date): File successfully uploaded to $S3_BUCKET"
                rm -f "$ENCRYPTED_FILE"
                # Update the last modified timestamp
                echo "$CURRENT_MODIFIED_TIME" > "$LAST_MODIFIED_FILE"
            else
                echo "$(date): Upload to S3 failed!"
            fi
        else
            echo "$(date): Encryption failed!"
        fi
    else
        echo "$(date): File $FILE_NAME has not been modified. Skipping encryption and upload."
    fi
else
    echo "$(date): File $FILE_NAME not found in $DIRECTORY_PATH"
fi
Note: the following command performs the encryption, so the --passphrase value ("max@Passw0rd" here) must be saved for decrypting the object later.
gpg --batch --yes --symmetric --cipher-algo AES256 --passphrase "max@Passw0rd" -o "$ENCRYPTED_FILE" "$FILE_NAME"
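Before wiring this command into the script, it may help to verify the encrypt/decrypt pair with a quick local round trip. The passphrase and file names below are illustrative; the extra --pinentry-mode loopback flag is added because some GnuPG 2.1+ setups require it for a non-interactive --passphrase:

```shell
#!/bin/bash
# Round-trip check of GPG symmetric encryption using a throwaway file.
# The passphrase and file names here are illustrative placeholders.
WORKDIR=$(mktemp -d)
echo "top secret" > "$WORKDIR/sample.txt"

# Encrypt with the same flags the script uses (plus loopback pinentry,
# which some GnuPG 2.1+ setups need for non-interactive passphrases)
gpg --batch --yes --symmetric --cipher-algo AES256 \
    --pinentry-mode loopback --passphrase "test-passphrase" \
    -o "$WORKDIR/sample.txt.gpg" "$WORKDIR/sample.txt"

# Decrypt into a new file and compare with the original
gpg --batch --yes --pinentry-mode loopback --passphrase "test-passphrase" \
    -o "$WORKDIR/decrypted.txt" -d "$WORKDIR/sample.txt.gpg"

diff "$WORKDIR/sample.txt" "$WORKDIR/decrypted.txt" && echo "round trip OK"
```

If diff reports no differences, the passphrase and cipher settings are working and can be trusted in the automated script.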
Step 3: Make the script.sh executable
chmod +x script.sh
Step 4: Set up the cron job to run at midnight (12:00 AM)
#check that the cron service is running
systemctl status cron
#open the crontab editor
crontab -e
#paste the line below, replacing the placeholders with the actual paths to script.sh and the log file
00 00 * * * <Path/to/script>/script.sh >> <Path/to/log>/logfile.log 2>&1
For testing, set the cron schedule to a few minutes after the current time.
Example: this entry runs the job at 1:40 PM every day
40 13 * * * ~/Desktop/max-demo/script.sh >> ~/Desktop/max-demo/logfile.log 2>&1
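Rather than working out the test time by hand, a small helper can print a crontab entry that fires a few minutes from now. The script and log paths match the example above; `date -d` assumes GNU coreutils:

```shell
#!/bin/bash
# Print a crontab entry that fires a few minutes from now, for testing.
# Requires GNU date (-d); paths match the example entry above.
MINUTES_AHEAD=5
MINUTE=$(date -d "+${MINUTES_AHEAD} minutes" +%M)
HOUR=$(date -d "+${MINUTES_AHEAD} minutes" +%H)
echo "$MINUTE $HOUR * * * ~/Desktop/max-demo/script.sh >> ~/Desktop/max-demo/logfile.log 2>&1"
```

Paste the printed line into crontab -e, wait for the scheduled minute, and then inspect logfile.log.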

- After the script runs successfully, the encrypted object with the .gpg extension appears in the bucket.

Step 5: Reading the file
- Download the file and decrypt it using the passphrase you set earlier.
#download the file from S3 to the local directory
aws s3 cp s3://max-demo-abi7/file1.txt.gpg .
#view the encrypted file (this prints unreadable ciphertext)
cat file1.txt.gpg
#decrypt the file using the passphrase, writing the output to file1.txt
gpg --batch --yes --passphrase "max@Passw0rd" -o file1.txt -d file1.txt.gpg
#view the file after decryption
cat file1.txt


You can also verify the skip logic by letting the cron job run again without changing the content of file1.txt, then checking logfile.log and last_modified.txt.

Congratulations!!!
You have successfully completed this demonstration. You can now apply client-side encryption to objects before uploading them to your S3 bucket.
Conclusion:
By automating file change detection, client-side encryption, and scheduled uploads, we can ensure our sensitive data is transferred securely and efficiently.



