A few days ago, I wrote a post outlining how I use Hugo and Amazon S3 to create a serverless blog hosting platform. While this solution works well for hosting the site, publishing is still a bit of a pain. After a few too many rounds of drag-and-drop uploading, I set out to find a better publishing workflow.
My first breakthrough was a quick terminal command using the AWS CLI to automatically upload the `public` directory that Hugo generates. The `sync` subcommand is used instead of a one-way copy (`cp`) to make sure that the source and target directories match, and the `--delete` modifier is appended to make sure that any files removed from the source directory are also removed from the target (S3) directory. This ensures that any outdated images or documents are removed! Finally, the `--acl public-read` modifier makes sure that the uploaded files are publicly readable (this should be redundant if you have a public bucket policy, but it never hurts).
```
aws s3 sync PATH/public/ s3://YOUR_BUCKET_NAME/ --delete --acl public-read
```
While my upload script was working great, I started learning more about AWS’s DevOps tools and wanted to see if I might be able to fit them into my writing workflow. For those who don’t know, AWS has a number of developer tools that can be put together a la carte to help build a DevOps delivery pipeline.
- CodeCommit: provides a managed source control service which allows AWS users to host versioned source code repositories using Git
- CodeBuild: provides a managed build service which spins up virtual machines on-demand to build and test code
- CodeDeploy: automates code deployment to EC2 instances
- CodePipeline: provides a continuous integration (CI) and continuous delivery (CD) pipeline which connects CodeCommit, CodeBuild, and CodeDeploy to manage source code, build and test the artifacts, and deploy them to the cloud
Since my blog architecture is serverless, I don’t need to use the CodeDeploy functionality, which is designed for EC2 instances. Instead, I can use a CodeCommit repository to hold my blog and a CodeBuild script to build it with Hugo and drop the generated code into S3. With those two mechanisms in place, we can stitch together a workflow in CodePipeline which waits for updates to the CodeCommit repository and triggers a Hugo build and S3 deployment automatically. Let’s take a look at how to put that together! Guidance and some code snippets borrowed from Ivan du Toit and Karel Bemelmans.
Building the Pipeline
Create a new CodeCommit repository
Use directions from AWS to set up the command line on your computer, configure an IAM user for CodeCommit, and clone your new repository
Move all of your Hugo files into the repository
- `cd` to the repository root on your computer
- `git add .` to add all of the files to your commit
- `git commit -m "YOUR_MESSAGE"` to commit the files to your repository
- `git push origin master` to push the files upstream
Now, if you access your repository through the CodeCommit console, you should be able to see your code
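As a quick sanity check, you can also confirm the repository from the terminal with the AWS CLI (the repository name and region below are placeholders; substitute your own):

```
aws codecommit get-repository --repository-name YOUR_REPO_NAME --region us-east-1
```

If the command returns the repository metadata (clone URLs, ARN, default branch), your local setup and IAM permissions are working.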
Create a new Build Project within AWS CodeBuild
- Use your CodeCommit repository as the source
- Select Ubuntu 14.04 as the build environment (Ubuntu 16.04 would be better but is not supported by AWS at this time)
- Choose S3 as the destination for build artifacts and direct it to the S3 bucket which you are using to serve your site (for me, this is conormclaughlin.net)
- Create the Build Project
Navigate to the IAM Dashboard, select “Roles” from the sidebar, and open the service role that CodeBuild created for your Build Project
Attach the AmazonS3FullAccess and AmazonSNSFullAccess policies to that role to allow your build script to transfer the generated files to S3 and notify users when a build completes
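If you prefer the command line, the same managed policies can be attached with the AWS CLI (the role name below is a placeholder for the service role CodeBuild created for your Build Project):

```
aws iam attach-role-policy --role-name YOUR_CODEBUILD_ROLE \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam attach-role-policy --role-name YOUR_CODEBUILD_ROLE \
    --policy-arn arn:aws:iam::aws:policy/AmazonSNSFullAccess
```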
Navigate to the Simple Notification Service (SNS) console within AWS and create a new topic, adding yourself as a subscriber as you see fit
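The topic and subscription can also be created from the terminal; the topic name and email address here are placeholders (the `subscribe` call sends a confirmation email you’ll need to accept):

```
aws sns create-topic --name codebuild-notifications
aws sns subscribe --topic-arn YOUR_TOPIC_ARN --protocol email \
    --notification-endpoint you@example.com
```

Note the `TopicArn` that `create-topic` returns; you’ll need it in the build script below.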
On your computer, navigate to the root folder of your blog repository and create a file titled `buildspec.yml` with the following contents:
```yaml
version: 0.2

phases:
  install:
    commands:
      - pip install Pygments
      - wget https://github.com/gohugoio/hugo/releases/download/v0.21/hugo_0.21_Linux-64bit.deb
      - dpkg -i hugo_0.21_Linux-64bit.deb
  build:
    commands:
      - hugo
      - echo "S3 Upload Beginning"
      - aws s3 sync public/ s3://YOUR_S3_BUCKET/ --region us-east-1 --delete --acl public-read
      - echo "S3 Upload Complete"
  post_build:
    commands:
      - echo "Build complete"
      - aws sns publish --topic-arn YOUR_TOPIC_ARN --subject 'AWS CodeBuild Complete' --message 'Your build using AWS CodeBuild has been completed. Check the CodeBuild console to see the details.'
```
Push the changes to your CodeCommit repository so that the `buildspec.yml` file is in the root of your repository
Preemptively delete all of the files in your server S3 bucket, as long as you are comfortable with some momentary downtime. If the CodeBuild process goes wrong, you can always re-copy the files from your computer
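Clearing the bucket can be done from the terminal as well (the bucket name is a placeholder; double-check it before running, since this deletes every object in the bucket):

```
aws s3 rm s3://YOUR_BUCKET_NAME/ --recursive
```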
Navigate back to the CodeBuild console and start a new build
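A build can also be kicked off from the command line (the project name is a placeholder for whatever you named your Build Project):

```
aws codebuild start-build --project-name YOUR_BUILD_PROJECT
```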
With any luck, your build should be successful and look like the image below:
CodePipeline is the final piece in the puzzle to automating your Hugo build and deployment to S3. With CodePipeline in place, simply pushing a new post from your computer to the CodeCommit repository will automatically trigger a CodeBuild operation, culminating with your updated blog being generated and automatically placed into S3.
Configure your CodePipeline to use your CodeCommit repository as well as your CodeBuild Build Project
Follow the steps to create an IAM role for CodePipeline
Create the Pipeline and view the workflow
Trigger a build and watch the pipeline execute!
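If you’d rather watch the pipeline from the terminal than the console, you can poll its state after pushing a commit (the pipeline name is a placeholder):

```
aws codepipeline get-pipeline-state --name YOUR_PIPELINE_NAME
```

The output shows each stage (Source, Build) along with its latest status, so you can confirm the push made it all the way through to S3.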