
Experimenting with AWS – Cloud Resume Challenge

July 21, 2020 (updated April 13th, 2023) · AWS, Cloud

The cloud is the future. I’ve spent the past 2 years doing tier 1 technical support, but it was unfulfilling and left me wanting more. I decided to pursue the cloud with AWS and went for a deep dive over the past few months. Having no knowledge of the cloud, I started with the AWS Cloud Practitioner certification. Once that was done, I went for the AWS Solutions Architect Associate. Now I had a bit of knowledge of AWS services but no good way to use them. One day while reading a post on Reddit, I saw the Cloud Resume Challenge. It looked simple enough, and I knew HTML and Python, so I thought I’d give it a try. Little did I know what I was headed into. Before you read on though, why not check out the finished site!

Challenge Overview

The challenge is pretty straightforward at first glance: create a resume site with a visitor counter using AWS. I started getting suspicious, though, when I saw there were 16 steps in the challenge and NO INSTRUCTIONS. A site with a visit counter does not need 16 steps. Reading through all the steps gave me a headache, but I decided to take it one step at a time. With the exception of the first 6 steps, every step had me using my Google skills to the max as I read countless tutorials and documentation and conducted experiments. Here are the steps:

  1. Get AWS certified
  2. Create the site with HTML and CSS
  3. Host the website on S3
  4. Get HTTPS and DNS set up
  5. Write a JavaScript visitor counter
  6. Make a database to store the info
  7. Get an API to serve as the go-between for the JS and the DB
  8. Make Python update the DB
  9. Make test cases for said Python code
  10. Put all the infrastructure into code
  11. Git some source control
  12. Set up continuous integration and deployment

The Challenge

Step 1: Certification

Thankfully, I had certified before starting this challenge, so this was an easy check. For resources, I recommend lurking around r/awscertifications. The community there has all the best resources compiled, including study content, practice exams, and what to watch for on the exam.

Step 2: Creating the resume with HTML and CSS

With frameworks, people rarely code sites from scratch anymore. I’m a fan of Bootstrap and found a resume template pretty quickly. After making a few adjustments, I had it how I liked.

Step 3: Static S3 Website

S3 is a simple object storage service. I really like it for the ease and security it provides. Since the site didn’t have anything fancy, S3 could host it without any problem at all. In fact, because so many people use S3 for this purpose, AWS added static website hosting as an option you can enable in the bucket properties. Every bucket needs a globally unique name, and if the bucket serves a website, its name has to match the domain. The reason is that each bucket gets its own subdomain, like https://yourbucket.s3.amazonaws.com. I didn’t want people typing all that, so I went to get my own domain.
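If you’d rather script this step than click through the console, here is a minimal sketch using boto3; the bucket and document names are placeholders, not necessarily what I used:

```python
import boto3

s3 = boto3.client("s3")

# Enable static website hosting on the bucket: the same effect as the
# option under the bucket's properties. "example.com" is a placeholder.
s3.put_bucket_website(
    Bucket="example.com",
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)
```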

Step 4: HTTPS and DNS

For the certificate, I used AWS Certificate Manager because it was free. You could do this step using some other DNS provider, but for simplicity I created a hosted zone in Route 53. This let me use alias records for the various other AWS services I was using, which saved time. Additionally, I set up a CloudFront distribution so my site could be cached at edge locations around the world. CloudFront is AWS’s CDN; it serves content from edge locations to improve latency for users. Another benefit was that my S3 objects did not have to be exposed to the public, and I could also “host” the website over HTTPS.
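For the curious, creating such an alias record with boto3 looks roughly like the sketch below. The hosted zone ID and distribution domain are placeholders; the one real constant is Z2FDTNDATAQYW2, the fixed hosted zone ID that all CloudFront alias targets use:

```python
import boto3

route53 = boto3.client("route53")

# Point the apex domain at the CloudFront distribution with an alias
# A record. "ZEXAMPLE123" and the dxxxx domain are placeholders.
route53.change_resource_record_sets(
    HostedZoneId="ZEXAMPLE123",  # your own Route 53 hosted zone
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "example.com",
                "Type": "A",
                "AliasTarget": {
                    # CloudFront's own (fixed) hosted zone ID
                    "HostedZoneId": "Z2FDTNDATAQYW2",
                    "DNSName": "dxxxxxxxxxxxx.cloudfront.net",
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    },
)
```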

Step 5: Javascript

Now that my site was alive and well, I needed to count the visitors with JS. I wasn’t very familiar with JS, but after reading and Googling, things started to come together. The code began to work, but I realized I would need to connect the JS to the API endpoint (steps 6–8 below). It was at that point it dawned on me that I had not set up the database or the API at all, and I couldn’t continue.

Step 6,7,8: Database, API Gateway, Lambda

These steps go together. There’s just not a good way to split them apart. I’ll go through the steps I thought were best and what I could have done differently.

  1. The database part is simple, and I didn’t really need a relational database (I’m literally just storing the count of visits). I went with the suggested DynamoDB to create the table. I had never used a NoSQL database before, but once I figured out the structure of keys and attributes, it was a breeze.
  2. Lambda lets you run code snippets that execute when triggered by events. I built mine with Python and the AWS SDK, boto3. The idea was to have the code bump my visit count by 1 every time it was triggered (a rough sketch follows this list). It was very interesting to see how Lambda integrated with the other AWS services, and it was actually fun to play with.
  3. API Gateway lets you create APIs with ease. My JS code invoked the API, which triggered the Lambda function, which added a visit to my DynamoDB table, which returned the updated count to my API, where the JS displayed the number on the website.
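To make the flow concrete, here is a minimal sketch of what such a handler can look like. The table name visit-counter and the key id/site are made-up placeholders, not necessarily what I used:

```python
import boto3

# Table and key names here are illustrative placeholders.
table = boto3.resource("dynamodb").Table("visit-counter")

def lambda_handler(event, context):
    # Atomically add 1 to the counter and get the new value back,
    # so concurrent visits can't lose an increment.
    response = table.update_item(
        Key={"id": "site"},
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(response["Attributes"]["visits"])
```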

NOTE:

I learned there are several ways of integrating Lambda and API Gateway to send response headers.

  1. Gateway Method: You can specify response headers inside API Gateway. This way, Lambda only returns the count and nothing else.
  2. Lambda Proxy Method: In addition to the count, you also write the headers in Lambda and have everything sent to API Gateway. API Gateway acts as a proxy and doesn’t add anything else; it sends everything from the Lambda code straight to the endpoint (sketched below).
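With the proxy method, the handler’s return value has to be the whole HTTP response, headers included. A sketch of that return shape, assuming you are fine allowing any origin for CORS:

```python
import json

def lambda_handler(event, context):
    count = 42  # stand-in for the real DynamoDB update shown earlier

    # With proxy integration, API Gateway passes this dict through as
    # the HTTP response, so the CORS header must be set here.
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```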

Other thoughts:

  • You need CORS configured, otherwise you will get errors in the browser.
  • You need to set correct permissions for both the gateway and Lambda, otherwise you will get stuck.
  • CloudWatch logs are your best friend.

Step 9: Tests

I don’t know how anybody else did it, but I think this was the most painful part of the challenge. Tests are how you make sure your code still works after you change it. That should be relatively simple, except in this case it’s not: Lambda is serverless, DynamoDB (DDB) is serverless, but your code is local. Every time I ran tests on the Lambda function, it would call out to the actual AWS services. Hours of Google later, I found that someone created a nice library that lets you mock AWS services: moto actually saved my life. After mocking the DDB table in my code, I was finally able to test my Lambda function without any problems (you will need this later for CI/CD in step 12). I also recommend pytest over unittest for cleaner tests.
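Here is a rough sketch of what a moto-based test can look like. It assumes the handler module is named visit_counter and uses the placeholder table from earlier; note that newer moto releases expose a single mock_aws decorator, while older ones have per-service decorators like mock_dynamodb:

```python
import os

import boto3
from moto import mock_aws  # on older moto releases: from moto import mock_dynamodb

# Module-level boto3 calls need a region even under moto.
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")

@mock_aws
def test_visits_increment():
    # Create the mocked table first, so no request ever leaves the machine.
    boto3.resource("dynamodb").create_table(
        TableName="visit-counter",
        KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",
    )

    # Import inside the mock so the handler's Table() call is mocked too.
    from visit_counter import lambda_handler

    assert lambda_handler({}, None) == 1
    assert lambda_handler({}, None) == 2
```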

Step 10: Infrastructure as code

As an AWS Solutions Architect, you learn about CloudFormation and how its templates let you create entire infrastructures from a YAML file. Thankfully that gave me a head start, but I didn’t realize how powerful and confusing it is. There are two ways to do this: you can use SAM and deploy with the SAM CLI, or you can write the YAML template by hand. I ended up using a little of both and referred to the official documentation a lot. I also recommend using parameters and pseudo parameters so you don’t hard-code anything important. Outputs are optional, but they make you look like a pro.

Step 11 & 12: CI/CD Pipelines

Like the instructions say,

You do not want to be updating either your back-end API or your front-end website by making calls from your laptop, though. You want them to update automatically whenever you make a change to the code.

AWS has an integrated solution called AWS CodePipeline; GitHub has one called GitHub Actions. I wanted this to keep working after my free year of AWS services runs out, so I went with GitHub. I created two repositories, one for my front end and another for my back end. GitHub Actions is interesting: you can trigger all sorts of jobs on a push or pull request to the repo.

I worked on the front end first, as it seemed more manageable than the back end. I found there was already an action that pushes your repo straight into S3 and invalidates the CloudFront cache for you. There are several such actions and some of them don’t work that well; you have to find the one that fits your needs.

The back end was the harder one because you have to run your Python tests before uploading to Lambda. This requires your GitHub Actions workflow to have either separate jobs or separate steps. Since I ended up using pytest, I found a GitHub Action that spins up a Docker environment and tests your file with pytest. Depending on the environment, Python may complain about missing dependencies, so be ready to install them in the workflow.

The second part was uploading the code after it passed the tests. I ended up with several steps, as I wanted to zip the files as well; Lambda automatically unzips the package when the code gets there. I also had to make sure I didn’t put my credentials in the code. GitHub Secrets is great for this, since your credentials stay out of the repository where prying eyes could find them.
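Under the hood, that deploy step amounts to something like the following sketch; the file and function names are placeholders, and in my case a GitHub Action did the equivalent for me:

```python
import io
import zipfile

import boto3

# Zip the handler in memory; file and function names are placeholders.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
    archive.write("visit_counter.py")
buffer.seek(0)

# Lambda accepts the raw zip bytes and unpacks them on its side.
boto3.client("lambda").update_function_code(
    FunctionName="visit-counter",
    ZipFile=buffer.read(),
)
```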

Summary

This challenge was certainly interesting and a “challenge” to do. I have a newfound appreciation for AWS services and how secure they can be. You should ALWAYS apply the principle of least privilege to your roles and users. I learned a lot about AWS, infrastructure as code, continuous integration and deployment, and serverless applications. I also rediscovered how exciting learning is, and that you should keep going even when it seems like you are getting nowhere. I want to give a shout-out to Forrest Brazeal for this engaging project. Thanks to Stack Overflow for being the best resource out there.

I am currently looking for new opportunities. If you would like to connect or learn more about my experience, you can find me on LinkedIn and Twitter.
