Setting up Consolidated Billing for Accounts on AWS

5. January 2017 19:20 by Aaron Medacco | 0 Comments

Whether you are a consultant managing cloud resources for customers or part of a large company where each department has its own AWS account, setting up consolidated billing on Amazon Web Services combines the charges for multiple accounts onto a single bill.

Consolidated Billing

This makes tracking charges per account less of a headache, and it can even save money overall by letting the paying account benefit from volume pricing discounts earned through aggregate usage. For example, if two linked accounts each store 30 TB in S3 and a cheaper pricing tier begins at 50 TB, the combined 60 TB of usage puts 10 TB into the lower-priced tier, even though neither account would reach it on its own.

Signing up for consolidated billing on the paying account:

  1. Click on your account name in the top right of your management console.
  2. Click "My Account".
  3. Select "Consolidated Billing" in the sidebar.
  4. Click "Sign up for Consolidated Billing".

You may need to wait before proceeding from here. Amazon will validate your payment information before allowing you to continue.

Linking another account under the paying account's bill:

  1. Once Amazon has validated your payment information (you'll receive an email), navigate back to the "Consolidated Billing" section of your account settings.
  2. Click "Send a Request".
  3. Enter the email address for the root user of the AWS account you want to pay for.
  4. Include notes if necessary, and click "Send".
  5. The account owner of the AWS account will receive an email asking them to verify the request. They must click the request acceptance link.
  6. They must then click "Accept Request".

At this point, you will see the linked account in the "Consolidated Billing" section of the payer account's Account Settings. For additional info on consolidated billing, click here.

Cheers!


Automating Backups of Your Route 53 Hosted Zone DNS Records

3. January 2017 21:13 by Aaron Medacco | 2 Comments

Not too long ago, I was editing entries in a Route 53 hosted zone and wondered what would happen if the record sets for those zones were ever lost. A pretty epic disaster would need to occur for you to somehow lose your DNS record sets. Maybe someone accidentally deletes a zone believing it to no longer be necessary, or perhaps someone configures IAM permissions incorrectly and a disgruntled employee who notices he has access wreaks havoc on your DNS before he finds the door. Either way, without your DNS, your perfectly designed system architecture might as well be driftwood. Therefore, I created a serverless method for storing backups of all Route 53 hosted zone records in S3, just in case:

Route 53 Backup Diagram

Creating the S3 bucket:

  1. Navigate to S3 in your management console.
  2. Click "Create Bucket".
  3. Enter an appropriate name and select a region.
  4. Click "Create".

Creating an IAM policy for the first Lambda function's role:

  1. Navigate to IAM in your management console.
  2. Select "Policies" in the sidebar.
  3. Click "Create Policy".
  4. Select "Create Your Own Policy".
  5. Enter an appropriate policy name and description.
  6. Paste the following JSON into the policy document:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "route53:ListResourceRecordSets"
                ],
                "Resource": [
                    "*"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject"
                ],
                "Resource": [
                    "Your Bucket ARN"
                ]
            }
        ]
    }
  7. Substitute "Your Bucket ARN" with the ARN for the S3 bucket you created. Make sure you add "/*" after the bucket ARN. For instance, if your bucket ARN was "arn:aws:s3:::furyandzealbrothers", you would use "arn:aws:s3:::furyandzealbrothers/*".
  8. Click "Create Policy".

Creating the IAM role for the first Lambda function:

  1. Select "Roles" in the sidebar.
  2. Click "Create New Role".
  3. Enter an appropriate role name and click "Next Step".
  4. Select "AWS Lambda" within the AWS Service Roles.
  5. Change the filter to "Customer Managed", check the box of the policy you just created, and click "Next Step".
  6. Click "Create Role".

Creating the first Lambda function:

  1. Navigate to Lambda in your management console.
  2. Click "Create a Lambda function".
  3. Select the "Blank Function" blueprint.
  4. Click "Next".
  5. Enter an appropriate function name and description. Select Node.js for the runtime.
  6. Under "Lambda function code", select "Edit code inline" for the Code entry type and paste the following code in the box:

    var AWS = require("aws-sdk");
    
    // Invoked with a { id, name } payload for a single hosted zone.
    // Lists every record set in the zone (following pagination) and
    // uploads the collection to S3 as a JSON document.
    exports.handler = (event, context, callback) => {
        var route53 = new AWS.Route53();
        var id = event.id;
        var name = event.name;
        var records = [];
        var recordParams = { HostedZoneId: id };
        route53.listResourceRecordSets(recordParams, function onPage(err, data){
            if (err) {
                console.log(err, err.stack);
            }
            else {
                records = records.concat(data.ResourceRecordSets);
                if (data.IsTruncated) {
                    // Zones with many record sets return paged results; fetch the next page.
                    route53.listResourceRecordSets({
                        HostedZoneId: id,
                        StartRecordName: data.NextRecordName,
                        StartRecordType: data.NextRecordType,
                        StartRecordIdentifier: data.NextRecordIdentifier
                    }, onPage);
                }
                else {
                    uploadBackupToS3({ id: id, name: name, records: records });
                }
            }
        });
    };
    
    var uploadBackupToS3 = function(data) {
        var s3 = new AWS.S3();
        var bucket = "Your Bucket Name";
        var timeStamp = Date.now();
        // Build a key like "example_com_ZXXXXXXXXXXX_backup_1483628400000".
        var key = data.name + "_" + data.id.replace(/\//g, '').replace("hostedzone", '') + "_backup_" + timeStamp;
        key = key.replace(/[.]/g, "_");
        var body = JSON.stringify(data);
        var param = { Bucket: bucket, Key: key, Body: body, ContentType: "text/plain", StorageClass: "STANDARD_IA" };
        s3.upload(param, function(err, data) {
            if (err){
                console.log(err, err.stack);
            } else{
                console.log("Route 53 backup successful.");
            }
        });
    };
  7. Substitute "Your Bucket Name" with the name of the bucket you created earlier.
  8. Leave Handler as "index.handler".
  9. Choose to use an existing role and select the IAM role you created earlier.
  10. Leave the other default values and click "Next".
  11. Click "Create function".

Creating an IAM policy for the second Lambda function's role:

  1. Navigate to IAM in your management console.
  2. Select "Policies" in the sidebar.
  3. Click "Create Policy".
  4. Select "Create Your Own Policy".
  5. Enter an appropriate policy name and description.
  6. Paste the following JSON into the policy document:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "route53:ListHostedZones"
                ],
                "Resource": [
                    "*"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "lambda:InvokeFunction"
                ],
                "Resource": "Your Lambda Function ARN"
            }
        ]
    }
  7. Substitute "Your Lambda Function ARN" with the ARN of the lambda function you created earlier and click "Create Policy".

Creating the IAM role for the second Lambda function:

  1. Select "Roles" in the sidebar.
  2. Click "Create New Role".
  3. Enter an appropriate role name and click "Next Step".
  4. Select "AWS Lambda" within the AWS Service Roles.
  5. Change the filter to "Customer Managed", check the box of the second policy you just created, and click "Next Step".
  6. Click "Create Role".

Creating the second Lambda function:

  1. Navigate to Lambda in your management console.
  2. Click "Create a Lambda function".
  3. Select the "Blank Function" blueprint.
  4. Under "Configure triggers", click the grey box and select "CloudWatch Events - Schedule".
  5. Enter an appropriate rule name and description.
  6. Select the frequency you'd like Lambda to back up your Route 53 hosted zone records in the Schedule expression input. I chose "rate(30 days)" for my usage.
  7. Check the box to "Enable trigger" and click "Next".
  8. Enter an appropriate function name and description. Select Node.js for the runtime.
  9. Under "Lambda function code", select "Edit code inline" for the Code entry type and paste the following code in the box:

    var AWS = require("aws-sdk");
    
    // Runs on a CloudWatch Events schedule. Lists every hosted zone in the
    // account (following pagination) and fires off one asynchronous
    // invocation of the backup function per zone.
    exports.handler = (event, context, callback) => {
        var route53 = new AWS.Route53();
        var lambda = new AWS.Lambda();
        route53.listHostedZones({}, function onPage(err, data){
            if (err) {
                console.log(err, err.stack);
            } 
            else {
                for (var i = 0; i < data.HostedZones.length; i++) {
                    var payload = {
                        id: data.HostedZones[i].Id,
                        name: data.HostedZones[i].Name
                    };
                    var lambdaParams = {
                        FunctionName: "Your Lambda Function Name", 
                        InvocationType: "Event", // fire-and-forget; don't wait for the backup to finish
                        Payload: JSON.stringify(payload)
                    };
                    lambda.invoke(lambdaParams, function(err, data) {
                        if (err) {
                            console.log(err, err.stack);
                        }
                        else {
                            console.log(data);  
                        }
                    });
                }
                if (data.IsTruncated) {
                    // More than one page of hosted zones; request the next page.
                    route53.listHostedZones({ Marker: data.NextMarker }, onPage);
                }
            }
        });
    };
  10. Substitute "Your Lambda Function Name" with the name of the first lambda function you created earlier.
  11. Leave Handler as "index.handler".
  12. Choose to use an existing role and select the second IAM role you created earlier.
  13. Leave the other default values and click "Next".
  14. Click "Create function".

Depending on how frequently you schedule the backups, you might also want to configure a lifecycle policy in S3 to archive or delete them after a period of time.
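
For instance, a rule like the following (applied under the bucket's lifecycle settings in the console, or with the CLI's put-bucket-lifecycle-configuration command) would transition backups to Glacier after 90 days and delete them after a year; the rule ID and time windows are just examples:

    {
        "Rules": [
            {
                "ID": "AgeOutRoute53Backups",
                "Filter": { "Prefix": "" },
                "Status": "Enabled",
                "Transitions": [
                    { "Days": 90, "StorageClass": "GLACIER" }
                ],
                "Expiration": { "Days": 365 }
            }
        ]
    }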

Cheers!


Setting up SSL/TLS Certificates on an Amazon Elastic Load Balancer w/ ACM

1. January 2017 16:55 by Aaron Medacco | 0 Comments

Securing your web traffic using SSL/TLS certificates is a growing standard. Even for small websites, with how cheap SSL/TLS certificates are, there's no excuse not to use them. For those using Amazon's Elastic Load Balancing service to distribute their traffic, the following is how you can establish secure communication without needing to buy and install certificates on your instances. Certificates provisioned using Certificate Manager are free of charge and can be used with Elastic Load Balancing and Amazon CloudFront.

Admittedly, I omitted one important step the first few times I did this because apparently I can't read.

Certificate Manager 3

Creating the SSL/TLS Certificate:

  1. Navigate to Certificate Manager (ACM) in your management console.
  2. Click "Request a certificate".
  3. Enter a wildcard entry for your domain. For instance, my domain is aaronmedacco.com, so an entry of "*.aaronmedacco.com" will cover any subdomain.
  4. Enter two more entries: one for the www version of your domain and one for the non-www version. For instance, "www.aaronmedacco.com" and "aaronmedacco.com" in my case.

    Certificate Manager 1
  5. Click "Review and request".
  6. Click "Confirm and request".
  7. Click "Continue".

The first couple times I did this, I only used the wildcard entry and of course, when I navigated to the non-www version of my domain, I received a certificate name mismatch screen in my browser. It never occurred to me to read the grey text underneath the big blue button staring me in the face.

Validating the SSL/TLS Certificate:

  1. You should be sent back to the dashboard page of Certificate Manager, where you'll see a status of "Pending validation" for your certificate.
  2. To activate the certificate, the domain owner will receive an email asking them to verify the certificate request.

    Certificate Manager 2
  3. Click the link to approve the request.
  4. If you used more than one entry for the certificate and receive multiple emails, approve each of them.

Shortly after, the status should flip to "Issued". Now, we just need to add a listener to the Elastic Load Balancer. For this, I assume you've already created a load balancer listening for HTTP traffic on port 80.

Configuring the Elastic Load Balancer:

  1. Navigate to EC2 in your management console.
  2. Select "Load Balancers" in the sidebar.
  3. Select the load balancer distributing traffic to your web application.
  4. Click the "Listeners" tab.
  5. Click "Add listener".
  6. Select "HTTPS (Secure HTTP)" as the Protocol, 443 as the Port, and the appropriate target group.
  7. Choose to use an existing certificate from AWS Certificate Manager (ACM) and select the certificate you just created.
  8. Click "Create".

At this point, your load balancer will listen for requests over HTTPS before relaying them to your application instances. One advantage of this configuration over managing SSL/TLS on the web servers themselves is that you offload the CPU cost of TLS termination. Your next step will likely be configuring whichever web server you are using (IIS, Apache, etc.) to force HTTPS for requests that arrive over plain HTTP.
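
With Apache, for example, that redirect is a short mod_rewrite rule keyed off the X-Forwarded-Proto header the load balancer adds to each request; consider this a sketch to adapt, not a drop-in config:

    # Redirect requests that reached the load balancer over plain HTTP.
    RewriteEngine On
    RewriteCond %{HTTP:X-Forwarded-Proto} =http
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]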

Cheers!


Automating Alerts for Unassociated Elastic IPs w/ AWS

30. December 2016 19:00 by Aaron Medacco | 0 Comments

Amazon charges for Elastic IP addresses that are allocated but not associated with a running instance. This discourages AWS customers from wasting the dwindling pool of available IPv4 addresses. Wouldn't it be nice if, as someone who manages AWS resources, you received alerts when your account's allocated Elastic IPs are being wasted?

I've created an automated process to send out an email when this occurs. Using a simple Lambda function (triggered by a CloudWatch schedule) and an SNS topic, notifications can be sent to the appropriate employees when someone forgets to clean up after terminating their instances.

Elastic IP Waste Diagram

Creating the SNS topic:

  1. Navigate to SNS in your management console.
  2. Select "Topics" in the sidebar.
  3. Click the "Create new topic" button.
  4. Enter an appropriate topic name and display name and click "Create topic".

Subscribing to the SNS topic:

  1. Select "Topics" in the sidebar.
  2. Click the ARN link for the topic you just created.
  3. Under Subscriptions, click "Create subscription".
  4. Select Email as the Protocol and enter your email address as the Endpoint.
  5. Repeat steps 3 and 4 for each email address you want to receive notifications.
  6. Each email address endpoint will receive an email asking to confirm the subscription. Confirm the subscriptions.
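
To confirm everything is hooked up, you can publish a test message to the topic from the CLI; the topic ARN here is a placeholder:

    aws sns publish --topic-arn arn:aws:sns:us-east-1:123456789012:ElasticIPAlerts --subject "Test" --message "Testing topic subscriptions."

Every confirmed subscriber should receive it within a few seconds.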

Creating an IAM policy for access permissions:

  1. Navigate to IAM in your management console.
  2. Select "Policies" in the sidebar.
  3. Click "Create Policy".
  4. Select "Create Your Own Policy".
  5. Enter an appropriate policy name and description.
  6. Paste the following JSON into the policy document:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "sns:Publish",
                    "sns:Subscribe"
                ],
                "Resource": [
                    "Your Topic ARN"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "ec2:DescribeAddresses"
                ],
                "Resource": [
                    "*"
                ]
            }
        ]
    }
  7. Substitute "Your Topic ARN" with the ARN for the SNS topic you created and click "Create Policy".

Creating an IAM role for the Lambda function:

  1. Select "Roles" in the sidebar.
  2. Click "Create New Role".
  3. Enter an appropriate role name and click "Next Step".
  4. Select "AWS Lambda" within the AWS Service Roles.
  5. Change the filter to "Customer Managed", check the box of the policy you just created, and click "Next Step".
  6. Click "Create Role".

Creating the Lambda function:

  1. Navigate to Lambda in your management console.
  2. Click "Create a Lambda function".
  3. Select the "Blank Function" blueprint.
  4. Under "Configure triggers", click the grey box and select "CloudWatch Events - Schedule".
  5. Enter an appropriate rule name and description.
  6. Select the frequency you'd like Lambda to check for unassociated Elastic IPs in the Schedule expression input. I chose "rate(1 day)" for my usage.
  7. Check the box to "Enable trigger" and click "Next".
  8. Enter an appropriate function name and description. Select Node.js for the runtime.
  9. Under "Lambda function code", select "Edit code inline" for the Code entry type and paste the following code in the box:

    var AWS = require("aws-sdk");
    
    // Runs on a CloudWatch Events schedule. Finds allocated Elastic IPs
    // with no instance association and publishes the list to an SNS topic.
    exports.handler = function(event, context) {
        var sns = new AWS.SNS();
        var ec2 = new AWS.EC2();
        var message = "The following Elastic IPs are not associated:\n\n";
        ec2.describeAddresses({}, function(err, data) {
            if (err) {
                console.log(err, err.stack); 
            }
            else {
                var unassociatedAddresses = 0;
                for (var i = 0; i < data.Addresses.length; i++){
                    // An address with no InstanceId property is allocated but unused.
                    if (!data.Addresses[i].hasOwnProperty("InstanceId")){
                        console.log(data.Addresses[i].PublicIp);
                        unassociatedAddresses++;
                        message += " " + data.Addresses[i].PublicIp + "\n";
                    }
                }
                // Only notify when there is something to report.
                if (unassociatedAddresses > 0){
                    var publishParams = {
                        Message: message, 
                        Subject: "Elastic IP Addresses Unassociated",
                        TopicArn: "Your Topic ARN"
                    };
                    sns.publish(publishParams, context.done);
                }
            }
        });
    };
  10. Substitute "Your Topic ARN" with the ARN for the SNS topic you created earlier.
  11. Leave Handler as "index.handler".
  12. Choose to use an existing role and select the IAM role you created earlier.
  13. Leave the other default values and click "Next".
  14. Click "Create function".

That's it! Now you'll at least be made aware when your Elastic IPs are being wasted. Hopefully before whoever pays your account's AWS bill notices.

Cheers!


AWS Solutions Architect - Associate Certification Tips & Advice

10. December 2016 22:45 by Aaron Medacco | 0 Comments

AWS Certified Solutions Architect - Associate

A little over a month ago, I decided that earning the Amazon Web Services Certified Solutions Architect - Associate certification was a great way to validate my expertise with Amazon's cloud computing platform. Those who know me know that while I come from a development background, I don't have a history as an IT professional or systems administrator. However, I rarely find myself wearing only the development hat, so being able to assume another role seemed like a great opportunity.

I set my exam date 30 days out from making my decision, and while I certainly was not "experienced" using Amazon Web Services at the time, I was confident in my ability to learn whatever was necessary quickly. What a ride. The last month has been an absolute grind of no sleep, red eyes, notes, training, videos, quizzes, whitepapers, labs, FAQs, blogs, and Mountain Dew (shower me, oh coder fuel). Last Monday, I passed with substantial time left, but definitely with my share of incorrect responses. Unfortunately, I didn't get to review what I missed after submission. The testing center only provides the percentages for how well you performed in each of the four areas the exam covers.

Anyway, I'm providing some pointers for anyone else attempting to get the AWS Solutions Architect Associate certification, especially those who do not come from a traditional IT or administrator background. Maybe someone can benefit from my experience.

1) Review the certification guide provided by Amazon Web Services here.

You should be able to check off most, if not all, of the requirements provided in their guide. The instructor-led training can be expensive for some and isn't necessary if you are good at self-learning. If you're a developer, brushing up on system architecture, networking, and security best practices will be helpful.

2) Set a date, but give yourself more than one month to prepare.

This is especially true if you are new to cloud computing. Unless you're an impatient wretch like me and are prepared to burn the midnight oil, I'd recommend spacing it out so you feel completely confident on exam day. There's a lot of content and frankly, a ton of reading, which will consume a lot of your time. I'd recommend 3 months if you want to have time for other things while still remaining dedicated to study. However, set a date so you stay disciplined.

3) Take advantage of online video training.

I used Pluralsight, although there are several e-learning resources designed for users seeking certification. If you choose Pluralsight as well, I recommend the following courses. They benefited me most at the time of this writing:

These provide a solid overview of Amazon Web Services, address the core services, and go over many of the important features.

4) Run through as many practice questions as possible.

Amazon Web Services offers a collection of example questions, which can be found here. Practice exams are available for each certification; they cost $20, are timed, and will give you a taste of what the real exam will be like. While I recommend taking one practice exam, take your results with a grain of salt. Don't let yourself get too comfortable just because you pass the practice exam. Keep practicing.

There are a few mobile apps available that can help you prepare, too. I used this one, although the number of questions the app has is limited. It's a good tool to use until you start memorizing the questions, at which point it loses its value. Worth a few bucks, though.

The best resource I found for practice questions was Cloud Academy. They provide a rich volume of questions perfect for exam preparation. They also offer video training, but I mainly stuck to the quizzes. What I enjoyed about Cloud Academy was the ability to filter quiz questions by service. For example, if you felt weak on Simple Storage Service (S3), you could start a quiz where you were asked only questions related to S3. Again, I wouldn't celebrate being able to perform well on each quiz. The question pool Cloud Academy pulls from definitely has a lot of "softball" questions that are easy and designed to throw beginners a bone. However, there isn't a better service I could find that offered a high volume of relevant questions in a timed environment.

5) Read the whitepapers and the FAQ for each service.

Not every service of Amazon Web Services will appear on the exam. However, since the vast majority of the questions are supposed to test your "ability to design and architect cloud solutions", you'd be doing yourself a disservice by not familiarizing yourself with all the available tools. The FAQs vary in length by service, but many of them can be read in 20-30 minutes. Several exam questions pull directly from information found in the FAQs, and I recommend going over them the night before the exam so they are fresh in your mind.

Concerning the whitepapers, I found that reading the ones outlined in the exam guide was enough. And while I don't want to discourage anyone from reading more of them, I believe your time would be better spent on the FAQs, quizzes, or hands-on practice. Again, I recommend having at least a high-level understanding of each service. There are a handful of services you should understand completely in order to be successful on the exam. They include the obvious ones, such as EC2, VPC, Route 53, S3, etc. I won't get more specific, since all exam participants are required to agree to the NDA.

Next up, I'm looking forward to preparing for the Developer and SysOps Administrator exams. Plus, there's a huge amount of new material from the recent AWS re:Invent 2016 event. Exciting stuff.

Hope this was helpful to anyone pursuing the certification.

Cheers!

Copyright © 2016-2017 Aaron Medacco