Migrating SQL Server Databases to Amazon RDS

14. March 2018 22:20 by Aaron Medacco | 0 Comments

If you're interested in moving an on-premises SQL Server database, or a customer-managed SQL Server database running on EC2, to RDS and need a simple method for doing so, then this post is for you. Not everyone is familiar with the AWS "lego-kit," and sometimes you just need to get things done. The following is meant to get your SQL Server data migrated without requiring a large time investment reading the AWS documentation.

Microsoft SQL Server

For this, I'm going to use the sample database backup found here. This is a small .bak file which, if you use SQL Server a lot, is something you're accustomed to working with.

As usual, please make sure you have installed and configured the AWS CLI on your workstation before invoking these steps. I won't be specifying the AWS region where these resources get provisioned, so you'll need to configure the CLI to place everything where you'd like or provide the appropriate region options throughout this process. Additionally, the IAM user I'll be using to invoke these commands has unrestricted administrator access to the account.

Creating an RDS instance:

  1. If you've already provisioned an RDS instance and just need to move the database to it, you can skip this. In my case, I ran the following to provision a new database instance:
    aws rds create-db-instance --db-instance-identifier sample-instance --allocated-storage 20 --db-instance-class db.t2.medium --engine sqlserver-web --master-username aaron --master-user-password password --db-subnet-group-name default-vpc-07e6y461 --no-multi-az --publicly-accessible --engine-version 14.00.1000.169.v1

    Note: You'll need to provide your own subnet group and your own username and password that's more secure than my demo values.

  2. This command creates a small RDS instance running SQL Server 2017 Web Edition in a non-Multi-AZ configuration using minimal resources and standard storage.
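
    Note: The instance can take several minutes to become available. Rather than polling the console, you can have the CLI block until the instance reports the "available" status (a minimal sketch, assuming the same instance identifier as the create command above):
    aws rds wait db-instance-available --db-instance-identifier sample-instance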

Creating and uploading your .bak file to S3:

  1. If you already have an S3 bucket and your .bak file stored within it, you can skip this. Otherwise, we need to create an S3 bucket:
    aws s3 mb s3://aarons-sqlserver-rds-demo

    Note: Of course, you'll need to choose your own bucket name that is unique and available. Remember to substitute your bucket name for mine on subsequent commands where appropriate.

  2. And to upload the object, I can navigate to the directory where my .bak file lives and invoke the following:
    aws s3 mv AdventureWorks2017.bak s3://aarons-sqlserver-rds-demo/AdventureWorks2017.bak

    Note: Swap in your own .bak file and bucket names if they differ from my example.

  3. Understand that this step may behave differently depending on the size of the .bak file you are attempting to upload. The high-level aws s3 commands perform multipart uploads automatically for large files, but for a very large backup you may need to tune the multipart settings or use another transfer method, as sketched in the note below.
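
    Note: A sketch of tuning the CLI's multipart settings before uploading; the 64MB values are arbitrary examples, and "aws s3 cp" is used here instead of "mv" so your local copy of the backup is kept:
    aws configure set default.s3.multipart_threshold 64MB
    aws configure set default.s3.multipart_chunksize 64MB
    aws s3 cp AdventureWorks2017.bak s3://aarons-sqlserver-rds-demo/AdventureWorks2017.bak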

Creating an IAM role granting access to our .bak file for restore:

  1. Create a file named iam-trust-policy.json with the following contents and save it in your working directory:
    {
        "Version": "2012-10-17",
        "Statement":
        [{
            "Effect": "Allow",
            "Principal": {"Service":  "rds.amazonaws.com"},
            "Action": "sts:AssumeRole"
        }]
    }
  2. Create another file named iam-permission-policy.json with the following contents and save it in your working directory:
    {
        "Version": "2012-10-17",
        "Statement":
        [
            {
            "Effect": "Allow",
            "Action":
                [
                    "s3:ListBucket",
                    "s3:GetBucketLocation"
                ],
            "Resource": "arn:aws:s3:::aarons-sqlserver-rds-demo"
            },
            {
            "Effect": "Allow",
            "Action":
                [
                    "s3:GetObjectMetaData",
                    "s3:GetObject",
                    "s3:PutObject",
                    "s3:ListMultipartUploadParts",
                    "s3:AbortMultipartUpload"
                ],
            "Resource": "arn:aws:s3:::aarons-sqlserver-rds-demo/*"
            }
        ]
    }

    Note: This policy assumes you do not require encryption support.
    Note: Remember to swap your bucket name in for mine.

  3. Create the IAM role for RDS by running the following:
    aws iam create-role --role-name rds-backup-and-restore-role --assume-role-policy-document file://iam-trust-policy.json
  4. Write down the ARN value for the role you just created. You'll need it when we add an option to our option group.
  5. Create an IAM policy which defines the necessary permissions to grant RDS by running the following:
    aws iam create-policy --policy-name rds-backup-and-restore-policy --policy-document file://iam-permission-policy.json
  6. Write down the ARN value for the policy you just created. It should be returned to you from the command output.
  7. Now we just need to attach the IAM policy to our IAM role:
    aws iam attach-role-policy --policy-arn <policy-arn> --role-name rds-backup-and-restore-role
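
    Note: To confirm the attachment succeeded, you can list the role's attached policies (a quick, optional sanity check):
    aws iam list-attached-role-policies --role-name rds-backup-and-restore-role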

Adding the appropriate option group to enable native backup and restore:

In order to restore our .bak file into RDS, we need to associate an option group with the instance that enables the native backup and restore functionality.

  1. To create an option group that does this, invoke the following command, modifying it for your specific version and edition of SQL Server:
    aws rds create-option-group --option-group-name sqlserver-web-backupandrestore --engine-name sqlserver-web --major-engine-version 14.00 --option-group-description "Allow SQL Server backup and restore functionality."
  2. Now we need to add an option for the native backup and restore. This is where you need to enter the role ARN you saved from earlier:
    aws rds add-option-to-option-group --option-group-name sqlserver-web-backupandrestore --apply-immediately --options OptionName=SQLSERVER_BACKUP_RESTORE,OptionSettings=[Name="IAM_ROLE_ARN",Value="<role-arn>"]
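    Note: Depending on your shell, the --options value may need to be quoted, since brackets are special characters in some shells. A quoted variant of the same command (a sketch, assuming the CLI's shorthand structure syntax) would look like:
    aws rds add-option-to-option-group --option-group-name sqlserver-web-backupandrestore --apply-immediately --options "OptionName=SQLSERVER_BACKUP_RESTORE,OptionSettings=[{Name=IAM_ROLE_ARN,Value=<role-arn>}]"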
  3. Finally, we need to modify our RDS instance to use the new option group we've setup:
    aws rds modify-db-instance --db-instance-identifier sample-instance --apply-immediately --option-group-name sqlserver-web-backupandrestore
  4. You may need to walk away for a few minutes and come back while RDS modifies the instance's option group. If you want to know when the new option group is active, you can run this command:
    aws rds describe-db-instances --db-instance-identifier sample-instance
    and make sure you see a status of "in-sync" for the option group added.
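
    Note: If you'd rather not scan the full describe output, a JMESPath query can pull out just the option group status (a sketch using the CLI's --query option):
    aws rds describe-db-instances --db-instance-identifier sample-instance --query "DBInstances[0].OptionGroupMemberships[].Status" --output text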

Restoring the .bak file to your RDS instance:

At this point, we can actually begin restoring our backup. You'll need to connect to your database instance, presumably with SQL Server Management Studio or another tool. Assuming you've defined the appropriate networking settings (for example, enabling public accessibility on the database instance) and security group rules (SQL Server listens on port 1433, so inbound traffic must be allowed there), you can connect using the endpoint for your database instance. If you don't know your database endpoint, you can find it by running the describe-db-instances command again:

aws rds describe-db-instances --db-instance-identifier sample-instance

and checking the address value for "Endpoint".
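
If you only need the endpoint address by itself, the CLI can extract it for you (a small sketch using the --query option, with the same instance identifier as before):

aws rds describe-db-instances --db-instance-identifier sample-instance --query "DBInstances[0].Endpoint.Address" --output text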

  1. Once you have connected to the database instance, run the following stored procedure (targeting the "rdsadmin" database is fine). Remember to swap your own values where appropriate:
    exec msdb.dbo.rds_restore_database @restore_db_name='AdventureWorks2017', @s3_arn_to_restore_from='arn:aws:s3:::aarons-sqlserver-rds-demo/AdventureWorks2017.bak';
  2. After running the stored procedure, your restore request will be queued. To see the progress of your request while it executes, you can provide your database name and run this:
    exec msdb.dbo.rds_task_status @db_name='AdventureWorks2017';
  3. Once the request has a status of "SUCCESS", your database should now be available for use.
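
    Note: If you'd rather work from the command line than Management Studio, the same stored procedures can be run with sqlcmd (a sketch; the endpoint shown is a hypothetical placeholder, so substitute your own endpoint and credentials):
    sqlcmd -S sample-instance.xxxxxxxxxxxx.us-east-1.rds.amazonaws.com,1433 -U aaron -P password -Q "exec msdb.dbo.rds_task_status @db_name='AdventureWorks2017';"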

If you'd like to read the documentation yourself, you can find the relevant material here and here. I thought it would be helpful to condense everything into a step-by-step post for those developers or DBAs still getting up to speed with AWS.

I also encourage readers to review the limitations when using native backup and restore for SQL Server on RDS, which you can find here. For instance, you can only restore databases that are 4 TB or less in size at the time of writing. If this solution doesn't work for you due to size limitations, or if you require the database to remain online during migration, you may want to check out AWS Database Migration Service as recommended by AWS.

Cheers!

New Pluralsight Course: Visualizing Data with Amazon QuickSight

20. January 2018 12:00 by Aaron Medacco | 0 Comments

I've recently completed another course for Pluralsight, this time for Amazon QuickSight. QuickSight is a business intelligence offering within the AWS suite that allows you to import your data and analyze it using dynamic visualization. It's a rather young service, competing with other big-name products like Power BI and Tableau. It'll be interesting to see how this service evolves in the coming years given the pace at which Amazon Web Services moves. And while there are some limitations to the product which I think will be addressed soon, it's a fast and easy data analysis tool to use, especially if you're an AWS customer who already stores their data within the Amazon cloud. 

In Visualizing Data with Amazon QuickSight, I assume the viewer has no experience with Amazon QuickSight or with data analysis at all. The course begins by covering the basics, such as account creation, setup, and user access management. From there, it covers how to connect or import your data to QuickSight, wherever it lives. This might mean a flat file you want to import ad-hoc style, objects in S3, a Redshift cluster (which served as the primary data source for the course), or a database stored on-premises or with another provider. Then, I walk through how data preparation is done in QuickSight, which is essentially the process of taking data in its unmodified, raw form and formatting it into a data set that will provide the most value in analysis. Naturally, data analysis finishes out the course, where we take a deep dive into the QuickSight analysis interface and explore the different visualizations and features available to us.

Visualizing Data with Amazon QuickSight

Pluralsight courses are a lot of work, but I'm very satisfied with how this one turned out. There are a lot of demos, but given it's a very visual tool, that's no surprise. Plus, "death by slides" is a real thing I've suffered through as a student myself. If you're curious about Amazon QuickSight and want to see how you might use it for your own data, go check it out.

Cheers!

AWS re:Invent 2017 - Day 4 Experience

7. December 2017 02:49 by Aaron Medacco | 0 Comments

The following is my Day 4 re:Invent 2017 experience. Missed Day 3? Check it out here.

The common theme of not waking up early continues. Missed the entire Werner Vogels Keynote, which is fine since from what others were saying it was a storm to get in. I didn't have it reserved anyways. AWS added it to the event catalog after I had already decided my schedule. Not sure why I thought you were just supposed to show up. 

First session of the day, AWS Database and Analytics State of the Union - 2017 (DAT201). This took place in the Venetian Theatre.

AWS re:Invent 2017

Love this venue for sessions.

I wasn't sure what to expect from a "State of the Union" session. For the most part, this was a combination of sales pitch, history lesson on the origins of some of the AWS database services, and explanation of miscellaneous features. After the explanation of what the RDS Multi-AZ feature does (really? who doesn't know what this is?), the session moved on to highlight the motivations for building Aurora and DynamoDB. Essentially, AWS wanted to combine the commercial-grade performance of products like Oracle and SQL Server with the low cost of MySQL, PostgreSQL, and MariaDB. The product of these efforts became Aurora.

AWS re:Invent 2017

Horrible quality pic. I don't know why.

After sharing some history, a few of the newest Aurora features came up, specifically Amazon Aurora Multi-Master and Amazon Aurora Serverless. Amazon Aurora Multi-Master allows you to bring the recovery time from a failed master instance down to almost nothing, around 100 ms. The single-region version of this feature is available in preview now, with the multi-region version available later in 2018. Amazon Aurora Serverless essentially gives you an on-demand database that scales for you and is aimed at applications with unpredictable workloads. Being serverless, the customer manages hardly anything.

The origins of DynamoDB came from a disaster affecting Amazon.com. I didn't write down the exact year, but basically Amazon.com was leveraging Oracle to power the online retail site; however, the site became unavailable during Christmas. The cause was traced back to limitations in Oracle, or in Amazon.com's implementation of it; I wasn't clear which. In response, Amazon built its NoSQL database service, DynamoDB, to handle the massive volumes of transactions it was experiencing.

AWS re:Invent 2017

DynamoDB. But does it work at scale? Haha.

The remainder of the session focused on general overviews of the rest of the data service catalog. Therefore, a lot of the session was review for anyone who regularly spends time learning Amazon Web Services. Several attendees started leaving during this time. There's a lot going on during re:Invent so while I understand time is precious during the week, I always stay for the whole session. Maybe it's just out of respect for the presenter. Either way, I did learn a few tidbits about Aurora and some origin history of some services I didn't know before.

AWS re:Invent 2017

Big Amazon Echo.

Grabbed a shuttle to the Aria to check out what was going on at the Quad. I was expecting it to be as large as the Venetian Expo. Boy, was that off the mark! The Quad was very underwhelming, especially after visiting the Expo. There was a small handful of booths, but nothing like the Expo, which had aisles and aisles full of them. I smiled when I saw the Certification Lounge there; it looked like a large-sized cubicle. At this point, it became clear to me that the Venetian was definitely the primary hub for re:Invent. Good to know for next year's hotel room reservations.

They did have a cool Lego section, though.

During my video filming of the re:Invent areas of the Aria, I got yelled at by some woman telling me I couldn't record. I waited until she stopped looking, and turned it back on after. What a joke! Now, I actually was told not to video record prior to this while in the Venetian casino, which while I'd argue is unenforceable, makes more sense to me. However, back in the Aria, what is there to film that can cause harm!? It's a bunch of nerds with laptops walking around banners, session rooms, and re:Invent help desks a quarter mile from the casino floor! Ridiculous. It's 2017, and there's a tech conference going on here. Are you going to watch every attendee's laptop, tablet and smartphone, too, because guess what, those devices can do video recording as well. Unenforceable and moronic. Anyways...

AWS re:Invent 2017

In case the re:Invent app fails you.

Returned to my room after grabbing a Starbucks and did some blogging until my next session at the Venetian, Taking DevOps Closer to the AWS Edge (CTD401). 

AWS re:Invent 2017

Emptier session. Was actually really good.

This session was possibly my favorite of the conference. And I certainly wasn't expecting that. The session title, in my opinion, is misleading, however. Now, the presenter did say that the same session took place last year and included a demo involving saving CloudFormation templates to CodeCommit and managing a delivery pipeline with CodePipeline to push modifications to a development CloudFront distribution, perform some testing, and then do the same to a production CloudFront distribution. That seems more DevOps to me. What we got instead was an in-depth overview of how to incorporate the edge services of AWS into application design and development, and how to use CloudFront.

AWS re:Invent 2017

More terrible quality pics.

AWS re:Invent 2017

Logic determining the TTL CloudFront uses.

Most of the session was a deep dive into CloudFront and how it works between your origin (EC2, S3, etc.) and your clients (users). The presenter explained how the TCP connections function, both from the user to the edge location and from the edge location to the origin, and provided some tips for keeping your cache-hit ratio high using headers like cloudfront-is-mobile-viewer to reduce variability. Plus, there were some cool examples of Lambda@Edge taking headers and modifying them in between.

AWS re:Invent 2017

Lambda@Edge examples.

I've not used CloudFront a lot, but I'm more confident about it after this session. A lot of people walked out throughout this session, probably hoping for something different. Can't say I wouldn't have wanted to do the same thing if I knew CloudFront inside and out already. Being a 400-level course, it was surprisingly easy to grasp, perhaps due to the presenter.

AWS re:Invent 2017

Lego blocks.

Back at the Bellagio, stopped in for some grub at the FIX Restaurant & Bar. Snatched a cocktail, salmon, and mashed potatoes.

AWS re:Invent 2017

9/10. Would have been 10/10 if the waitress had brought the mac n cheese I ordered. Maybe she didn't hear me? Don't know. The food coma was instant, so I took a power nap before going out to check out the re:Play party.

Which brings us to the re:Play party! AWS goes all out for this. You could only enter through Venetian Hall A, even though the party was behind the LINQ.

AWS re:Invent 2017

Oomce, oomce, oomce.

Food and drinks were included and the place was packed. It took place under a set of large tents, one totally dedicated to games like glow-in-the-dark pool, glow-in-the-dark ping pong, an adult-sized ball pit, putt-putt pool, batting cages, dodgeball, and more.

AWS re:Invent 2017

Guy got lost in the balls. They couldn't find him.

AWS re:Invent 2017

Another tent is where the rave was going on with DJ Snake.

AWS re:Invent 2017

Oomce, oomce, oomce.

AWS re:Invent 2017

Not sure what these were about.

And then a final tent was packed full of arcade style games. There were some other areas I didn't explore since the line was ridiculous or I wasn't clear how to get in. 

AWS re:Invent 2017

Ancient video games.

I didn't end up staying too long since everything had huge lines and I'm not one for live music anyways.

AWS re:Invent 2017

Walked back to the Venetian casino and played poker with other attendees from re:Invent. Lost another $300. What is going on, man! I'm just the worst, although I turned Aces up when a guy got a set. Understandable, but annoying. Came home with some Taco Bell (I know, classy) and turned in for the night.

Cheers!

AWS re:Invent 2017 - Day 3 Experience

5. December 2017 21:38 by Aaron Medacco | 0 Comments

The following is my Day 3 re:Invent 2017 experience. Missed Day 2? Check it out here.

Up and out early at 8:30 (holy ****!) and famished. Decided to check out the Buffet at the Bellagio. Maybe it was just me, but I was expecting a little bit more from this. Most things in Las Vegas are extravagant and contribute to an overall spectacle, but the Bellagio buffet made me think of Golden Corral a bit. Maybe it gets better after breakfast, I don't know.

AWS re:Invent 2017

AWS re:Invent 2017

The plate of an uncultured white bread American.

Food was pretty good, but I didn't grab anything hard to mess up. Second trip back, I grabbed some watermelon that tasted like apple crisp and ice cream. Not sure what that was about; maybe the staff used the same knife for desserts and fruit slicing. From what I could tell, half the patrons were re:Invent attendees, either wearing their badge or the hoodie.

Walked back to my room to watch the keynote by Andy Jassy, but only caught the last bit of it. After some difficulty getting the live stream to work on my laptop, I watched him announce the machine learning and Internet of Things services. Those aren't really my wheelhouse (yet?), but they seemed interesting nonetheless. Succumbed to a food coma afterwards for a short nap.

Headed over to the Venetian to go back to the Expo for a new hoodie and for my next breakout session. AWS was holding the merchandise hostage if you didn't fill out evaluations for breakout sessions, so I couldn't get the hoodie until after I came back post-session. Good to know for next year. Back where the session halls were, got in the Reserved line for the Optimizing EC2 for Fun and Profit #bigsavings #newfeatures (CMP202) session. Talked with a gentleman while in line about the new announcements, specifically the S3 Select and Glacier Select features. I wasn't clear what the difference was between S3 Select and Athena and neither was he. I'll have to go try it out for myself.

AWS re:Invent 2017

Awaiting new feature announcements.

AWS re:Invent 2017

Great speaker as always. AWS always has good speakers.

AWS re:Invent 2017

More talk about Reserved and Spot Instances.

The best thing about this session was the announcement of new features. The first was a really helpful feature AWS added to the Cost Explorer within the management console that gives instance recommendations based on your account's historical usage. Having a tool like this that does cost analysis and recommendations is great; it means I don't have to. I pulled up the SDK and AWS CLI references while he was demonstrating it, but couldn't find any methods where I could pull those recommendations using Lambda or a batch script. I figured it'd be useful to automate a monthly email or something that tells you that month's instance billing recommendations. Ended up talking to the speaker afterwards, who said it's not available yet, but will be in the months to come. Nice!

Second announcement was regarding Spot Instances and being able to hibernate instances when they're going to be terminated. The way this was described was that hibernation acts the same way as when you "open and close your laptop". So if you are using a Spot Instance set to hibernate, when that instance gets terminated in the event another customer bids higher or AWS adds it back to the On-Demand instance pool, it will save state to EBS and when you receive it back, it will pick up where it left off instead of needing to completely re-initialize before doing whatever work you wanted. 

T2 Unlimited was also covered, which essentially allows you to not worry so much about having the credits required for burst capacity on your T2 series of EC2 instances. The rest of the session covered a lot of cost optimization techniques that have been belabored to death: use Reserved Instances, use Spot Instances, choose the correct instance type for your workload, periodically check in to make sure you actually need the capacity you've provisioned, take advantage of serverless for cases where an always-on machine isn't necessary, and other tips of the "don't be an idiot" variety. Again, I must be biased since most of this information seems elementary. I think next year I need to stick to the 400-level courses to get the most value. That said, the presentation was excellent like always. I won't knock it just because I knew the information coming in.

Found the shuttle a short walk from the hall and decided to be lazy (smart) for once. Got back to the Bellagio for some poker before dinner and came out plus $105. During all the walks back from the Aria to the Bellagio, I kept eyeballing the Gordon Ramsay Burger across the street at Planet Hollywood, so I stopped in for dinner.

AWS re:Invent 2017

Pretty flashy for a burger place.

AWS re:Invent 2017

I ate it all...No, I didn't. But wanted to try out the dogs and the burgers.

For a burger and hot dog place, I'd give it a 7/10. It probably would be a bit higher if they had dill pickles / relish and, honestly, better service. You can imagine this was pretty messy to eat, especially the hot dog, so I asked one of the girls up front where the bathroom was to go wash my hands. The one across the hall was out of order (go figure), so I had to go to the one out through part of the casino and next to P.F. Changs. I think the tables next to me thought I just walked out without paying. Heard them say "There he is." when I returned. Really? Do I look like a criminal? Yeah, I came to Vegas for a full week to rip off a burger joint.

Cheers!

AWS re:Invent 2017 - Day 1 Experience

1. December 2017 17:26 by Aaron Medacco | 0 Comments

Welcome to the first in a series of blog posts detailing my experience at AWS re:Invent 2017. If you're someone who is considering going to an AWS re:Invent conference, hopefully what follows will give you a flavor of what you can expect should you choose to fork over the cash for a ticket. The following content contains my personal impressions and experience, and may not (probably doesn't?) reflect the typical experience. Also, there will be some non-AWS fluff as well, since I had not been to Las Vegas before.

AWS re:Invent 2017

My adventure starts at about midnight. Yes, midnight. Living in Scottsdale, AZ, I figured, "Why not just drive instead of fly? After all, it's only a 6-hour drive and there won't be any traffic in the middle of the night." While that was true, what a mistake in retrospect. Arriving in Las Vegas with hardly any sleep after the road trip left me in pretty ragged shape for Monday's events. Next year, I'll definitely be flying and will get there on Sunday so I can get settled prior to Monday. I actually arrived so early that I couldn't check into my room and needed to burn some time. What better activity to do when exhausted than sit down at the poker tables? Lost a quick $900 in short order. Hahaha! Truth be told, I got "coolered" back to back, but I probably played bad, too.

Once I got checked into my room at the Bellagio around 9:00am, I headed back to the Aria to get registered and pick up my re:Invent hoodie. Unfortunately, they didn't have my size, only had up to a Small. I couldn't help but smile about that. I ended up going to the Venetian later to exchange my Small for a Medium. Anyways, got my badge, ready to go! Or was I?

By the way, kudos to the Bellagio for putting these in every room. Forgot my phone charger. Well, the correct phone charger at least...

 AWS re:Invent 2017

...except it didn't have a charger compatible with my Samsung Galaxy S8. Kind of funny, but I wasn't laughing. Alright, maybe a little. I would end up getting one at a phone store in one of the malls on the Strip. Oh yeah, and I also forgot to buy a memory card for my video recorder prior to leaving. Picked up one of those from a Best Buy Express vending machine. Vegas knows.

By this time I was crashing. Came back to my room, fell asleep, and missed 2 breakout sessions I was reserved for. Great job, Aaron! Off to a great start! 

Walked to the Aria to go check out the Certification Lounge. They had tables set up, food and drink, and some goodies available depending on what certifications you'd achieved. The registration badges have indicators on them that tell people whether you're AWS certified or not, which they use to allow or deny access. I didn't end up staying too long, but there were a decent number of attendees with laptops open, working and networking. Here are some of the things I collected this year by walking around to the events:

AWS re:Invent 2017

The re:Invent hoodie picked up at Registration (left) and the certification t-shirt inside the Certification Lounge (right).

AWS re:Invent 2017

Water bottle and AWS pins were given away at the Venetian Expo (top-left), badge and info packet at Registration (right), and the certification stickers at the Certification Lounge depending on which ones you've completed (bottom-left).

Headed over to the MGM Grand for my first breakout session, GPS: Anti Patterns: Learning From Failure (GPSTEC302). Before I discuss the session, I have to talk about something I severely underestimated about re:Invent. Walking! My body was definitely NOT ready. And I'm not an out-of-shape or big guy, either. The walking is legit! I remember tweeting about what I imagined would be my schedule weeks before re:Invent and Eric Hammond telling me I was being pretty optimistic about what I would actually be able to attend. No joke. Okay, enough of my complaining.

AWS re:Invent 2017

Waiting for things to get started.

AWS re:Invent 2017

Session about half-full. Plenty of room to get comfortable.

AWS re:Invent 2017

Presenter's shirt says, "got root?". Explaining methods for ensuring account resource compliance and using AWS account best practices when it comes to logging, backups, and fast reaction to nefarious changes.

This was an excellent session. The presenters were fantastic and poked fun at mistakes they themselves have made, or that customers they've talked to have made, regarding automation (or lack thereof), compliance, and just overall bone-headedness (is that a word?). The big takeaways I found were to consider using services like CloudWatch, CloudTrail, and Config to monitor and log activity in your AWS accounts so you become aware when stupid raises its ugly head. They threw out questions like, "What would happen if the root account's credentials were compromised and you didn't know about it until it was too late?" and "You have an automated process for creating backups, but do you actually test those backups?". From this came suggestions to regularly store and test backups in another account in case an account gets compromised, and to use things like MFA, especially for root and privileged users.

Additionally, the presenters made a good argument for not using the management console for activities once you become more familiar with AWS, particularly if you're leveraging the automation tools AWS provides like OpsWorks and CloudFormation as that kind of manual mucking around via the console can leave you in funny states for stacks deployed with those services. Along those lines, they also suggested dividing up the different tiers of your application infrastructure into their own stacks so that when you need to make changes to something or scale, you don't end up changing the whole system. Instead, you only modify or scale the relevant stack. Overall good session. If they have it again next year, I would recommend it. You'll get some laughs, if nothing else. The guys were pretty funny.

Once out, I had a meeting scheduled to talk with a company (presumably about upcoming Pluralsight work) at the Global Partner Summit Welcome Reception. Now, I'll admit I got a little frustrated trying to find where the **** this was taking place! AWS did a great job sending lots of guides with re:Invent flags everywhere to answer questions and direct attendees to their events, and these guys were godsends every time except when it came to finding this event. I think I just got unlucky with a few that were misinformed.

AWS re:Invent 2017

These guys were scattered all over the strip and inside the hotels. Very helpful!

First, I was told to go to one of the ballrooms. Found what appeared to be some kind of presenter's registration there. Then, found another guide who said to go to the MGM Grand Garden Arena. Walked over there, total graveyard, and ironically, a random dude there who wasn't even one of the re:Invent guides told me where it actually was. He also said, "Oh yeah, and unless you want to be standing in line all night, you might want to reconsider." It was late enough at this point that I figured I'd just head back to the Bellagio for a much needed poker session, so that's what I did. However, on the way back, holy ****, he was right. I've never seen a line as long as the one to get into the GPS Welcome Reception in my life. It went from the food court, through the entire casino, out of the casino, and further back to I couldn't tell where. Apparently, I was the only one who missed the memo, since everyone else knew where to go, but still, that line.

Long hike back to the Bellagio, played poker for about 3 hours, lost $200 (man, I suck), and on my way back to my room discovered I didn't eat anything all day. LOL! Picked up a couple pizza slices and crashed for the night. A good night's sleep? Yes, please. Tomorrow would be better.

Cheers!

Copyright © 2016-2017 Aaron Medacco