19 Apr 2017
Let's quickly look back at the last few years of computing evolution, and think about what our next move is.
If you have been around the IT industry for more than a few years, then you might be well aware of how fast things are changing. No doubt, it is so much easier to spin up servers and deploy applications than it was half a dozen years ago. Not only is it faster, but also cheaper, more reliable, and more efficient. Let's do a quick rundown of this short history of computing.
Physical Computing - AKA "The Dark Ages"
In the not-so-distant past, if you or your company wanted to deploy an application or run a service, then you had to go out and buy an actual physical server. I remember doing exactly this as recently as 6 years ago. If a customer wanted to set up an internal application, like a SharePoint site for example, then I would spec what they needed and have our purchasing department order the hardware. The hardware would arrive in 4-8 weeks, and I would assemble it, install the OS, run the updates, and install all the required libraries and components for the application to run. Only then could I start working on the actual application that the customer hired us to build.
It was a frustrating, expensive, and time-consuming process. Scaling was difficult, because it typically entailed buying more hardware, and efficiency was non-existent.
Virtualization to the rescue - AKA "The Renaissance"
Things changed when virtualization became popular, with the advent of tools like VMware and Hyper-V. Organizations were able to provision resources much faster and allocate these resources according to the needs and requirements of their applications. Virtualization also meant that resources could be added, removed or scaled as those needs changed. It was a complete game-changer.
With that being said, as the pace of application development increased, deploying these applications on virtual machines was lacking in many ways. For one, virtualization still had the overhead of the hypervisor and VM guest operating systems. You still had to go through a lot of software installs and updates. What worked on one VM on a specific hypervisor might not work elsewhere. This prevented applications from being very "portable".
Containerization - AKA "The Modern Era"
Container technology, led by companies like Docker, has alleviated most of the limitations mentioned earlier. A container, much like a shipping container, standardizes the process by taking an application and everything that the application needs to run and wrapping it in a single unit. This lets these "contained" applications run in any environment, regardless of the underlying infrastructure. Containers are very portable and lightweight, and they are very efficient because they share the host's resources, as opposed to VMs, which each require an entire guest operating system.
The power of containers is in standardization. I like to think of it in terms of shipping containers. If I wanted to import some product from Asia, then all I have to do is get that product in whatever quantity I want, find a shipping container, and fit my product into that container. The next step would be to find a shipping carrier and tell them where I want that container delivered, and it will be transported along with many other containers of the exact same size, albeit with different contents.
It is important to note that containers still need physical infrastructure to run on, but with the availability of cloud computing services like AWS and Azure, one no longer needs to worry about managing hardware.
What Comes After Containerization?
Containerization is hot right now, but it would be short-sighted to assume that this evolution will stop here. I am sure that one day we will write about how inefficient and time-consuming containerization was!
If I had to venture a guess, I would say that serverless computing will be the next big thing.
With serverless, one no longer has to worry about the underlying resources. Amazon is leading the field with their Lambda service. Lambda lets you run your code without provisioning or managing any of the underlying infrastructure. The deployment model becomes simply "deploying your code", not deploying VMs or containers. You can instruct Lambda to run your code in response to certain events, as in "if you get this HTTP request, then change this value in the database table or modify this object in S3". As a result, you can build entire applications without provisioning resources.
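To make this concrete, here is a minimal sketch of what a Python Lambda handler looks like. The event shape (an API Gateway-style HTTP request with a JSON body) and the response fields are illustrative assumptions; a real function would typically also talk to a service like DynamoDB or S3 at the commented spot.

```python
import json

def lambda_handler(event, context):
    # Lambda invokes this function with the triggering event, e.g. an
    # HTTP request forwarded by API Gateway. No servers to manage.
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")

    # In a real function, this is where you might update a DynamoDB
    # item or modify an S3 object in response to the request.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

You pay only while the function runs, and Lambda scales the number of concurrent invocations for you.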
Amazon seems to be investing a lot of effort into this serverless model, and other cloud providers are following suit. It will be very interesting to see how this plays out over the next few years.
12 Apr 2017
Earlier this year (2017), I decided that it was time for me to stop ignoring the biggest player in IT: AWS (Amazon Web Services).
Generally speaking, I am not a big fan of chasing the latest tech trend because “all the cool kids are doing it”. Yes, one needs to stay relevant in this very dynamic industry, but there is also the danger of never becoming an expert if one keeps chasing these trends. In the case of AWS and cloud computing, however, this is not a trend, but the new normal. Ignorance here will make you irrelevant, very fast.
Although I have been using certain AWS-backed services at work, the extent of my knowledge and experience with the different AWS services was fairly limited. I knew that Amazon offered a rich ecosystem of storage, compute, automation and infrastructure technologies, so after doing some research, I decided that the best approach would be to pursue the AWS Solutions Architect (Associate) certification.
Certifications are a controversial topic in IT. Some argue that obtaining a certification makes you more marketable, while others think that they could have a negative impact by giving the impression that you are compensating for a lack of experience. Regardless of where you stand on this issue, my personal experience with certifications has been a very positive one throughout my career. They have definitely opened many doors for me and set me apart from other candidates.
So why should you pursue AWS certifications?
1. AWS is Quickly Becoming the Gold Standard of the Cloud
AWS is leading the pack in almost every aspect. According to Gartner, Amazon’s cloud is 10 times bigger than its next 14 competitors, combined! This is bad news for the folks at Azure and Google Cloud Platform but it is great news for you.
Whether you’re a web developer, a database admin, a system admin, an IoT developer, a Big Data analyst, an AI developer (and the list goes on and on), your life will be made much easier if you take advantage of Amazon’s platform. Their offerings touch almost every aspect of technology, and discussing them would be outside the scope of this article. They are constantly adding more offerings and innovating in a way that is leaving the competition in the dust.
Gartner’s famous Magic Quadrant report has this handy graph that shows AWS leading in every aspect of innovation and execution:
2. AWS Certifications Are Feasible and Within Reach
Unlike other vendors, Amazon offers a realistic certification path that does not require highly specialized (and expensive) training to start. I am not saying that it is very easy to get certified, but you won’t have to quit your job and pay for expensive training to get your first AWS certification.
As of early 2017, AWS offers 3 tiers:
1. Associate tier:
- Certified Solutions Architect Associate
- Certified Developer Associate
- Certified SysOps Administrator Associate
2. Professional tier:
- Certified Solutions Architect Professional
- DevOps Professional
3. Specialty tier:
- Advanced Networking
- Big Data
The most common approach is to start with the Certified Solutions Architect Associate. It is a great way to get familiar with the AWS ecosystem and core services. An associate certificate is also a prerequisite for the professional and specialty exams: AWS requires the Solutions Architect Associate before you can take the Solutions Architect Professional exam, and either the Developer or SysOps Administrator Associate before you can sit for the DevOps Professional exam.
As far as training goes, the best resource by far is A Cloud Guru. I passed all three associate exams by relying mainly on their excellent courses. Ryan Kroonenburg and the rest of the A Cloud Guru team provide excellent training for AWS, Docker, and other cloud technologies, and their courses are very affordable and unmatched in quality and content: https://acloud.guru
Self-learners rejoice! With a bit of effort and discipline, you can become very proficient. Amazon also offers a free tier account so you can use most of their services for a year for free. The hands-on experience is crucial in your learning journey.
3. AWS Skills Are in High Demand and Pay Top Money
According to Forbes, these are the top paying certifications for 2016:
Need I say more?
With that being said, please remember that simply getting the AWS Solutions Architect certification DOES NOT automatically mean that you will be making the annual salary indicated in the table above. Many other factors are at play here, including your other skills, experience, geographic location, etc. The point is, proving to potential (or existing) employers that you are competent in using Amazon’s cloud offerings will have a great positive impact on your career.
About a week ago, I passed my AWS SysOps Administrator exam after passing both the Solutions Architect and the Developer exams. I am currently exploring opportunities, so this will be the true test of whether this effort will pay off. Regardless of what the end result is, I feel that I have truly gained a unique set of skills that will definitely be very handy in my own side projects as well as future career opportunities.
Call To Action
Sign up for a free AWS account at: https://aws.amazon.com.
Explore AWS courses on Udemy and A Cloud Guru.
Reach out to me if you have any questions or need advice on how to advance your AWS skills.
31 Jan 2017
Kubernetes (pronounced “koo-burr-NET-eez”) is probably the most popular container orchestration engine out there. A natural follow-up question is: "what the hell is a container and what does container orchestration mean?"
Containers are, at a basic level, individual lightweight standalone packages of software that include everything the software needs to run. Containers evolved from virtual machines, and the effectiveness of containerized software lies in the fact that it is standardized and can run on any infrastructure.
Container orchestration is the process by which one deploys and manages these individual containers. There are several popular tools out there, like Docker Swarm and Mesos/Marathon, but Kubernetes is slowly becoming the go-to container orchestration engine.
Why Should You Care About Kubernetes?
- It is one of the most popular projects on GitHub, and gets the most activity
- It has a huge active community, so whatever issue you might run into, someone out there might be able to help
- It was developed by Google
- It supports a wide spectrum of languages and frameworks, including Java (with Spring), .NET and .NET Core, Go, Ruby and more
- It is superb at scaling
With that being said, Kubernetes is not necessarily a straightforward tool. Installation instructions might be different for each OS and the setup is a little complex for the beginner.
A great place to start is their official tutorials and their user guide.