Monday, April 24, 2017

My experience taking Splunk architecture certification exam

I recently took the Splunk Architecture Certification exam. This post details some of my thoughts, opinions, and experiences from taking it.

Prerequisites 

Almost everyone who registers for the Splunk Architecture Certification exam will have done so through employer sponsorship; otherwise, it is prohibitively expensive to take all the courses and pay for the exam.

In order to take the Architect exam, you need to take and pass a series of courses before Splunk will even let you register. The following is the list of prerequisites.
  • Using Splunk
  • Searching and Reporting with Splunk
  • Creating Splunk Knowledge Objects
  • Splunk Administration
  • Advanced Dashboards and Visualizations
  • Architecting and Deploying Splunk
Through my previous Splunk experience, I was able to get some of the requirements waived and initially took the Power User certification exam, having only taken the Creating Splunk Knowledge Objects course. The Power User certification was surprisingly difficult. However, you have three opportunities to pass, as with the Splunk Admin exam, and both exams are free. Given the low stakes and the ability to retake the exam three times, these certifications are not a good indication of skill level.

The most useful courses to prepare for the Architect exam are the following.
  • Creating Splunk Knowledge Objects
  • Splunk Administration
  • Advanced Dashboards and Visualizations

Now onto the good stuff.

Architect exam 

The Architect exam gives test takers up to 24 hours to complete the tasks. Most will finish in far less time; nonetheless, it took me the full 24 hours, though that includes time to eat and a good night's sleep.

There is more than one way to do things in Splunk, and that applies to the Architect exam. Your approach may not be best practice, ideal, or the way you normally work, but as long as it works, you should be fine. Keep in mind you only have 24 hours to complete the exam.

Quick and dirty is better than perfect because it lets you maintain momentum and finish the exam. You may be used to doing things your own way, or by the book, but sometimes it is okay to configure through the GUI and call it a day. Sometimes you do not need to parse all the fields, only those necessary to build your dashboard.
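As a sketch of what "only the necessary fields" can mean in practice: a single props.conf extraction for just the fields a dashboard uses, instead of full field parsing. The sourcetype and field names below are made up for illustration.

```
# $SPLUNK_HOME/etc/system/local/props.conf (or inside an app's local/)
# Hypothetical sourcetype; extract only the two fields the dashboard needs.
[my:web:access]
EXTRACT-status_and_uri = ^\S+\s+\S+\s+(?<status>\d{3})\s+(?<uri>\S+)
```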

Preparing for the exam

Many test takers get stuck on the step where they need to script the installation of universal forwarders. I recommend using an existing script, or writing one yourself, and testing it before the exam. During the exam, my initial script hung, and I fell back to a backup script to get through this step.
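A minimal starting point, assuming a Linux host and a tarball install; the tarball name, deployment server address, and credentials are placeholders you would substitute. Writing the script to disk ahead of time lets you syntax-check and rehearse it before exam day.

```shell
#!/bin/sh
# Save a universal forwarder install script so it can be checked in advance.
cat > install_uf.sh <<'EOF'
#!/bin/sh
set -e
# Placeholders -- substitute the real tarball name and deployment server.
UF_TARBALL="splunkforwarder-x.y.z-Linux-x86_64.tgz"
DEPLOY_SERVER="deploy.example.com:8089"

tar -xzf "$UF_TARBALL" -C /opt
/opt/splunkforwarder/bin/splunk start --accept-license --answer-yes --no-prompt
/opt/splunkforwarder/bin/splunk set deploy-poll "$DEPLOY_SERVER" -auth admin:changeme
/opt/splunkforwarder/bin/splunk restart
EOF
chmod +x install_uf.sh

# Syntax-check the script before relying on it.
sh -n install_uf.sh && echo "script OK"
```

A quick `sh -n` catches quoting and heredoc mistakes that would otherwise surface mid-exam.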

Although I did not do this myself, it is also a good idea to set up a couple of AWS or Azure instances and practice building a Splunk environment from the ground up with search heads, indexers, and forwarders. Note: start from bare instances, meaning no AMIs or pre-built images.

Exam time 

I found it extremely helpful and time-saving to manage access through SSH keys so I didn't have to type a password every time I logged onto my instances.
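For example, generating one key pair up front and pushing it to each instance; the key file name and hostnames below are placeholders.

```shell
# Generate a passwordless key pair once (file name is arbitrary).
ssh-keygen -t ed25519 -f exam_key -N "" -q

# Then install the public key on each exam instance, e.g.:
#   ssh-copy-id -i exam_key.pub ec2-user@indexer1.example.com
# After that, log in without typing a password:
#   ssh -i exam_key ec2-user@indexer1.example.com
ls exam_key exam_key.pub
```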

The exam does NOT test clustering, so there is no need to set up search head clusters or indexer clusters. You will, however, need to know how to set up distributed search across peers, i.e., more than one indexer.
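As a sketch, distributed search comes down to listing each indexer as a search peer on the search head, either in distsearch.conf or via the `splunk add search-server` CLI; the hostnames below are placeholders, and peer credentials still need to be exchanged (e.g. with the CLI's `-remoteUsername`/`-remotePassword` options).

```
# $SPLUNK_HOME/etc/system/local/distsearch.conf on the search head
[distributedSearch]
servers = https://indexer1.example.com:8089,https://indexer2.example.com:8089
```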

There's no shame in doing things through the GUI, unless the exam specifically calls for another method. =)
I am accustomed to making adjustments directly in the .conf files and appifying those configs. However, done is better than perfect, and I most likely could have saved a couple of hours had I followed this advice.
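For reference, "appifying" a config just means placing it under its own app directory instead of system/local, so it is portable and easy to deploy; a minimal hypothetical app (name invented here) looks like:

```
$SPLUNK_HOME/etc/apps/exam_configs/
├── local/
│   ├── inputs.conf
│   └── props.conf
└── metadata/
    └── local.meta
```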

It is worth reserving an hour before the end time to go over everything and make sure you have all bases covered. Some parts are tricky, and you can easily overlook something small, as I did.

Summary

Overall, for those with hands-on experience with Splunk, I recommend going for the Architect certification. The exam truly tests your ability to set up a small Splunk environment and lets you demonstrate your knowledge from installation, to search, to creating dashboards. If I had to redo the exam, I would use the GUI more.


Friday, April 7, 2017

Putting UniFi controller into the cloud

I have finally decided to experiment with and learn about the cloud. Putting my UniFi Controller into the cloud was a great project for getting hands-on experience with moving servers and infrastructure off-prem.

There are a variety of cloud offerings out there, including ones from Amazon, Microsoft, Google, and DigitalOcean. I chose Amazon's AWS because there is good documentation on installing the UniFi controller using their free tier.

Amazon offers a free tier that lets users learn and experiment with AWS offerings for approximately one year on an EC2 t2.micro instance, which is sufficient to run the controller.

Ubiquiti has a well-written article on their website that is straightforward to follow.

https://help.ubnt.com/hc/en-us/articles/209376117-UniFi-Install-a-UniFi-Cloud-Controller-on-Amazon-Web-Services


But wait, there is more.

For those who own their own domains, you can configure DNS to point to your new AWS instance: for the subdomain you choose, add a DNS A record pointing to the public-facing IP of your EC2 instance.
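In zone-file terms, that A record is a single line; the domain and IP below are examples only (203.0.113.10 is a documentation address).

```
; A record mapping the subdomain to the EC2 instance's public IP
unifi.example.com.    300    IN    A    203.0.113.10
```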


But wait, there is even more.

When browsing to the new EC2 instance, I was initially presented with a certificate warning, since the self-signed certificate was neither trusted nor chained to a root certificate in my browser's store. For AWS-hosted websites and instances, Amazon offers AWS Certificate Manager, which issues TLS certificates for free for use with AWS services.

*Note: because load balancers need to distribute traffic across multiple IPs, it is not possible to associate an Elastic IP with an Elastic Load Balancer.

Since I was annoyed at my browser always warning me about my "unsecure" connection, I decided to use AWS Certificate Manager. This required me to place my EC2 instance behind an Elastic Load Balancer, which then presents my subdomain's certificate to users who navigate to the subdomain. To make this work, I needed to modify my DNS records to alias my subdomain to the Elastic Load Balancer's DNS name.

I also needed to create listeners and a target group to forward the requests.
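As a rough sketch of that step with the AWS CLI (the names, ARNs, and IDs below are placeholders, and the UniFi controller's 8443 HTTPS port is assumed):

```
# Hypothetical names/ARNs -- substitute your own VPC, instance, and certificate.
aws elbv2 create-target-group --name unifi-tg --protocol HTTPS --port 8443 \
    --vpc-id vpc-0123456789abcdef0
aws elbv2 register-targets --target-group-arn <tg-arn> \
    --targets Id=i-0123456789abcdef0
aws elbv2 create-listener --load-balancer-arn <elb-arn> --protocol HTTPS --port 443 \
    --certificates CertificateArn=<acm-cert-arn> \
    --default-actions Type=forward,TargetGroupArn=<tg-arn>
```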

After this project, I now have a UniFi controller that I can access anywhere on the internet, 24/7, over a TLS-encrypted session with a valid certificate signed by Amazon.