Saturday, October 14, 2017

Credit Freezes

When freezing your credit files, remember to place a freeze with each of the following agencies.

The three major credit bureaus

- Experian
- Equifax
- TransUnion

In addition, place freezes with these smaller bureaus
- Innovis
- SageStream

References:

https://krebsonsecurity.com/2015/06/how-i-learned-to-stop-worrying-and-embrace-the-security-freeze/

Saturday, October 7, 2017

Rewriting indexes.conf for volume-based definitions

At some point, when you move from one-tiered storage to two-tiered storage in Splunk, with hot/warm buckets on fast storage (SSD) and cold buckets on slow storage (HDD), you may need to rewrite your indexes.conf.

Rewriting your indexes.conf is a fairly easy exercise, but it can go disastrously wrong. Remember that thawed storage (thawedPath) cannot reference volumes, so double-check that the same data locations are still referenced. I also recommend reworking the Splunk internal indexes (e.g., _internal, _telemetry) to reference volumes.
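As a sketch, a two-tier layout might look like the following. The volume names, paths, and size limits are placeholders for illustration; note that thawedPath keeps an absolute path rather than a volume reference.

```ini
# indexes.conf -- illustrative two-tier volume layout (paths/sizes are placeholders)
[volume:hot]
path = /mnt/ssd/splunk
maxVolumeDataSizeMB = 500000

[volume:cold]
path = /mnt/hdd/splunk
maxVolumeDataSizeMB = 2000000

[main]
homePath = volume:hot/defaultdb/db
coldPath = volume:cold/defaultdb/colddb
# thawedPath cannot reference a volume
thawedPath = $SPLUNK_DB/defaultdb/thaweddb
```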

When rewriting your indexes.conf, I recommend placing the index cluster into maintenance mode to prevent buckets from moving in case there is a bug in your conf file. Then monitor the index cluster for unusual activity once the modified indexes.conf is deployed. Anomalous activity can include the number of fixup tasks increasing drastically; in our situation, the count climbed past 10,000 tasks.

If you see any of the following, something has likely gone wrong.

- Data may not be searchable temporarily
- Search factor may not be met
- Replication factor may not be met
- High number of fixups to meet search and replication factor

In my situation, at the OS level a symlink resolved to a different location under the new indexes.conf, resulting in a high number of fixups and Splunk not seeing the data.


Wednesday, August 23, 2017

Equifax and credit freezes

Equifax is one of the biggest pains when it comes to credit freezes.

When you have a credit freeze in place with Equifax, pulling a credit report via annualcreditreport.com becomes a chore. The site will require offline verification before you can view and audit your report.

Furthermore, lifting credit freezes involves the same hassle: it requires an offline submission. Luckily, I found a hotline (888-298-0045) you can call to get your freeze lifted.

Wednesday, August 2, 2017

My experience taking FOR610 - Malware Analysis Training

I recently took SANS FOR610 with Lenny Zeltser.

SANS courses are typically very expensive; I would not recommend paying full price if it is out of pocket. There are other resources online, and books, that deliver more bang for the buck. Many employers will pay for the course, however, because few other options deliver as much practical knowledge within the span of one week, and many of the concepts you learn can be applied immediately in your job.

The FOR610 course has recently been rewritten, and the authors have done a good job of keeping the material up to date. Fundamental reverse engineering and malware analysis techniques have stayed the same. However, I was surprised by the number of changes in the available tools. A major change was the use of x64dbg as the debugger over the classic OllyDbg.

Prerequisites for the course

- Laptop
- Ethernet port
- VMware Pro
- 8 GB RAM
- 128 GB SSD

You do not need prior programming experience to take this course. However, I recommend that anyone taking it have at least one semester of a computer science or programming course. Knowing basic programming concepts such as switch statements, if statements, arrays, and API calls will help. To get the most out of the course you should also be able to read basic assembly; knowing registers such as EDI, EAX, ESP, AL, and EBP greatly helps on certain days.

I went into this course without a developer or programming background, and without doing malware analysis or reverse engineering as part of my job.

Day 1
Overview of the knowledge domains of malware analysis, then into dynamic analysis. There is also some basic static analysis, such as using strings or pestudio.

Day 2
An intensive day on assembly: what control flow and conditional statements look like at the assembly level.

Day 3
Analyzing PDF and Word attachments, and deobfuscating JavaScript.

Day 4
Unpacking malware and using a debugger to dump it from memory.

Day 5
Anti-analysis techniques implemented by malware, and the different tricks malware authors use to detect virtual environments.

Day 6
Capture the flag. This was a good exercise in putting the concepts we learned into practice. It is not a necessary day since no new material is covered; however, it is highly recommended.

Things missing from the course

This is not a hardcore reverse engineering course. Kernel debugging is not covered, nor is ARM reverse engineering and analysis. Importing symbols also isn't covered.

Overall
I recommend this course to anyone who does incident response, reverse engineering, or malware analysis. The course has a good mix of dynamic and static techniques that will help improve your skills. I do recommend that people have a basic programming background and/or have done malware analysis before. The learning doesn't stop once the course ends: the authors have packed in more material than there is time to present, and there are exercises in the appendix to practice on your own time. I would have liked SANS to modify the course to include one or two days of extended hours; there is simply more material than there is time to cover on some days.

Resources for additional learning
Malware Unicorn's RE101 course on her website provides very good material and great graphics.
Practical Malware Analysis
Mandiant's Flare challenges

Monday, April 24, 2017

My experience taking Splunk architecture certification exam

I recently took the Splunk Architecture Certification exam. This details some of my thoughts, opinions, and experiences in taking the exam.

Prerequisites 

Almost everyone who registers for the Splunk Architect certification exam will have done so through an employer sponsorship. Otherwise, it becomes prohibitively expensive to take all the courses plus pay for the exam.

In order to take the Architect exam you need to take and pass a series of courses before Splunk will even let you register. The following is a list of the required prerequisite courses.
  • Using Splunk
  • Searching and Reporting with Splunk
  • Creating Splunk Knowledge Objects
  • Splunk Administration
  • Advanced Dashboards and Visualizations
  • Architecting and Deploying Splunk
Through my previous Splunk experience, I was able to get some of the requirements waived and initially took the Power User certification exam, having only taken the Creating Splunk Knowledge Objects course. The Power User exam was surprisingly difficult. However, you have three opportunities to pass it, and the same applies to the Splunk Admin exam. These exams are free, but given the limited stakes and the ability to retake them three times, these certifications are not a good indication of skill level.

The most useful courses to prepare for the Architect exam are the following.
- Creating Splunk Knowledge Objects
- Splunk Administration
- Advanced Dashboards and Visualizations

Now onto the good stuff.

Architect exam 

The Architect exam gives test takers up to 24 hours to complete the tasks. Most will finish in well under 24 hours. Nonetheless, it took me the full 24 hours, though that includes time to eat and a good night's sleep.

There is more than one way to do things in Splunk, and this applies to the Architect exam. Your approach may not be 'best' practice, ideal, or the way you usually do things, but as long as it works you should be good. Keep in mind you only have 24 hours to complete the exam.

Quick and dirty is better than perfect because it allows you to maintain momentum and complete the exam. You may be used to doing things your own way or according to best practices, but sometimes it is okay to configure through the GUI and call it a day. Sometimes you do not need to parse all the fields, only those necessary to create your dashboard.

Preparing for the exam

Many test takers get stuck on the step where they need to script the installation of universal forwarders. I recommend using an existing script, or writing one yourself, and testing it before the exam. During my exam, my initial script hung and I used a backup script to get through this step.
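As a rough sketch of what such a script might look like: the tarball name, install path, and deployment-server address below are all placeholders to adapt and test before exam day. It defaults to a dry run that only prints the commands; set DRY_RUN=0 to actually execute them.

```shell
#!/bin/sh
# Sketch of a scripted universal forwarder install (Linux).
# All names below are placeholders -- adjust for your environment.
set -eu
DRY_RUN="${DRY_RUN:-1}"

UF_TGZ="splunkforwarder-x.y.z-linux-x86_64.tgz"   # placeholder package name
INSTALL_DIR="/opt"
DEPLOY_SERVER="deploy.example.com:8089"           # hypothetical deployment server

run() {
    # Print the command in dry-run mode; execute it otherwise.
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run tar -xzf "$UF_TGZ" -C "$INSTALL_DIR"
run "$INSTALL_DIR/splunkforwarder/bin/splunk" start --accept-license --answer-yes --no-prompt
run "$INSTALL_DIR/splunkforwarder/bin/splunk" set deploy-poll "$DEPLOY_SERVER" -auth admin:changeme
run "$INSTALL_DIR/splunkforwarder/bin/splunk" enable boot-start
```

Running it once in dry-run mode is a cheap way to sanity-check the sequence before pointing it at a real instance.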

I did not do this, but it is also a good idea to set up a couple of AWS or Azure instances and practice building a Splunk environment from the ground up with search heads, indexers, and forwarders. Note: use bare instances, meaning no AMIs or pre-built images.

Exam time 

I found it extremely helpful and time-saving to manage access through SSH keys so I didn't have to type a password every time I logged onto my instances.
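As a sketch, key-based access can be wired up once in ~/.ssh/config so each instance gets a short alias; the hostnames, IPs, user, and key path below are hypothetical placeholders.

```ini
# ~/.ssh/config -- aliases for exam instances (all values are placeholders)
Host idx1
    HostName 203.0.113.10
    User ec2-user
    IdentityFile ~/.ssh/exam_key.pem

Host sh1
    HostName 203.0.113.20
    User ec2-user
    IdentityFile ~/.ssh/exam_key.pem
```

After that, a plain `ssh idx1` connects without any password prompt.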

The exam does NOT test clustering, so there is no need to set up search head clusters or indexer clusters. You will, however, need to know how to set up distributed search across peers (i.e., more than one indexer).
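As a sketch, search peers can be declared in distsearch.conf on the search head; the hostnames below are placeholders, and peers can also be added through the GUI instead.

```ini
# distsearch.conf on the search head -- placeholder hostnames
[distributedSearch]
servers = https://idx1.example.com:8089,https://idx2.example.com:8089
```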

There's no shame in doing things through the GUI, unless the exam specifically calls for otherwise. =)
I am accustomed to making adjustments directly in .conf files and packaging those configs into apps. However, done is better than perfect; I most likely could have saved a couple of hours had I followed this advice.

It is worth reserving an hour before the end time to go over everything and make sure you have all bases covered. Some parts are tricky, and you can easily overlook something small, like I did.

Summary

Overall, for those with hands-on Splunk experience, I recommend going for the Architect certification. The exam truly tests your ability to set up a small Splunk environment and lets you demonstrate your knowledge, from installation to search to creating dashboards. If I had to redo the exam, I would have used the GUI more.


Friday, April 7, 2017

Putting UniFi controller into the cloud

I have finally decided to experiment with and learn about the cloud. Putting my UniFi Controller into the cloud was a great project to get hands-on experience with moving servers and infrastructure off-prem into the cloud.

There are a variety of cloud offerings out there, including ones from Amazon, Microsoft, Google, and DigitalOcean. I chose Amazon's AWS because there is good documentation on installing the UniFi Controller using their free tier.

Amazon's free tier lets users learn and experiment with AWS offerings for approximately one year on an EC2 t2.micro instance, which is sufficient to run the controller.

Ubiquiti has a well-written article on their website that is straightforward to follow.

https://help.ubnt.com/hc/en-us/articles/209376117-UniFi-Install-a-UniFi-Cloud-Controller-on-Amazon-Web-Services


But wait there is more.

For those who own their own domains, you can configure DNS to point to your new AWS instance: for the subdomain you choose, add an A record pointing to the public-facing IP of your EC2 instance.


But wait there is even more.

When browsing to the new EC2 instance, I was initially presented with a certificate warning, since the self-signed certificate was neither trusted by nor chained to a root certificate in my browser's store. For AWS-hosted websites and instances, Amazon offers AWS Certificate Manager, which issues TLS certificates for free for use with AWS services.

*Note: because load balancers need to distribute traffic across multiple IPs, it is not possible to associate an Elastic IP with an Elastic Load Balancer.

Since I was annoyed at my browser always warning me about my "insecure" connection, I decided to use AWS Certificate Manager. This required placing my EC2 instance behind an Elastic Load Balancer, which then presents my subdomain's certificate to users who navigate to the subdomain. To make this work, I needed to modify my DNS records to alias my subdomain to the Elastic Load Balancer's DNS name.

I also needed to create listeners, and a target group to forward the requests.

After this project, I now have a UniFi controller that I can access anywhere on the internet 24/7 with a TLS encrypted session on a valid certificate signed by Amazon.

Monday, January 16, 2017

What do Dash Cams have to do with Information Security?

I always tell my family and friends to purchase a dash cam for their vehicles. Because you never know. But what does this have to do with Information Security? 

Quite a lot. 

Having a dash cam can help reconstruct what happened right before and after an accident. In accidents and investigations there are often conflicting reports between witnesses. People's memories are often hazy, or become distorted as they are recalled more often and as more time passes.

In information security, I am an advocate of having strong Network Security Monitoring infrastructure in place so that when an incident occurs, there is data available to reconstruct the events. One of the worst feelings in information security is realizing that the data simply isn't there to answer the critical questions.

So if you drive, go out there and get a dash cam. They run roughly $100-$200 and can be purchased at an electronics retailer or online.