Wednesday, November 21, 2018

importing AWS resources into Terraform

Terraform is a wonderful tool! It helps simplify DevOps work. It turns the thorny bramble of delicate networking, users, databases, and virtual machines into a simple and well-running machine. It allows us to chant "infrastructure as code" to the amusement of well-meaning technologists. Best of all: it lets us have consistent environments. A dev can wreak havoc, learn things, then create a Terraform patch that applies to the entire collection of systems, making everything just a little bit cleaner and better understood.

Terraform, although a reasonably mature and flexible tool, has a few warts. One challenge is that it doesn't play well with manually-created resources. If you create some users in Terraform, and some users in the AWS Console, applying Terraform later can end up deleting or stomping on the manual users. Terraform imagines that it is the alpha and omega, and that all things are as it thinks they are.

Additionally, Terraform isn't very smart about importing manually-created resources: terraform import will pull a resource into state, but it won't write the HCL for you. Traditionally we use a third-party tool, terraforming, for that part. The combination of terraform (to create and update resources) and terraforming (to import manually-created resources) works well.

Example: here's how to import all the SNS Topics ("snst") to a Terraform file:

$ AWS_PROFILE=myprofile terraforming snst --region=myregion | tee temp-sns.tf
resource "aws_sns_topic" "dynamodb" {
  name            = "dynamodb"
  display_name    = ""
  policy          = ...

}

Now, edit the temp-sns.tf file to make things clearer and more regular, then plan and apply with Terraform as usual.
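One wrinkle worth noting: terraforming writes the HCL, but Terraform also needs the resource pulled into its state file, or the next plan will try to create a duplicate topic. A quick sketch, using a made-up topic ARN:

$ AWS_PROFILE=myprofile terraform import aws_sns_topic.dynamodb arn:aws:sns:myregion:123456789012:dynamodb
$ AWS_PROFILE=myprofile terraform plan   # should now report no changes for this topic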

In AWS, users aren't just users; they're defined across several different types of Identity and Access Management (IAM) resources. Here's how to import just the simple user records:

$ AWS_PROFILE=myprofile terraforming iamu --region=myregion | tee temp-iamu.tf
resource "aws_iam_user" "john" {
    name = "john@johntellsall.com"
    path = "/"
}
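As with the SNS topic above, Terraform still wants this user in its state; for aws_iam_user the import ID is simply the user name:

$ AWS_PROFILE=myprofile terraform import aws_iam_user.john john@johntellsall.com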

In practice, users aren't useful except when combined with Roles, Groups, and Policies. It's a whole thing. Fortunately, here's a one-liner which imports all the IAM-related resources into a single Terraform file:

$ terraforming help | egrep -o 'iam\w+' | AWS_PROFILE=myprofile xargs -I{} -t terraforming {} --region=myregion >> temp-users.tf

Now, you'll be left with a 1,000-line Terraform file for further editing. This isn't much fun; however, once you're done, you can move this file into its own module and apply the same users/groups/permissions to all your environments!
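Once the file is cleaned up, wiring it into each environment takes only a couple of lines. A minimal sketch, where the module name and path are whatever you choose:

module "iam_users" {
  source = "../modules/iam-users"
}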

Terraform is a wonderful tool, and in combination with terraforming and a bit of work, it will make your DevOps work a lot simpler!


Thursday, November 15, 2018

Docker leads to so much win (psql ftw)

Containers are an incredibly effective way to be more productive. I'd put them on the same order of convenience as version control: each takes a bit of effort to learn, but gives you tremendous flexibility, safety, and enjoyment in programming.

Just now I wanted to verify my backups. I didn't want to run Postgres directly on my Mac, since I'm going to nuke the database after a few tests. So instead I started one in its own container, on a weird port so I couldn't accidentally hit the wrong database or proxy:

$ docker run --name temp_postgres  -p 5555:5432 -d postgres:9.6

Next I verify my new database is up and answering commands:

$ PGPASSWORD='' psql postgresql://postgres@localhost:5555/postgres -c 'select now()'

              now
-------------------------------
 2018-11-16 00:32:31.104194+00
(1 row)

The commands worked on the first try! Now I can go ahead and do my real work of verifying backups, then my task will be finished and I'll move on to the next one. Win!
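For the record, restoring into the throwaway database is just as painless, and cleanup is one command. A sketch, where mybackup.dump stands in for a real pg_dump custom-format file:

$ PGPASSWORD='' pg_restore -h localhost -p 5555 -U postgres -d postgres mybackup.dump
$ docker rm -f temp_postgres   # nuke the container when done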

Sunday, November 11, 2018

fast searches with custom search engines

I do a lot of learning, which today means tons of searching on different websites. I've found a trick which makes my job a lot faster -- custom search aliases. In the URL bar, I can type "k explain" to automatically go to the Kubernetes.io site, search for "explain", and give me the results. When I need another cat image for one of my presentations, I type "gis cat" into the URL bar to ask Google Image Search for some inspirational furriness.  Man.cx has all the Linux manpages. Python.org has all the Python modules carefully documented. I have aliases for all of the above and use them constantly.

Here's how to make your daily searches much, much easier:

Easy, very fast keyword searches


1. Go to the site and do a search. Example: https://kubernetes.io, search for "explain"
2. The URL now contains your search term. Replace the term with "%s". For the above example you'll get https://kubernetes.io/docs/search/?q=%s
3. Copy the URL
4. Right-click the URL bar and select "Edit Search Engines"
5. Under "Other search engines", click the Add button
6. Type a name for the search engine ("kubernetes"), a short alias ("k"), and for the URL, paste the URL with "%s" in it
7. Click Add

Testing


In the URL bar, type the alias followed by a search term, e.g. "k beer". The resulting page will be a search-results page for your new term.

Compatibility


The above instructions are for Google Chrome, but all browsers support something like this.


Friday, July 20, 2018

TIP: Bash has “global search and replace”!

TIP: Bash has “global search and replace”! It works with the history mechanism. For example, the bang-bang (!!) command repeats the previous command:

$ !! # repeat previous command

Adding a colon (:) and then a letter or two will modify the command before running it.  A useful modifier is "p" for printing.  That is:

$ !!:p # print the previous command without running it

This is useful because you can then press up-arrow to recall that command and edit it interactively.
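A tiny illustration (a made-up session):

$ echo helo wrld
helo wrld
$ !!:p
echo helo wrld

Nothing ran the second time; bash just printed the command and added it to history, so up-arrow now recalls "echo helo wrld" for editing.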

For non-interactive editing, you can do global search and replace! Use the "repeat previous command" command, bang-bang (!!), then modify it (:) with a global substitution (gs). For example, to find "one" and replace it with "two", use this command:

$ !!:gs/one/two

In my real-world case, I'd already run a command to deploy my Development server with Terraform. The specific command is:

AWS_PROFILE=development terraform plan -var-file=../config/development.tfvars

This command was in my shell history.  I want to recall this command (bangbang, aka !!), then search and replace "development" with "staging" to use Terraform to deploy to my staging environment.

Command:

$ !!:gs/development/staging

Got me:

AWS_PROFILE=staging terraform plan -var-file=../config/staging.tfvars

Resources:


Bash scripting cheatsheet

running My Traceroute (aka Matt's traceroute, MTR) on macOS


Mtr is a wonderful program that combines ping and traceroute. It shows you each hop along a path to another host on the internet, and how long each hop takes.  It's my #1 go-to tool to debug wifi / networking / DNS issues. And, it's pretty!

Anyway, it requires extra privileges, so it's a bit fiddly to run. Even worse, The Internet Is Wrong on this topic: there's lots of bad advice out there.

Here's how to install and run mtr on a macOS machine:

brew install mtr

PATH=$PATH:/usr/local/Cellar/mtr/0.92/sbin sudo mtr 8.8.8.8

The "8.8.8.8" is a magic IP. Easy to remember, it's a public DNS router that our friends at Google make available to the public. You can use any IP or domain name here. I use the all-8s IP, because sometimes my DNS isn't working, so pinging a raw IP will tell me if my DNS is acting up, and if so, which one.

Here's what it looks like:

[screenshot: mtr's default hop-by-hop display]

If you press d, the display switches to a more visual mode. This lets the "bad actors" in the network jump out:

[screenshot: mtr's visual display mode]

Since this is a terminal-based CLI program, it's easy to install and run on a server. Maybe your local network is good, but the server's network or DNS is acting up -- mtr will make such issues really easy to see and fix!
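On a headless server there's no interactive display to watch, so mtr's report mode comes in handy, for example (standard mtr flags):

sudo mtr --report --report-cycles 10 8.8.8.8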

Thursday, July 19, 2018

tech book recommendations

Recently I was asked about Python books covering Object Oriented programming for someone coming from another language. Here are some resources:

- I recommend subscribing to Safari Books Online. They have jillions of books and videos, including "Python Beyond The Basics - Object Oriented Programming" (high-ranked video). It's $40/month. If you're a professional Dev or DevOps, or trying to be, it's an easy investment.

- the class section of "Modern Python Cookbook" (book) has tons of real-world Python idioms: designing classes with lots of processing vs. very little, classes with __slots__, and advanced class design... Actually, this book looks great; I'm going to read it.

- David Beazley's "Python Cookbook" is great, overflowing with real-world problems, solutions, and discussion.

- in the Los Angeles region all the Lynda.com tech resources are freeee with an LA public library card.

- the "Cookbook" books I find are good for someone coming from another language, the books don't spend 100 pages talking about dictionaries and strings and so forth. It turns out there's tons of Cookbooks nowadays: data visualization, machine learning, testing, you name it!

Friday, July 6, 2018

Kafka on macOS

Generally I run everything in Docker: it's less fiddly, and I can do a clean uninstall very easily. Alas Docker networking is different, and changes every few months as Docker makes things easier... by changing the networking.

As of July 2018 here's the easiest way I've found to run Kafka on macOS:

brew install kafka kafkacat zookeeper
brew services start zookeeper
brew services start kafka

Once Kafka is up, list out the brokers:

$ kafkacat -L -b localhost
Metadata for all topics (from broker -1: localhost:9092/bootstrap):
 1 brokers:
  broker 0 at 192.168.0.133:9092
 0 topics:

Now let's go to the mysterious directory that has tons of good tools. Create a topic, then use "kafkacat" again to verify our new topic has been created:

$ cd /usr/local/Cellar/kafka/*/libexec/bin
$ ./kafka-topics.sh --create --topic example-topic --zookeeper localhost:2181 --partitions 1 --replication-factor 1

$ kafkacat -L -b localhost
Metadata for all topics (from broker -1: localhost:9092/bootstrap):
 1 brokers:
  broker 0 at 192.168.0.133:9092
 1 topics:
  topic "example-topic" with 1 partitions:
    partition 0, leader 0, replicas: 0, isrs: 0
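As a final smoke test, kafkacat can produce and consume too -- a quick sketch using its standard producer (-P) and consumer (-C) modes:

$ echo 'hello kafka' | kafkacat -P -b localhost -t example-topic
$ kafkacat -C -b localhost -t example-topic -o beginning -e   # should print back: hello kafka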

Woot! My thanks to springheeledjak and pkafel!