Comparing Remote Files Without Breaking a Sweat

Today I needed to compare a relatively large list of remote files to a local copy. Yep, you guessed it... it's auditing time again!

Here's what my setup looks like.

From various other commands (lots of pipes), I pared my kludgy pile of server metadata down to a list of servers I needed to check. On that note, despite the truly terrible methods we use to track this kind of information, I genuinely enjoy the challenge of writing a bash or perl one-liner to parse badly formatted, space-delimited data of unknown width whose case is most likely wrong: squeezing repeated spaces, fixing the case, grabbing the columns I need, and redirecting to a file for later use. My thanks to the folks at GNU for cat, tr, cut, grep, and, once again, tr.
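For illustration, a pipeline in that spirit might look like this (the sample data and the `servers.txt` filename are my inventions, not the actual metadata):

```shell
# Hypothetical cleanup of messy, space-delimited server metadata:
# squeeze runs of spaces, lowercase everything, keep the hostname column.
printf 'WEB01   prod   10.0.0.5\nDb02  staging    10.0.0.9\n' \
  | tr -s ' ' \
  | tr '[:upper:]' '[:lower:]' \
  | cut -d ' ' -f 1 > servers.txt
```

After this, `servers.txt` holds one lowercase hostname per line.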

Anyways, back to the topic at hand. We now have a list of server hostnames, one per line. As they say, "Hey guys, watch this!"
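A sketch of what that one-liner might look like, expanded for readability (the `servers.txt` filename, the ssh invocation, and the `/etc/sudoers` path are assumptions on my part):

```shell
# For each server, print its name as a marker, then diff the remote
# sudoers file against the local copy; collect everything in diff.txt.
for s in $(cat servers.txt); do
    echo "$s"
    diff <(ssh "$s" cat /etc/sudoers) sudoers
done > diff.txt
```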

So what have we here?

First, we start up a bash for loop. This sets $s to each server's name as we loop through them.

Now, inside the loop, we first echo the server's name ($s) so we have a marker telling us which diff we're looking at. After that, the fun happens.

Here, we run the diff command to compare the remote file with the local file (sudoers), and we redirect the output to diff.txt. What's neat about this (I think it's neat, at least) is the <( ... ) bit. This is called process substitution. It allows us to take the output of a command and use it as if it were the contents of a file.
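A minimal, standalone illustration of the idea (the sample strings are invented): each <( ... ) runs in a subshell, and bash hands diff a filename whose contents are that command's output, so no temp files are needed.

```shell
# diff two command outputs directly via process substitution.
diff <(printf 'a\nb\nc\n') <(printf 'a\nB\nc\n')
# → 2c2
#   < b
#   ---
#   > B
```

Note that process substitution is a bash/zsh feature, not POSIX sh, so this won't work under /bin/sh on every system.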