12Nov/14

Report On Expiring Certs From A Powered Down Certificate Authority

Let’s hypothetically say I have an old Windows Server 2003 Intermediate Certificate Authority. Let’s also hypothetically say that I already replaced my antiquated Windows Server 2003 PKI infrastructure with a Windows Server 2012 PKI infrastructure, and I’m only keeping the 2003 stuff around so it can publish a CRL and run a monthly script that tells me which certs are going to expire within 60 days. It’s good to know which certs will expire within 60 days so you can remember to renew them or confirm that they don’t need renewal.

Perhaps I decide to shut down the 2003 CA so it quits taking up resources and power but I keep it around in case I need to power it back on and revoke a certificate. How do I keep getting those monthly reports about which certs will expire soon? Note: I’m not addressing the CRL publishing concerns or certificate revocation procedure in this post. We’re only talking about the expiring soon notification issue.

Service Management Automation to the rescue! We’re going to set up an SMA runbook that is going to send us monthly emails about which certs from this 2003 CA are going to expire within 60 days. How the heck are we going to do that when the server is powered off? Well, we’re going to cheat.

If we’re going to power this CA down, we’re going to need to get the information on its issued certificates from somewhere else. A CSV file would be a nice, convenient way to do this. As it turns out, generating such a CSV file is really easy.

Before you power off the old CA, log into it and open the Certification Authority MMC. Expand the Certification Authority (server name) tree, and the tree for the name of the CA. You should see Revoked Certificates, Issued Certificates, Pending Requests, Failed Requests and maybe Certificate Templates if you’ve got an Enterprise PKI solution. Right click on Issued Certificates and click Export List. Switch the Save as type to CSV and put the file somewhere that you can see it from within Service Management Automation, like a network share. Note: Once I got my CSV out of the CA MMC, I manually removed the white space from the column headings to make the file nicer to read.

Export your list of Issued Certificates to a CSV file.
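Side note: you can get a similar CSV without the GUI. Here’s a hedged one-liner sketch using certutil (Disposition=20 restricts the view to issued certs; the column list and output file name are just examples, so adjust to taste):

    certutil -view -restrict "Disposition=20" -out "RequestID,RequesterName,CommonName,NotAfter" csv > IssuedCerts.csv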

Now the fun part. Time to put together an SMA runbook that will go through this CSV and email me a list of all the certs that are going to expire within 60 days of the current date. Sounds scary, right? Well, it turns out that it isn’t so bad.

Let’s start simply by initializing our PowerShell Workflow. You get to this point in SMA by creating a new runbook. I’m also going to set up a whopping one variable.
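Something like this sketch (the share path is made up; the rest matches the write-up below):

    workflow GetExpiringCerts
    {
        # Path to the network share where files loaded into SMA live
        # (this path is made up - use your own)
        $strInPath = "\\fileserver\SMA\In"

        # ...the rest of the runbook gets built up below...
    }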

Our workflow/runbook is called GetExpiringCerts. The variable $strInPath is the location to where I have all the files I ever load into SMA. That is, it’s just a path to a network share that’s the same in all my runbooks that use it.

So far so good? Good. Next we need to look through the CSV that’s somewhere beneath $strInPath for all our certs. Here’s the big block of code, which we’ll break down piece by piece:
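This is a reconstructed sketch; the CSV file name is made up, and the column names assume the whitespace-stripped headings from earlier:

    $strCerts = InlineScript {
        # Import the list of issued certs we exported from the 2003 CA
        $csvCerts = Import-Csv -Path "$using:strInPath\IssuedCerts.csv"

        # Keep only certs that haven't expired yet but will within 60 days
        # (the [datetime] casts assume the exported dates parse on this locale)
        $arrCertsExpireAfterToday = $csvCerts |
            Where-Object { [datetime]$_.CertificateExpirationDate -gt (Get-Date) } |
            Where-Object { [datetime]$_.CertificateExpirationDate -lt (Get-Date).AddDays(60) }

        # Start the HTML email body with a line describing what's coming
        $strResults = "<p>These certificates expire within 60 days:</p>"

        # Append one line of HTML per expiring cert
        $arrCertsExpireAfterToday | ForEach-Object {
            $strResults += "<br/>$($_.IssuedCommonName) expires $($_.CertificateExpirationDate) (requested by $($_.RequesterName))"
        }

        # Printed last so it becomes the InlineScript's return value
        $strResults
    }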

There’s some cheating right there. PowerShell Workflows are different from regular vanilla PowerShell in ways that I don’t always like. To make a PowerShell Workflow (which is what SMA runbooks use) execute some code like regular PowerShell, you need to wrap it in an inlinescript block. We’re going to take the output of the inlinescript and assign it to $strCerts.

Let’s break down what’s inside that inlinescript block. First we’re going to import the CSV full of our Issued Certificates from the CA. To use a variable defined outside the inlinescript, you prefix it with “using:”, hence $using:strInPath. $strInPath is defined outside the inlinescript block but I want to use it inside the inlinescript block.

Now to build an array of all the certs we care about. The variable $arrCertsExpireAfterToday is going to hold a selection of the CSV we loaded into $csvCerts. We take $csvCerts and pipe it into a few filters. The first: where the Certificate Expiration Date is greater than today’s date, so we don’t look at any certs that are already expired. The second: where the Certificate Expiration Date is less than 60 days from now, so we don’t see certs that expire in two years that we don’t care about yet. That’s it! That’s the array of certs. Now all we need to do is make the output look nice and send it.

Next, we start building the body of the email we’re going to send, assigning the future body of our email to $strResults. I want to send an HTML email because it’s prettier. My email will start with a line that tells us what’s coming next. Then we need to get the information out of $arrCertsExpireAfterToday and format it nicely so it can be sent. We’re going to pipe the contents of $arrCertsExpireAfterToday through a foreach-object loop that will make some nice HTML output containing the Issued Common Name, the Certificate Expiration Date and the Requester Name. You can format your report differently, use different headings, etc., but this is what worked for me.

I print $strResults as the last thing I do in the inlinescript block, so that $strResults becomes the value returned by the inlinescript block and therefore the value of $strCerts.

We’re almost out of the woods. All we have to do now is send the email.
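A sketch of that last step (the addresses are made up; SMTPServer is an SMA asset):

    # Build the pieces of the email (these addresses are stand-ins)
    $strSubject = "Certificates expiring within 60 days"
    $arrTo = @("pkiadmin@contoso.com", "helpdesk@contoso.com")
    $strSMTPServer = Get-AutomationVariable -Name 'SMTPServer'

    # -BodyAsHtml makes the HTML render instead of arriving as plaintext
    Send-MailMessage -To $arrTo -From "sma@contoso.com" -Subject $strSubject `
        -Body $strCerts -BodyAsHtml -SmtpServer $strSMTPServer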

Easy. We need a subject, a list of people to send the email to, and an SMTP server. My email To list is an array and I store my SMTP server in an SMA asset. Then I use the send-mailmessage cmdlet to shoot this email off. Make sure to use the -bodyashtml flag so the HTML is parsed correctly instead of being included as plaintext.

That’s it! Set an SMA schedule to run monthly and you’ll get yourself monthly email notifications of certificates that are due to expire within 60 days even though the CA that issued them is powered off!

05Nov/14

Which Exchange Mailbox Database Was A Certain User’s Mailbox In On A Specific Day?

In Exchange, user mailboxes are stored in databases. You regularly back up these databases, don’t you? Good.

Now imagine the following. User A has a mailbox in Database01. This database is backed up daily. Now imagine User A’s mailbox was moved to another database, Database02. What if User A came to you and needed something recovered? Okay, no problem: load up the backup for Database02 and you can recover anything from the time the user has been on Database02. Wait, what do you mean you want something from BEFORE you were on your current database, User A? How am I supposed to know which database backup I need to mount to find your stuff? Exchange only knows what database you’re on, not what database you came from! Your data is on a backup for who knows which database!

My solution for this is a bit bulky. It involves automating a script to export a list of all users and the database they’re on. The idea is, if you export this list daily, you will have an archive of what database all your users are on for any given day. Even if they move, you can reference the output from this script and see which database their mailbox was on during the day in question. Then you will know which database’s backup you need to load to help User A get his stuff back.

For automation, I am using Service Management Automation (SMA). I love SMA. It uses PowerShell Workflows which are kinda, sorta, almost like regular ol’ PowerShell with some differences. I’ll point out the parts of my solution that aren’t vanilla PowerShell.

First things first, I need to declare my workflow and stick a Try Catch block in it:
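A sketch of the skeleton (the runbook name and email addresses are my stand-ins):

    workflow Export-MailboxDatabases
    {
        # Grab the SMTP server name from an SMA asset
        $SMTPServer = Get-AutomationVariable -Name 'SMTPServer'

        try
        {
            # ...the useful work goes here...
        }
        catch
        {
            # If anything in the Try blows up, email me the error
            Send-MailMessage -To "admin@contoso.com" -From "sma@contoso.com" `
                -Subject "Mailbox database export failed" -Body "$_" `
                -SmtpServer $SMTPServer
        }
    }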

We’re already running into something funny. What is Get-AutomationVariable -Name ‘SMTPServer’? In SMA, you can store variables for any of your runbooks to use. This cmdlet is retrieving the previously stored value for my SMTP server. This is nice because if I change SMTP servers, I can update the single SMA asset instead of updating all my scripts individually.

Great! Now let’s actually write the part of the script that does something useful. Let’s start by initializing some variables:
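Sketched out, the Try block now looks something like this (the Exchange server URI is made up):

    try
    {
        $mailboxes = InlineScript {
            # Where the Exchange Management Shell lives
            $connectionURI = "http://exchange01.contoso.com/PowerShell/"

            # New remote session to Exchange; no explicit credentials,
            # so the SMA service account's own rights get used
            $getMBXconn = New-PSSession -ConfigurationName Microsoft.Exchange `
                -ConnectionUri $connectionURI

            # ...data retrieval comes next...
        }
    }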

A couple new lines in the Try block! $connectionURI is pretty straightforward. You’re going to make a remote session to the Exchange Management Shell on your Exchange Server, so your script needs to know where that is. $getMBXconn is a new PSSession to the URI you specified. Notice that I’m not passing any credentials specifically to this one. The service account that’s running this SMA runbook has the rights it needs to do this. You can either do the same, or you can pass specific credentials from an SMA asset or from wherever else you store them.

What is it all wrapped in, though? Some inlinescript block? Like I mentioned above, PowerShell Workflows are different from PowerShell. Some of the commands coming up don’t work in PowerShell Workflows but do work in PowerShell. By wrapping the PowerShell script in an inlinescript block, it executes like real PowerShell instead of a PowerShell Workflow. The outcome of this inlinescript is going to get assigned to $mailboxes.

We have our connection, now we need to actually go get some data:
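Here’s the whole Try block sketched together, reconstructed from the breakdown below (the URI and output path are made up; the outer Catch from before still follows all of this):

    try
    {
        $mailboxes = InlineScript {
            $connectionURI = "http://exchange01.contoso.com/PowerShell/"
            $getMBXconn = New-PSSession -ConfigurationName Microsoft.Exchange `
                -ConnectionUri $connectionURI

            try
            {
                # Get every mailbox, don't stop on errors, and keep
                # only the fields we care about
                $mbxData = Invoke-Command -Session $getMBXconn {
                    Get-Mailbox -ResultSize Unlimited -ErrorAction SilentlyContinue
                } | Select-Object Database, Name, ServerName, Guid

                # Write the data out so it becomes the InlineScript's output
                Write-Output $mbxData
            }
            catch
            {
                # Nested Catch: tells us if retrieving the mailbox data failed
                Write-Error "Failed to retrieve mailbox data: $_"
            }

            # Tear down the remote session
            Remove-PSSession $getMBXconn
        }

        # Name the CSV after the current date and time so we know
        # which day's snapshot it holds (the output path is made up)
        $strDate = Get-Date -Format "yyyy-MM-dd_HH-mm"
        $mailboxes | Export-Csv -Path "\\fileserver\SMA\Out\Mailboxes_$strDate.csv" `
            -NoTypeInformation
    }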

Alright, that looks like a lot of stuff at once, but it isn’t actually too crazy. Let’s break down what we added.

The nested Try Catch is just another Catch block to tell us if we messed up while grabbing the mailbox data.

Inside that nested Try, we’re invoking a command in our remote session. We’re running a get-mailbox, specifying that we want all of them and that we don’t want to stop on an error. We pipe that into a select-object command to keep only the data we want: the database, the name of the mailbox, the server the database is on and the GUID. Then we write that data out (which in turn is assigned to $mailboxes, since $mailboxes takes the value of whatever comes out of the inlinescript).

After that, we get rid of our PSSession.

Finally, we write our findings to a CSV file, naming it in a way that includes the current date and time. That’s how we know which script output is for which day.

That’s our final solution! All you need to do now is create the SMA schedule to run this as frequently as it makes sense for you. You might also want to include some logic to clean up old files. Then you can just go to the output folder, find the file that corresponds with the day you care about and find the database that User A was on the day they deleted the wrong file.

02Nov/14

First Post

Everybody knows that the first post on a blog isn’t supposed to have any real content or be super helpful. Let’s just get it out of the way, then.

You may be interested to know about a couple articles I wrote for SysJAM that would fit in well here:

  1. Using PowerShell to find out who has access to a directory
  2. Troubleshooting an issue with calling a SCORCH runbook from SMA

I guess I could also plug the About/Contact page in case you somehow missed the big link at the top of every page.