28Jan/15

Quick Tip: Opening An Exchange Online Protection Shell

There are lots of big, exciting, non-bloggable things happening at work this week, so here’s a very quick tip.

Last week I wrote a post on a PowerShell function I threw in my profile to connect quickly to Exchange. That’s great, but what if you also want to manage Exchange Online Protection (EOP) from a PowerShell console? Well, it turns out to be pretty easy.
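Here’s a sketch of what that can look like. The function name is made up, and I’m assuming the standard EOP connection URI, so check it against your tenant before relying on it.

    function gimme-eop
    {
        # Prompt for your Office 365 (Live ID) admin credentials rather than AD credentials
        $creds   = Get-Credential
        $session = New-PSSession -ConfigurationName Microsoft.Exchange `
            -ConnectionUri 'https://ps.protection.outlook.com/powershell-liveid/' `
            -Credential $creds -Authentication Basic -AllowRedirection
        Import-PSSession $session -AllowClobber | Out-Null
    }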

This looks a lot like the function I showed you last week except it’s connecting to Office 365 and you need to use your Live ID instead of AD credentials.

21Jan/15

Opening A Remote Exchange Management Shell

Here’s a function I stuck in my PowerShell profile. I found myself making lots of remote connections to my Exchange 2013 environment so I put together a quick function to create the connection for me. It’s far from perfect but it saves me time every single time I use it so check it out.
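Here’s a rough sketch of the function. The server names are placeholders and the Kerberos authentication is an assumption; it’s laid out so the line references below still make sense.

    function gimme-exchange
    {
        $servers   = @('exch01.contoso.com', 'exch02.contoso.com')   # placeholder server list
        $connected = $false
        $creds     = Get-Credential
        foreach ($server in $servers)
        {
            try
            {
                if (-not $connected)
                {
                    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "http://$server/PowerShell/" -Authentication Kerberos -Credential $creds
                    Import-PSSession $session -AllowClobber | Out-Null
                    $connected = $true
                }
            }
            catch
            {
                Write-Warning "Could not connect to $server : $_"
            }
        }
    }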

On Line 1, we’re declaring the function – no big deal. I’m naming mine “gimme-exchange” so once my profile loads, I can just type that to start the function.

On Lines 3 to 5 I’m setting a few variables. Line 3 is weird. I made an array of the different Exchange servers I had rather than going through some autodiscovery process. The script will try to open a connection to the first one; if that fails, it tries the second one, and so on. It’s inefficient, but I don’t add and remove a lot of Exchange servers, so I can get away with it since this function is just for me. Line 4 is going to be used to detect whether we made a connection or not. Line 5 prompts for and stores the administrative credentials that will be used to create the connection.

On Line 6, we start looping through all the servers I specified in Line 3. In a Try/Catch block, if we haven’t already made a successful connection to a previous Exchange server, we’re going to make a new connection and import it. You have to make sure you use the -ConfigurationName parameter because we’re not just creating any old PSSession; Exchange is funny and we connect to it using some special parameters, shown in Line 12. When we import the session on Line 13, we allow clobbering of other existing cmdlets and suppress the output of the import command.

If we run into an error anywhere in the Try block, the Catch block is set up to echo the error message and continue looping through servers, depending on how severe the error is.

That’s it! Yay, saving time.


Update: If you’re already logged in as the user you want to connect to Exchange as, you can skip the credential gathering part and run this instead.
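Something like this (again, the server name is a placeholder); with no -Credential, the session uses the account you’re already logged on with:

    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'http://exch01.contoso.com/PowerShell/' -Authentication Kerberos
    Import-PSSession $session -AllowClobber | Out-Null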

 

14Jan/15

Renewing Exchange 2013 Certificates: SHA-256 Style

I recently ran into an issue that I think is actually pretty funny. It was time to renew the publicly trusted certificate that we install on our Exchange 2013 servers, which gets tied to SMTP, OWA and some other IIS services like autodiscover. Since SHA-1 is on the road to deprecation, our cert vendor pushed pretty hard to get something with a SHA-2 hashing algorithm (in practice SHA-256, the most common member of the SHA-2 family). Sounds reasonable, right?

Well, here’s the problem. Even though Microsoft is one of many vendors pushing the deprecation of SHA-1, Exchange 2013 doesn’t seem to have a mechanism built into it that generates a SHA-2 cert request. Even the New-ExchangeCertificate PowerShell command doesn’t have a way to change which hashing algorithm is used. Windows Server 2008 R2 and later support SHA-2, but at the time of writing, Exchange 2013 doesn’t have a way to generate such a request.

There are other ways to generate cert requests, though. Since Windows Server 2012 R2 supports generating SHA-2 certificate requests, perhaps – I thought – the Certificates MMC could be used to create one. I was right; it can. “Great!” I can hear you exclaim, “Now give me the steps on how to do it!” Not so fast. I’m not going to put the full steps for generating a SHA-2 certificate request using the Windows Certificates MMC here, because of a problem.

To generate a SHA-2 request using the Certificates MMC, you add the Certificates (Local Computer) snap-in to MMC.exe, right-click the Personal certificate store and generate a new request. When asked to choose a request template, you are offered two choices: Legacy or CNG. Legacy doesn’t support changing your hashing algorithm and therefore only generates SHA-1 requests. CNG it is, then! I continued on, generated my SHA-2 cert request, got it approved, took the certificate from my provider and went to test it. Almost everything worked, except I couldn’t log into OWA or ECP. Why not? Because Exchange 2013 stores lots of info about you in encrypted cookies when you log into these services, and it can’t use a CNG certificate to decrypt that data. Whenever I logged in, I was immediately redirected back to the login page as if nothing had happened, because the encrypted cookie with all my info (like “you logged in as username“) couldn’t be decrypted, since Exchange 2013 can’t use the CNG provider in Windows 2012 R2.

How else can you generate a SHA-2 certificate request, then? The Certificates MMC ties the request to a CNG provider that Exchange 2013 can’t use, and Exchange 2013 itself won’t create a SHA-2 request. This isn’t looking good: SHA-1 is being deprecated and I have to renew my certificate!

The answer turns out to be unbelievably easy. I couldn’t believe this worked.

Here’s the answer. Here’s how to renew your Exchange 2013 public certificate with a SHA-2 hashing algorithm…

  1. Use the New-ExchangeCertificate PowerShell command to generate a cert request that is perfect for your needs, minus the fact that it will request a SHA-1 hashing algorithm (see the sketch after this list).
  2. Submit the request to your public certificate provider but indicate that it is a SHA-2 certificate request. Your provider should have an option to indicate what sort of certificate your request is set up for. Make sure you say SHA-2.
  3. Your provider will give you back a SHA-2 certificate that is not associated with a CNG provider and that will work with your Exchange 2013 environment for all the SMTP, IIS, IMAP and POP services you wish to bind it to.
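Here’s a sketch of what step 1 can look like. The server name, friendly name, subject, SANs and file path are all placeholders you’d swap for your own.

    # Generate the request (it will be SHA-1; your provider treats it as SHA-2 when you submit it)
    $req = New-ExchangeCertificate -Server 'EXCH01' -GenerateRequest `
        -FriendlyName 'mail.contoso.com' `
        -SubjectName 'CN=mail.contoso.com, O=Contoso, C=US' `
        -DomainName 'mail.contoso.com', 'autodiscover.contoso.com' `
        -PrivateKeyExportable $true

    # Save the Base64 request to submit to your public CA
    Set-Content -Path 'C:\temp\mail_contoso_com.req' -Value $req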

That’s right, the answer is to generate a SHA-1 request anyway and tell your provider that it’s SHA-2.

Credit to Microsoft Premier Support for figuring this out.

07Jan/15

SMA Runbook Daily Report On SMA Runbook Failures

The sad reality of using Service Management Automation is that it can be a little iffy in the stability department. That being so, I decided to put together an SMA runbook that would report on all the other SMA runbook failures of the last 24 hours. Yes, I realize the irony in using SMA to report on its own runbook failures. One must have faith in one’s infrastructure and this particular runbook.

First things first, I need to declare my runbook/workflow and get the stored variable asset which holds my SMTP server.
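Something along these lines; the runbook and variable asset names are my own inventions.

    workflow Report-SmaRunbookFailures
    {
        # Stored variable asset holding the SMTP server used to send the report
        $smtpServer = Get-AutomationVariable -Name 'SMTPServer'

        # ...everything else lives in an inlinescript block, shown below
    }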

Easy! Now, the SMA PowerShell cmdlets work best in actual PowerShell, not in workflows, so I’m going to cheat and use an inlinescript block to hold pretty much everything else. Before we get to the good stuff, I’m going to knock out the easy task of setting up my try and catch blocks as well as my function that sends email.
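Here’s a sketch of that skeleton; the sender, recipient and function name are made up, and this all goes inside the workflow above.

    inlinescript
    {
        # Helper that mails the report; sender and recipient are placeholders
        function Send-Report ($Body)
        {
            Send-MailMessage -SmtpServer $using:smtpServer -From 'sma@contoso.com' -To 'admins@contoso.com' `
                -Subject 'SMA runbook failures - last 24 hours' -Body $Body -BodyAsHtml
        }

        try
        {
            $body = 'better put something in here'
            Send-Report -Body $body
        }
        catch
        {
            Send-Report -Body "The failure report runbook hit an error of its own: $_"
        }
    }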

Most of this is pretty straight forward. I’m going to put some stuff in a try block and email it out if it works. If I catch an error, I’m going to email a notification that something screwed up in the try block.

Now the meat and potatoes. Something actually worth making a blog entry for! We need to build the content of the email we’re sending in the try block. Right now it’s just text that says “better put something in here” and it’s right. We better.
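Here’s a sketch of that content-building code, which replaces the placeholder text inside the try block. The endpoint URL is a placeholder and the layout only roughly matches the line numbers referenced below.

    $failedJobs = Get-SmaJob -WebServiceEndpoint 'https://your-server' |
        Where-Object { $_.JobException -and $_.EndTime -gt (Get-Date).AddDays(-1) }

    if ($failedJobs)
    {
        # Start the HTML table, with column headers and a little styling
        $body = '<table border="1" style="border-collapse:collapse"><tr><th>Start</th><th>End</th><th>Runbook ID</th><th>Runbook</th><th>Exception</th></tr>'

        foreach ($job in $failedJobs)
        {
            # Add a row per failed job; look up the runbook name from its ID
            $runbookName = (Get-SmaRunbook -WebServiceEndpoint 'https://your-server' -Id $job.RunbookId).RunbookName
            $body += "<tr><td>$($job.StartTime)</td><td>$($job.EndTime)</td><td>$($job.RunbookId)</td><td>$runbookName</td><td>$($job.JobException)</td></tr>"
        }

        # Close off the table and send the report
        $body += '</table>'
        Send-Report -Body $body
    }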

Wow that got a little ugly really quickly. What you need to keep in mind is that a lot of this ugliness is styling to make the email report pretty. That’s a little counter-intuitive but, hey, welcome to scripting as a working sysadmin. Let’s break it down line by line.

Line 8 is getting an array of failed jobs in the last day. It’s a big pipeline which:

  • Gets the jobs
  • Filters for jobs where the JobException property is populated
  • Filters for jobs whose end time falls within the last day

In Lines 9 to 20, if there are jobs in the array of failed jobs within the last day, we have to build a report. Line 11 initializes the HTML that will become the body of our report by putting in a table, its column headers, and styling it.

Starting on Line 12, for each failed job in the array of failed jobs, we’re adding a row to our HTML table with the start time, end time, the runbook ID, the runbook name and the exception that was thrown.

On lines 19 to 21 we finish off our HTML for the body of our email. Then we use the code we already wrote to send it to us.

Boom. Pretty nice report on failed jobs. Hopefully you never see one in your inbox, otherwise you’re going to have some troubleshooting to do.

30Dec/14

Quick Tip: Run An SMA Runbook At A Specific Date/Time

Happy New Year’s Eve! Here’s a quick tip just before New Year’s.

I recently answered a question on Technet about scheduling SMA runbooks. It’s no secret that the scheduling engine in Service Management Automation leaves something to be desired. Here’s how I like to use PowerShell to get specific about when an SMA runbook is going to be triggered.

You’ll need the SMA PowerShell tools installed and imported for this to work.
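For example (the endpoint, runbook name and parameters here are placeholders):

    $startTime = (Get-Date).AddHours(2)
    $scheduleName = "MyRunbook-OneTime-$(Get-Date -Format yyyyMMddHHmmss)"
    Set-SmaSchedule -WebServiceEndpoint 'https://your-server' -Name $scheduleName -ScheduleType OneTimeSchedule -StartTime $startTime -ExpiryTime $startTime.AddHours(3)
    Start-SmaRunbook -WebServiceEndpoint 'https://your-server' -Name 'My-Runbook' -ScheduleName $scheduleName -Parameters @{ 'Param1' = 'Value1' }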

Line 1 is easy: it’s just a variable holding a datetime object that represents the time you want to trigger the runbook. Line 2 is a variable for the name of the SMA schedule asset. I like to add something dynamic here to avoid naming collisions.

Now the interesting parts. On Line 3, we’re creating an SMA schedule asset using set-smaschedule. It’s going to be named our Line 2 variable, it’s going to be a onetimeschedule (instead of recurring), start at our start time (Line 1) and expire three hours after the start time. On Line 4, I’m triggering the runbook with start-smarunbook and specifying the schedule we created on Line 3. I’m also passing parameters in a hash table.

You’re done! The only hiccup with this I’ve seen is if one of your parameters for your runbook is a hashtable. Matthew at sysjam.wordpress.com covered this weird situation in a blog post very recently.

17Dec/14

Quick Tip: Get All SMA Runbook Schedules That Will Run Between Now And Then

I wanted to do some maintenance on my SMA runbook servers but couldn’t remember which jobs were going to run in the next 12 hours (if any). Luckily there’s a quick way of getting that information! This work assumes that you have the SMA tools installed and that you ran the below command or have it as part of your profile.
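That command is the module import; assuming the default module name for the SMA tools, it looks like this:

    Import-Module Microsoft.SystemCenter.ServiceManagementAutomation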

Behold!
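Something like this, as a sketch; I’m assuming the schedule objects expose a NextRun property, and the endpoint is a placeholder.

    Get-SmaSchedule -WebServiceEndpoint 'https://your-server' |
        Where-Object { $_.NextRun -gt (Get-Date) -and $_.NextRun -lt (Get-Date).AddHours(12) }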

This isn’t a very crazy command. “your-server” is the server where you have the SMA management items installed, not an individual runbook server.

You’re getting all the SMA schedules from your SMA instance and filtering for items whose next run is after “now” and before “now plus 12 hours”. You can change the get-date related items easily to suit your needs. For instance, what ran last night? What will run tomorrow? What ran on October 31?

 

10Dec/14

SMA Runbooks And UTC Time

I don’t know about you but I hate dealing with systems that use UTC time. I have SMA runbooks that work with Exchange 2013, Exchange Online Protection and other services that annoyingly return results in UTC instead of my local timezone. I wrote an SMA runbook that can be called from other SMA runbooks to do the conversion for me.

It’s a pretty simple runbook! It has one mandatory parameter, $UTCTime, which, as the name would suggest, is the UTC time that you want to convert to your local time.
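Here’s a sketch of the runbook; the name is my own, and it’s laid out so the line references below match up. If your workflow engine complains about method invocation, those three lines can go inside an inlinescript block instead.

    workflow Convert-UTCToLocalTime
    {
        param
        (
            [Parameter(Mandatory = $true)][datetime]$UTCTime
        )
        $tzName = (Get-WmiObject -Class Win32_TimeZone).StandardName    # local time zone from WMI
        $tz = [System.TimeZoneInfo]::FindSystemTimeZoneById($tzName)    # resolve it to a TimeZoneInfo object
        [System.TimeZoneInfo]::ConvertTimeFromUtc($UTCTime, $tz)        # convert UTC to local time
    }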

Line 7 gets the local timezone by performing a WMI query. Line 8 uses [System.TimeZoneInfo]::FindSystemTimeZoneById to convert the value returned from the WMI query into a timezone. Line 9 performs the actual conversion from whatever the UTC time is to the timezone determined in Line 8.

This whole thing assumes that the time and timezone are set correctly on your SMA runbook servers.

03Dec/14

Print Everything In A Folder To A Specific Printer

For one reason or another, I found myself in a situation this week where I needed to print all the contents of a directory on an hourly basis. Not only did I need to print the contents, I needed the jobs to go to a specific printer, too.

SMA runbooks to the rescue! I wrote my solution in PowerShell and stuck it in an inlinescript block in my runbook that I invoked on a print server.

First, I needed to get everything in the directory and print it. I originally looked at using Out-Printer but I have images, PDFs, all kinds of non-plaintext files. I needed another solution and it was this:
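Here’s a sketch of that, with a made-up folder path:

    # Print every file in the folder using each file type's default application
    Get-ChildItem -Path 'C:\PrintDrop' -File | ForEach-Object {
        Start-Process -FilePath $_.FullName -Verb Print -Wait
    }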

For each file in this directory, we’re starting a process on the file that prints it. It will effectively open the file in whatever the default application is, render it and print it to your default printer. Great! Except what if I don’t want to print to the default printer? The Start-Process cmdlet doesn’t seem to lend itself to that very well. As usual, I had to cheat.

Since we’re printing to the default printer, why don’t we just change the default? Well, because maybe the default printer (that we don’t want to print to) is default for a reason. So let’s change the default printer and change it back after.
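A sketch of that, with the printer name and folder path as placeholders, laid out so the line references below hold up:

    $default = Get-WmiObject -Class Win32_Printer | Where-Object { $_.Default }
    (Get-WmiObject -Class Win32_Printer -Filter "Name = 'My Desired Printer'").SetDefaultPrinter() | Out-Null
    Get-ChildItem -Path 'C:\PrintDrop' -File | ForEach-Object { Start-Process -FilePath $_.FullName -Verb Print -Wait }
    $default.SetDefaultPrinter() | Out-Null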

Line 1 gets the current default printer. Line 2 sets the default printer to My Desired Printer, which is presumably the name of a valid printer on the server. Line 4 sets the default back to whatever the original default was, and we already know what Line 3 does. Obviously, this is a solution that works in my specific environment, which can tolerate a brief interruption to which printer is the default.

The rest was easy. I set up a new SMA runbook, invoked the above script on my print server (in an inlinescript block) and scheduled it to run hourly.

26Nov/14

Open File Dialog Box In PowerShell

Here’s a neat little PowerShell function you can throw into your scripts. Lots of times I want to specify a CSV or TXT or some other file in a script. It’s easy to do this:
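For example, something like this (a guess at the original), where you end up typing the path by hand:

    $fileName = Read-Host 'Enter the full path to the file'
    $data = Import-Csv -Path $fileName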

But that means you have to type the whole absolute or relative path to the file. What a pain. I know what you’re thinking… There must be a better way!

There is! Use an open file dialog box. You know, like when you click File, Open and a window opens and you navigate your filesystem and select a file using a GUI. How do you do it in PowerShell? Let me show you. First things first: let’s declare a function with a couple of the items we’re going to need.
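Here’s the whole function up front as a sketch (the CSV filter matches the example we’ll get to later); we’ll walk through it piece by piece.

    function Get-FileName($initialDirectory)
    {
        [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null

        $OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
        $OpenFileDialog.initialDirectory = $initialDirectory
        $OpenFileDialog.filter = "CSV (*.csv)| *.csv"
        $OpenFileDialog.ShowDialog() | Out-Null
        $OpenFileDialog.FileName
    }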

I’m going to name this function Get-FileName because I like the Verb-Noun naming scheme that PowerShell follows. It’s got a parameter, too. $initialDirectory is the directory that our dialog box is going to display when we first launch it. The part of this that most likely looks new is line 3. We need to load a .NET item so we can use the Windows Forms controls. We’re loading via partial name because we want all the Windows Forms controls, not just some. It’s faster and easier to do this than it is to pick and choose. We’re piping the output to Out-Null because we don’t want all the verbose feedback it gives when it works.

Now let’s open the thing and get to business selecting a file.

On line 5, we’re creating a new object. That object is unsurprisingly an OpenFileDialog object. On line 6 we’re specifying that initial directory that we got in the parameter. On line 7 we’re doing something a little interesting. The filter attribute of the OpenFileDialog object controls which files we see as we’re browsing. That’s this part of the box.

[Screenshot: the file type filter drop-down of the OpenFileDialog box]

I’m limiting my files to CSV only. The first part of the value is CSV (*.csv), which is what the dialog box shows in the drop-down menu. The second part, after the pipe character, *.csv, is the actual filter. You could make any kind of filter you want. For instance, if you wanted to only see files that started with “SecretTomFile”, you could use a filter like SecretTomFile*.*.

The next item, on line 8, is to open the dialog box; we do that with the ShowDialog() method. We discard the output from this command because it’s spammy in this context, just like when we loaded the .NET items.

One last thing! We’ve created, defined and opened our OpenFileDialog box but don’t we actually need to get the result of what file was selected? Yes, we do. That’s pretty easy, though.

The Filename attribute is set when someone commits to opening a file in the OpenFileDialog box. On line 9, we’re returning it to whatever called our function.

So to use this function in the same way as the example at the top of this post, your code would look like this.
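Something like this; the initial directory is just an example.

    $fileName = Get-FileName -initialDirectory 'C:\Scripts\Input'
    $data = Import-Csv -Path $fileName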

I think this is a lot nicer than typing a filename every time you want to run a script. I find it particularly convenient on scripts I run a lot.

19Nov/14

Cheating To Fix Access Is Denied Error Using Get-WMIObject

I was doing a little work that involved using PowerShell to get a list of printers from several remote print servers. I figured this would be a great job for WMI, and I was right. The command I used looked like this.
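A sketch of it, with a placeholder server name and output path:

    Get-WmiObject -Class Win32_Printer -ComputerName 'PrintServer1' |
        Select-Object Name, DriverName, PortName, ShareName |
        Export-Csv -Path 'C:\temp\PrintServer1-printers.csv' -NoTypeInformation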

I had a list of print servers that I imported into an array and looped through, but this is the important part of the code. I am simply using WMI to get some information about the logical printer objects on a given print server and exporting it to a CSV.

How boring! This isn’t a very old blog, but we usually talk about more complicated things than that. Well, things got weird on one print server that we’ll simply call PrintServer2. PrintServer2 threw an error instead of working nicely.

Not cool. I have Domain Admin rights… it’s a domain-joined server… what do you mean, access is denied? Running the command locally on the server worked; I just couldn’t do it remotely. There’s plenty of literature on fixing this error already, but I was in a hurry, so I tried the next thing that came to mind: cheat a bit and run the command locally on the server… remotely.

I didn’t do anything groundbreaking; I just used Invoke-Command to run the command on the server instead of running it on my local machine (to retrieve remote information).
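Roughly like this, again with placeholder names:

    # Same query, but run on the print server itself via PowerShell remoting
    Invoke-Command -ComputerName 'PrintServer2' -ScriptBlock {
        Get-WmiObject -Class Win32_Printer | Select-Object Name, DriverName, PortName, ShareName
    } | Export-Csv -Path 'C:\temp\PrintServer2-printers.csv' -NoTypeInformation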

Hah! I beat you, stupid Windows Server 2003 box that has been around since I was in junior high school and needs to be decommissioned! I got your printer information from you without having to fix any of your weird problems!

The moral of the story is that sometimes, you can cheat a little bit to accomplish your goal and avoid doing a whole bunch of terrible patches, regedits, etc. to your infrastructure.