Monday, September 30, 2013

When the Shell has too much power

We had an incident last Friday. Our production print queues went down for about 20 minutes.

Our post-incident report has this as the root cause:

The following script was run with the variable $server set to PROD-PRINT-SERVER;
$printers = Get-WmiObject -class win32_printer -computername $server | where {$_.name -like "*Toshiba*"}
foreach ($printer in $printers)
{
    $printer.delete()
}
This removed all Toshiba print queues on the server in question (the production print server). The $server variable should have been TEST-PRINT-SERVER.


Yup! That'll do it. It was all resolved pretty quickly; we just restored the print server from the latest snapshot.

So when rolling out Powershell to enthusiastic admins, make sure to put the brakes on in the right places.

Our fix for this will be to change the ACLs for production print queues so that no-one has the rights to delete them. That way, when the time comes to delete a queue we'll have the additional step of changing the ACL, but that's better than being able to blow away production print queues with a goofed WMI query.
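Another cheap brake worth considering is making destructive scripts check their target before they touch anything. A rough sketch, using our server names as placeholders;

# Refuse to delete print queues on anything that isn't the test server
if ($server -ne "TEST-PRINT-SERVER")
{
    Write-Warning "Not deleting print queues on $server - it isn't the test server."
    return
}

$printers = Get-WmiObject -class win32_printer -computername $server | where {$_.name -like "*Toshiba*"}
foreach ($printer in $printers)
{
    $printer.delete()
}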

Anyone have any other thoughts on what could have prevented this situation? Leave us a comment!

Thursday, September 26, 2013

Everyday Powershell - Part 1 - Scheduling Scripts

This is the first part in a series about Powershell. You may have heard about how awesome Powershell is but have struggled to find ways to make it useful in your day to day work. That's what this series is going to address. It'll provide scripts and knowledge to address practical everyday problems! 

In order to really see the power of the shell you must experience what it can do for automation. To do that we'll have to set up a Windows scheduled task.

First thing is to make sure Powershell will run your script: set-executionpolicy remotesigned sets your execution policy to allow local scripts to run, while scripts downloaded from the internet still have to be signed.
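You only need to do this once per machine, from an elevated Powershell prompt (RemoteSigned is just the policy we use; pick whatever suits your environment);

# Check what the current policy is
Get-ExecutionPolicy

# Allow locally written scripts to run; downloaded scripts must be signed
Set-ExecutionPolicy RemoteSigned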

So, on to the actual work: you need a scheduled task... you know how to do those, right? Of course you do! You wouldn't be looking at a scripting blog if you didn't. (If you don't know and would like to, add a comment below and we'll make sure to do a post about that.)

On Server 2003 you can hand the scheduled task a single command, with powershell.exe and the script as an argument;
C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -PSConsoleFile "C:\Program Files\Microsoft\Exchange Server\bin\exshell.psc1" -command "& 'C:\scripts\MAHSCRIPT.ps1'" 

Set "Start in" to;
C:\WINDOWS\system32\windowspowershell\v1.0



On Server 2008 you want to select the Action "start a program"
You'll get a box for the Program/Script;
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Then Add Arguments
-command "& 'C:\scripts\MAHSCRIPT.ps1' " 

The main thing to watch out for is the credentials the script will run as. Make sure the account will have access to whatever it needs. I usually test the script "Run As" the same account I'm planning to use in the scheduled task, just to be sure it won't break.
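A quick way to do that test from your own session, assuming a hypothetical service account called DOMAIN\svc-scripts;

# Prompt for the service account's password, then launch the script as that account
$cred = Get-Credential DOMAIN\svc-scripts
Start-Process powershell.exe -Credential $cred -ArgumentList "-command & 'C:\scripts\MAHSCRIPT.ps1'"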

So the next thing you'll need is some kind of script to run every day. If you don't have one, tune in for the next post and I'll provide you with something awesome to run every day!

Tuesday, September 24, 2013

Top 5 Powershell performance tips

Five quick tips to improve performance of Powershell scripts. Don't take my word for them either; each tip links to an article written by people smarter than I am!

Don't use the += operator!
It's not appending to your array... it makes a copy of the original array, adds the new data, then deletes the old array. This won't scale well at all. Dave Wyatt has come up with a great solution.
http://powershell.org/wp/2013/09/16/powershell-performance-the-operator-and-when-to-avoid-it/
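A rough sketch of the difference (the List approach below is one common alternative, not necessarily the exact solution from the article);

# Slow: += rebuilds the whole array on every pass through the loop
$results = @()
foreach ($i in 1..50000) { $results += $i }

# Faster: let Powershell collect the loop output into the array for you
$results = foreach ($i in 1..50000) { $i }

# Also fast: a generic List appends in place instead of copying
$list = New-Object 'System.Collections.Generic.List[int]'
foreach ($i in 1..50000) { $list.Add($i) }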

Foreach-object can be more efficient than Foreach
When you use a foreach loop the entire collection of objects is loaded into memory first; foreach-object processes them one at a time as they come down the pipeline. There's an interesting explanation here from Ketan Thakkar
http://social.technet.microsoft.com/Forums/en-US/e8da8249-ea91-4772-ae85-582a4b37425b/powershell-foreachobject-vs-foreach
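A quick illustration of the difference (C:\Logs is just an example path);

# foreach statement: the full result of get-childitem is built in memory before the loop starts
foreach ($file in (get-childitem C:\Logs -Recurse)) { $file.Length }

# foreach-object: each file is processed as it streams down the pipeline
get-childitem C:\Logs -Recurse | foreach-object { $_.Length }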

Filter on the "Left Side" of the Pipe
If you are using the pipeline, see if you can filter before piping to the where-object commandlet. Most commonly for me that's in get-wmiobject or get-aduser. If you use a -filter, the filtering happens at the source, so you won't have to bring the entire dataset down to be filtered on your client. Martin Pugh has done some interesting analysis on this kind of thing here
http://thesurlyadmin.com/2012/10/08/powershell-and-string-searches/
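Using the print queue example from earlier in the week (the Toshiba filter is just an example);

# Everything comes back across the network, then gets filtered on your machine
Get-WmiObject -class win32_printer -computername $server | where {$_.name -like "*Toshiba*"}

# The remote WMI provider does the filtering, so only the matches come back
Get-WmiObject -class win32_printer -computername $server -filter "Name LIKE '%Toshiba%'"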

Use Multiple Threads where you can
Can the job be run in multiple threads? If so, it'll be worth the effort to get your head around the complexities of multithreading. Powershell has the concept of jobs. Here's a pretty good example;
http://stackoverflow.com/questions/16360019/how-do-i-add-multi-threading
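Background jobs are the easiest place to start. A small sketch, with made-up server names;

# Start a background job for each server
$jobs = foreach ($server in "SERVER01","SERVER02","SERVER03")
{
    Start-Job -ScriptBlock { param($s) Test-Connection -ComputerName $s -Count 2 } -ArgumentList $server
}

# Wait for them all to finish, then collect the output
$jobs | Wait-Job | Receive-Job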

Measuring performance of your Scripts
So how do we prove these tips will actually improve performance? Well, we can use measure-command! It'll time how long a particular command takes to execute, so we can use it to benchmark and optimize our scripts' performance. Documentation is here;
http://technet.microsoft.com/en-us/library/ee176899.aspx
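For example (the command being timed is just a stand-in for whatever you're optimizing);

# Time an expensive command; TotalMilliseconds makes small differences easier to compare
Measure-Command { get-childitem C:\Windows -Recurse -ErrorAction SilentlyContinue } | Select-Object TotalMilliseconds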

Tuesday, September 17, 2013

Copy files modified in the last X days with Powershell

A friend asked on the weekend for some Powershell that would allow him to copy files modified in the last 7 days to a new machine. It's a great opportunity to showcase the Powershell pipeline feature!
get-childitem "c:\Source" | where-object {$_.LastWriteTime -gt (get-date).AddDays(-7)} | Copy-Item -destination "C:\Target"

This one-liner will work for files in a folder (it'll get messy if we recurse though; there's a fix for that coming later).

See the pipe symbol '|'? It's used to pass the output of one command as input to another command.

  1. So we get-childitem (which is the powershell name for DIR) 
  2. then we pass that to where-object with a bit of code that looks at the LastWriteTime property. The little $_ refers to the current object in the pipeline, so we can hook into the properties of all the objects we found with get-childitem and check when each one was last written,
  3. Then each object that passes the test is piped to copy-item

Now if we want to be able to recursively do this we'll need to be a little more tricksy!

$source = "c:\users\benh\documents"
$target = "c:\test"

$files = get-childitem $source -Recurse
foreach ($file in $files)
{
    # Skip folders and only copy files changed in the last 7 days
    if (!$file.PSIsContainer -and $file.LastWriteTime -ge (get-date).AddDays(-7))
    {
        # Rebuild the path under $target, then create it so the parent folders exist
        $targetFile = $target + $file.FullName.SubString($source.Length)
        New-Item -ItemType File -Path $targetFile -Force | Out-Null
        Copy-Item $file.FullName -destination $targetFile
    }
}

This process uses a foreach loop, which is similar to the pipeline in that it iterates through each item, but it allows us to use a complete script block, which gives us a bit more power at the cost of performance. The main differences here are;
  1. We use an IF statement to evaluate how old the file is (and to skip folders)
  2. We figure out what the target path should be by dropping the $source directory from the file's full name
  3. We create a new-item with the path we've established
  4. Then it's just copying the item as normal
Feel free to add any comments, or if you've got a more elegant way of solving this problem, let us know!