Some Windows PowerShell scripts are useful but consume a great deal of memory, take a long time to complete, or both. Here are some tips for improving the performance of such scripts.

Optimize Last

Don't try to optimize PowerShell scripts as you write them. You might end up optimizing code that later disappears on its own or that doesn't have a significant effect on final performance. Scripter time is always harder to come by than CPU cycles.

Use Filtering Parameters

PowerShell can consume a lot of resources because some cmdlets are designed to return immense quantities of data. So, if a cmdlet offers filtering parameters (-Filter, -Include, and -Exclude), use them.

If a cmdlet supports the -Filter parameter, use it first. -Filter uses the underlying APIs for an object, which makes the code extremely fast because the filter is applied before the cmdlet creates the objects. The -Include and -Exclude parameters are applied after the cmdlet has created the objects but before the objects enter the PowerShell pipeline. So, they're slower than -Filter, but they're still faster than filtering after the objects are in the PowerShell pipeline.
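To make the ordering concrete, here's a hedged sketch of the three approaches, from fastest to slowest (the C:\Logs path and .log extension are placeholders; note that -Include on Get-ChildItem takes effect only with -Recurse or a wildcard path):

```powershell
# Fastest: -Filter is applied by the provider before objects are created
Get-ChildItem -Path C:\Logs -Filter *.log

# Slower: -Include filters objects after creation, before the pipeline
Get-ChildItem -Path C:\Logs\* -Include *.log

# Slowest: Where-Object filters objects after they enter the pipeline
Get-ChildItem -Path C:\Logs | Where-Object { $_.Extension -eq '.log' }
```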

Sometimes you should use more than one type of filtering. For example, suppose you're searching for all files on the D drive with the file extension .htm. The -Filter parameter uses the traditional Windows file-system semantics, where *.htm returns all files whose extensions begin with .htm. The Windows APIs implement this filtering, making it extremely fast. There's an important limitation of these traditional APIs, however: they're very old and consequently ignore anything in a file extension beyond the first three characters. So a search for *.htm with -Filter alone would also return .html files, for instance. Therefore, for speedy filtering, you should use both -Filter (to cut out the vast bulk of files before loading them) and -Include (to keep only the .htm files). Here's what this code looks like:

Get-ChildItem -Path D:\ -Filter *.htm -Include *.htm -Recurse

Remember, though, that -Filter uses the underlying APIs, so its speed depends on those APIs. Take, for example, this code:

Get-WmiObject -Class Win32_Product `
  -Filter 'Vendor LIKE "%Microsoft%"'

In this case, the -Filter argument works more slowly because the Get-WmiObject cmdlet goes through the Windows Management Instrumentation (WMI) Scripting API, and the filter itself is expressed in WMI Query Language (WQL), so the filtering occurs within WMI rather than in a fast low-level API.
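Even so, filtering inside WMI still beats retrieving everything and filtering in the pipeline. Here's an illustrative comparison (Win32_Service is used as the example class because it's much quicker to enumerate than Win32_Product):

```powershell
# Filtered by WMI itself: only matching objects cross into PowerShell
Get-WmiObject -Class Win32_Service -Filter 'State = "Running"'

# Filtered in the pipeline: every service object is created and shipped first
Get-WmiObject -Class Win32_Service | Where-Object { $_.State -eq 'Running' }
```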

Reduce Resource Usage

Performance optimization is about reducing resource usage as well as reducing execution time. Sometimes you can do both; other times, you have to make a choice. For example, suppose you need to list all the files on the D drive and do something to each file. You could use the ForEach-Object cmdlet to go through all the file-system objects in the collection, like this:

Get-ChildItem -Path D:\ -Recurse | ForEach-Object {. . .}

where {. . .} represents the code being run on each file. When you use this cmdlet, each object goes through extra packaging work when crossing the pipeline boundary, which slows the code down significantly. However, it doesn't consume much memory because only one item passes through the pipeline at a time.

Alternatively, you could use the iterative foreach loop, as shown in this code:

foreach($file in (Get-ChildItem -Path D:\ -Recurse)) {. . .}

where {. . .} represents the code being run on each file. This loop takes less time to run because it avoids pipeline boundaries. However, it collects all the file-system objects before processing them, so it can consume excessive system resources if the collection is large.

In short, the foreach loop is faster but uses more memory than the ForEach-Object cmdlet, so the foreach loop is generally the better choice when you don't expect large data sets.
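You can check which trade-off wins on your own data with Measure-Command. This sketch assumes a D:\ drive, as in the earlier examples, and uses $_.Length as a placeholder for the per-file work:

```powershell
# Pipeline version: low memory, but per-object packaging overhead
(Measure-Command {
  Get-ChildItem -Path D:\ -Recurse | ForEach-Object { $_.Length }
}).TotalSeconds

# Loop version: faster, but the whole collection is built in memory first
(Measure-Command {
  foreach($file in (Get-ChildItem -Path D:\ -Recurse)) { $file.Length }
}).TotalSeconds
```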

Throttle CPUs with Sleep

PowerShell code that touches many objects often requires a long time to execute and might not yield processor time willingly. This is less troublesome than it was in the days of single-core CPUs, but it can still leave the system spending a lot of time waiting for things to happen. If you have code that consumes a lot of CPU cycles or needs to wait for something to happen, use the Start-Sleep cmdlet to yield the processor regularly. By default, Start-Sleep operates in seconds, but you can specify a pause time in milliseconds. Clock resolution is typically no better than 10 to 20 milliseconds, so the smallest sleep time you'll probably want to specify is 20 milliseconds. In addition, you don't need a sleep cycle every time through a loop; yielding every few cycles is sufficient to ensure that the current CPU is also available for other work. Here's a loop that uses Start-Sleep and the remainder operator (%) to yield the processor every 10 items:

$i = 0
Get-ChildItem -Recurse |
  ForEach-Object {
    $i += 1
    if($i % 10 -eq 0){ Start-Sleep -Milliseconds 20 }
  }

A Basic Optimization Plan

You can combine these tips into a basic plan for optimizing scripts. First, don't worry about optimization until the script is complete. Next, when possible, use -Filter to restrict the number of items read into your script, and use -Include and -Exclude for further tweaking. This should reduce both running time and resource use. At that point, if you still have large numbers of items, consider replacing ForEach-Object pipeline elements with a foreach loop to speed up the script; just remember that if you're looking at hundreds of thousands of objects, this can cause other performance problems. Finally, if you find that your script has excess CPU consumption, use Start-Sleep in core loops that execute many times.
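Put together, a script following this plan might look like the following sketch, which reuses the D:\ .htm search from earlier. The per-file work is a placeholder, and the pipeline form is chosen here on the assumption that the result set is large:

```powershell
$i = 0
# -Filter cuts the bulk of files at the API level; -Include keeps only true .htm files
Get-ChildItem -Path D:\ -Filter *.htm -Include *.htm -Recurse |
  ForEach-Object {
    # . . . per-file work goes here . . .
    $i += 1
    # Yield the processor every 10 items to stay friendly to other workloads
    if($i % 10 -eq 0){ Start-Sleep -Milliseconds 20 }
  }
```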