Filter PowerShell Output with the Out-Gridview Cmdlet
Sort Output Before Sending to PowerShell Out-GridView
Use PowerShell to Add Commands to the Windows Explorer Command Bar
Use PowerShell to Create an Exchange 2010 Database Report
Proxy Functions: Spice Up Your PowerShell Core Cmdlets
Use PowerShell to Track Email Messages in Exchange Server
Use PowerShell Regular Expressions to Format Numbers
Use PowerShell Regular Expressions to Parse an RSS Feed
Use PowerShell to Explore Disk Utilization on Your Computer
Learn How to Create Custom Column Heads for PowerShell Out-GridView
Use PowerShell to Query All Event Logs for Recent Events
How to Improve the Performance of a PowerShell Event Log Query
Discover How to Filter Remote Event Log Entries in Windows Vista
Use PowerShell to Find WMI Classes that Contain Methods
Use PowerShell to Modify Your Environmental Path
Summary: Learn how to use Windows PowerShell to work with your environmental path variable.
Weekend Scripter: Use Windows PowerShell to Modify Your Path
Microsoft Scripting Guy Ed Wilson here. Welcome back to the weekend and Guest Blogger Sean Kearney. Read more about Sean and his previous guest blog posts. Now without further ado (or RDO or CDO), here is Sean.
Back in the good old days, we had a command called path in MS-DOS. It was used and abused heavily because that’s how you had to make things work. Either dump the application into a common folder or take its folder and do something like this:
PATH %PATH%;C:\THIS\;
This allowed you to run the application located in the THIS folder just by typing its name, rather than supplying an explicit path every time.
This is still available to us, of course, but it is isolated to the current session and is only temporary. However, in the Land of Windows PowerShell, there is still a need to modify PATH. You may have a common script folder or an added console application folder (such as C:\Sysinternals).
Whatever your reason for modifying the path, accessing the data in the PATH variable is easy, but making a modification can be confusing. Windows PowerShell natively provides a one-way path to read the system environment variables through the $ENV: drive:
$ENV:PATH
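Because the value is a single semicolon-delimited string, a quick way to read it one folder per line is to split it (a minimal sketch using only the built-in -split operator):

```powershell
# Split the semicolon-delimited PATH value into one folder per line
$ENV:PATH -split ';'
```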
But try as you might, there is no built-in Set-Path cmdlet or even an Append-Path (of course Append is not an approved verb) or Find-Path-to-the-Yellow-Brick-Road.
Of course, I’m not sure that last one would have been useful for anybody other than Dorothy or Toto.
But there is no obvious way to do it. Even running CMD.EXE as an administrator only makes the change temporary. So how do we gain this capability?
We leverage the registry with Windows PowerShell. The environment variables are all stored under the following key:
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment
Therefore, it just becomes a matter of editing the key. Now keep in mind this is a permanent and global change. You will need to be running Windows PowerShell as an administrator.
First, to read the key with the content, you need to use the following command:
Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH
The previous command dumps several properties to the screen—PSPath, PSParentPath, PSChildName, PSProvider—plus the actual Path value. We want just the path, so we modify the previous command to look like the following:
(Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH).Path
Now if we would like to add to that property, all we need to do is concatenate to the current value and set it back in the registry:
$oldPath=(Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH).Path
$newPath=$oldPath+';C:\NewFolderToAddToTheList\'
Set-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH -Value $newPath
You'll have to restart Windows PowerShell to see the change, but afterward all applications (including Windows) will have access to the updated path. But wouldn't it make more sense to turn these into cmdlets or advanced functions? I would think so. We can already use $ENV:PATH to read the path, so there is no sense reinventing the wheel; what I need is an easy way to add to it. And we can get fancier than the old DOS path command by adding some error checking, such as validating whether the folder is already in the path or whether it even exists.
Function global:ADD-PATH()
{
    [Cmdletbinding()]
    param
    (
        [parameter(Mandatory=$True,
        ValueFromPipeline=$True,
        Position=0)]
        [String[]]$AddedFolder
    )
    # Get the current search path from the environment keys in the registry.
    $OldPath=(Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH).Path
    # See if a new folder has been supplied.
    IF (!$AddedFolder)
        { Return 'No Folder Supplied. $ENV:PATH Unchanged' }
    # See if the new folder exists on the file system.
    IF (!(TEST-PATH $AddedFolder))
        { Return 'Folder Does not Exist, Cannot be added to $ENV:PATH' }
    # See if the new folder is already in the path.
    IF ($ENV:PATH | Select-String -SimpleMatch $AddedFolder)
        { Return 'Folder already within $ENV:PATH' }
    # Build the new path and write it back to the registry.
    $NewPath=$OldPath+';'+$AddedFolder
    Set-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH -Value $NewPath
    # Show our results back to the world.
    Return $NewPath
}
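A quick usage sketch (run from an elevated session; C:\Scripts is just an example folder name):

```powershell
# Add an existing tools folder to the machine-wide PATH
ADD-PATH 'C:\Sysinternals'

# The parameter also accepts pipeline input
'C:\Scripts' | ADD-PATH
```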
There you go: a nice new feature to add to your Windows PowerShell arsenal. If you want to get fancy and have a Get-Path, you might think an alias would do nicely, but an alias can only point to a command, not to a variable. Therefore, we will cheat with this one-line function add-on:
FUNCTION GLOBAL:GET-PATH() { Return $ENV:PATH }
We can get even fancier now and add the ability to remove items from the path with a little magic from the -replace operator:
Function global:REMOVE-PATH()
{
[Cmdletbinding()]
param
(
[parameter(Mandatory=$True,
ValueFromPipeline=$True,
Position=0)]
[String[]]$RemovedFolder
)
# Get the Current Search Path from the environment keys in the registry
$NewPath=(Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH).Path
# Find the value to remove and replace it with an empty string. [Regex]::Escape
# keeps the backslashes in the folder name from being read as regular-expression
# escapes. If the folder isn't found, nothing will change.
$NewPath=$NewPath -replace [Regex]::Escape($RemovedFolder),''
# Update the Environment Path
Set-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH -Value $NewPath
# Show what we just did
Return $NewPath
}
Remember, you will have to reload Windows PowerShell to see the new changes, and these changes are permanent. So be careful! To simplify getting the code, I put the functions into a module and uploaded it to the Script Center Script Repository.
Thank you, Sean, for providing a great article and module. Please join us tomorrow for more Windows PowerShell goodies from Sean.
I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.
Ed Wilson, Microsoft Scripting Guy
Create a Simple Graphical Interface for a PowerShell Script
Summary: Learn how to create a simple graphical interface for a Windows PowerShell script.
Weekend Scripter: Extending PowerShell to the GUI with Sapien Tools
Microsoft Scripting Guy Ed Wilson here. Sean Kearney joins us again today as our guest blogger. Read more about Sean and his previous blog posts.
Here is Sean!
While playing with Sapien PrimalForms, I was thinking, "If you had never used this, how could you figure out how to make it work with Windows PowerShell?" Stop falling down on your faces—I don't hate the GUI. Windows PowerShell is not about shutting down one interface in favor of another. It's about putting the tools you have to the best use for the job at hand.
Think about it. There is absolutely nothing wrong with Active Directory Users and Computers. It works. It's a great generic interface that applies to multiple needs and works well across a variety of platforms. But it isn't necessarily the best interface for a specific job, and each of us has tasks for which it is simply too much. A simple example is the person who works the Help Desk and just unlocks user accounts. (It's not the only thing they do, of course, but it makes a good example.) In Active Directory Users and Computers, they would have to find the account, pull up its properties, reset the unlock field, and maybe reset the password.
You get the idea. When really all I want to do is type in the name and have the job done. The computer should ask me what I need to do. The workflow for unlocking an account would typically be this:
WHO do you want to Unlock?
Unlock the Account
OR
WHO do you want to Reset?
WHAT will their new password be?
Unlock the Account
So in Windows PowerShell, I might do something like this for an unlock user script:
$USERFIRSTNAME=READ-HOST 'First Name'
$USERLASTNAME=READ-HOST 'Last Name'
GET-QADUSER -FirstName $USERFIRSTNAME -LastName $USERLASTNAME | UNLOCK-QADUSER
Or if I knew the structure of the SAM account, I could just key this in:
UNLOCK-QADUSER 'SAMAccountName'
This is obviously far more efficient than searching Active Directory, finding the darn button, clicking it, and so on. With little difficulty, I could pipe in a list of users and unlock them all.
So great, the administrator is happy. But how does this help your local Help Desk?
With Sapien PrimalForms Community Edition, you can create a basic GUI to extend that Windows PowerShell script into a graphical tool for the Help Desk, so that you don't need to retrain them. You can provision an interface to meet their job needs without incurring heavy development costs.
Here's a bonus you may not realize, too: the Windows PowerShell script that Sapien generates is a stand-alone script. All it does when you're done is generate the code needed within a Windows PowerShell script to call up forms in Windows. All features of those forms are fully supported because they are features of the GUI.
So here’s a simple form.
The Windows PowerShell script generated by Sapien to do this is shown here.
--------------------------------------HELPDESK.PS1-----------------------------------------------
#Generated Form Function
function GenerateForm {
########################################################################
# Code Generated By: SAPIEN Technologies PrimalForms (Community Edition) v1.0.8.0
# Generated On: 7/3/2011 11:35 AM
# Generated By: sean.kearney
########################################################################
#region Import the Assemblies
[reflection.assembly]::loadwithpartialname("System.Windows.Forms") | Out-Null
[reflection.assembly]::loadwithpartialname("System.Drawing") | Out-Null
#endregion
#region Generated Form Objects
$HelpDeskForm = New-Object System.Windows.Forms.Form
$UnlockAccountButton = New-Object System.Windows.Forms.Button
$InitialFormWindowState = New-Object System.Windows.Forms.FormWindowState
#endregion Generated Form Objects
#----------------------------------------------
#Generated Event Script Blocks
#----------------------------------------------
#Provide Custom Code for events specified in PrimalForms.
$handler_UnlockAccountButton_Click=
{
#TODO: Place custom script here
}
$OnLoadForm_StateCorrection=
{#Correct the initial state of the form to prevent the .Net maximized form issue
$HelpDeskForm.WindowState = $InitialFormWindowState
}
#----------------------------------------------
#region Generated Form Code
$HelpDeskForm.Text = "Our Help Desk"
$HelpDeskForm.Name = "HelpDeskForm"
$HelpDeskForm.DataBindings.DefaultDataSourceUpdateMode = 0
$System_Drawing_Size = New-Object System.Drawing.Size
$System_Drawing_Size.Width = 265
$System_Drawing_Size.Height = 55
$HelpDeskForm.ClientSize = $System_Drawing_Size
$UnlockAccountButton.TabIndex = 0
$UnlockAccountButton.Name = "UnlockAccountButton"
$System_Drawing_Size = New-Object System.Drawing.Size
$System_Drawing_Size.Width = 240
$System_Drawing_Size.Height = 23
$UnlockAccountButton.Size = $System_Drawing_Size
$UnlockAccountButton.UseVisualStyleBackColor = $True
$UnlockAccountButton.Text = "UNLOCK Account"
$System_Drawing_Point = New-Object System.Drawing.Point
$System_Drawing_Point.X = 13
$System_Drawing_Point.Y = 13
$UnlockAccountButton.Location = $System_Drawing_Point
$UnlockAccountButton.DataBindings.DefaultDataSourceUpdateMode = 0
$UnlockAccountButton.add_Click($handler_UnlockAccountButton_Click)
$HelpDeskForm.Controls.Add($UnlockAccountButton)
#endregion Generated Form Code
#Save the initial state of the form
$InitialFormWindowState = $HelpDeskForm.WindowState
#Init the OnLoad event to correct the initial state of the form
$HelpDeskForm.add_Load($OnLoadForm_StateCorrection)
#Show the Form
$HelpDeskForm.ShowDialog()| Out-Null
} #End Function
#Call the Function
GenerateForm
--------------------------------------HELPDESK.PS1-----------------------------------------------
The first time I looked at one of these, I almost fell down! So much code! But most of it is actually comments and object generation. Just look for the spot near the top where it states:
#TODO: Place custom script here.
If you look above, you’ll see it says:
$handler_UnlockAccountButton_Click=
This is the portion generated for Windows Forms that says, "When I click this, the code gets run." You could happily click it, and it would do nothing, because there is no code attached to the button. But we can easily create some new code that does an account unlock. We can take the same block from before and attach it to the button:
$handler_UnlockAccountButton_Click=
{
#TODO: Place custom script here
$USERFIRSTNAME=READ-HOST 'First Name'
$USERLASTNAME=READ-HOST 'Last Name'
GET-QADUSER -FirstName $USERFIRSTNAME -LastName $USERLASTNAME | UNLOCK-QADUSER
}
Now this will try to work, but it can't run properly from a form, because Read-Host will try to prompt in the console. What we'll have to do is create another form, or better, add some input fields to this one. With PrimalForms, I've added two pairs of controls to the form: two text boxes and two labels. I've tried to give them meaningful, descriptive names such as FirstNameLabel.
I could reproduce the code but the important stuff is near the top where you see the objects defined as variables:
#region Generated Form Objects
$HelpDeskForm = New-Object System.Windows.Forms.Form
$LastnameLabel = New-Object System.Windows.Forms.Label
$FirstNameLabel = New-Object System.Windows.Forms.Label
$LASTNAME = New-Object System.Windows.Forms.TextBox
$FIRSTNAME = New-Object System.Windows.Forms.TextBox
$UnlockAccountButton = New-Object System.Windows.Forms.Button
$InitialFormWindowState = New-Object System.Windows.Forms.FormWindowState
#endregion Generated Form Objects
The rest of it is the form being called up and the objects being added to it. Now look at the bottom of your script where it says GenerateForm. Everything up to, but not including, that line is a function definition. This is important to know because if I change the function so that it's defined in the global scope and don't run it from the script, I can call it later from the command line.
Knowing this is important. If you put it in the global context and switch your variables to global (for testing purposes), you can easily pull up the properties of those variables to see how your data is stored.
So to figure out where the information could be accessed on the $FIRSTNAME text box, I switched both the defined function at the top called GenerateForm and my $FIRSTNAME variable to global. I then removed the GenerateForm from the bottom of the script:
function global:GenerateForm {
#region Generated Form Objects
$HelpDeskForm = New-Object System.Windows.Forms.Form
$LastnameLabel = New-Object System.Windows.Forms.Label
$FirstNameLabel = New-Object System.Windows.Forms.Label
$LASTNAME = New-Object System.Windows.Forms.TextBox
$GLOBAL:FIRSTNAME = New-Object System.Windows.Forms.TextBox
$UnlockAccountButton = New-Object System.Windows.Forms.Button
$InitialFormWindowState = New-Object System.Windows.Forms.FormWindowState
#endregion Generated Form Objects
With this done, I just run…
GENERATEFORM
…from the Windows PowerShell console and key in some stuff such as is shown in the following figure.
And then afterward, I close the form.
To reveal all the properties and their values, I can now type:
$FIRSTNAME | Format-List
I can now see there is a field called Text, which has the value I entered in for $FIRSTNAME.
With this knowledge I can now extend the values I entered there into my Unlock Account button by changing the Read-Host to simply point to the values in $FIRSTNAME.TEXT and $LASTNAME.TEXT:
$handler_UnlockAccountButton_Click=
{
#TODO: Place custom script here
$USERFIRSTNAME=$FIRSTNAME.TEXT
$USERLASTNAME=$LASTNAME.TEXT
GET-QADUSER -FirstName $USERFIRSTNAME -LastName $USERLASTNAME | UNLOCK-QADUSER
}
We now remove GLOBAL: from before GenerateForm and $FIRSTNAME, and resave the script.
Now running the GenerateForm will produce our new UNLOCK USER piece for the Help Desk and allow them to just type in the first and last name to unlock a user.
We can of course go much further with this, such as adding confirmations, closing the window, or verifying that we found the proper user account. But the point of all this is to give you some baby steps; with those, maybe you can build something a lot more powerful.
Thanks Sean. That wraps up another weekend.
I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.
Ed Wilson, Microsoft Scripting Guy
Use PowerShell to Monitor Your SQL Server Performance
Summary: Guest Blogger Kevin Kline talks about using Windows PowerShell to monitor SQL Server performance.
Hey, Scripting Guy! I am wondering about using Windows PowerShell to work with SQL performance counters. Is this something that can be done?
—SH
Hello SH,
Microsoft Scripting Guy Ed Wilson here. We have a new guest blogger today. Kevin Kline is the technical strategy manager for SQL Server Solutions at Quest Software, a leading provider of award-winning tools for database management and application monitoring.
Kevin is a founding board member and former president of the international Professional Association for SQL Server (PASS) and frequently contributes to database technology magazines, websites, and discussion forums. Kevin was the recipient of the PASS 2009 Lifetime Achievement Award.
Kevin’s most popular book is SQL in a Nutshell (now in its third edition) published by O’Reilly Media. Kevin is also author or co-author of seven other IT books, including Transact-SQL Programming.
Tuning SQL Server: Windows PowerShell + PerfMon <> 2P. It's P2!
Performance tuning is one of the most rewarding and interesting parts of the IT pro’s job. Don’t like performance tuning? Then you’re missing out on an activity that provides endless variety, opportunity for innovation, and a chance to show the boss that you rock.
So what’s keeping you from jumping into performance tuning? Whenever I talk to customers—in every environment from small four-person IT shops to huge Fortune 50 enterprises—about why they don’t spend more time doing performance tuning, it usually comes down to two obstacles:
- The tyranny of the urgent: Boiled down to its essence, IT pros have too many fires to put out to actually get on top of performance. The analogy I like to use is, “When you barely have enough time to bail the water out of your sail boat to keep it from sinking, you certainly won’t have time to adjust the sails for better speed.”
- Skills: Or as my buddy Jermaine says “mad skilz.” Simply put, IT pros have so much to do and so little time to do it that they don’t have time to learn how to do one more thing. Our brains are full and learning something new not only requires a big investment in brain power, it will very likely lead to more responsibility without removing any of our other responsibilities. So not only do we not have the time to learn performance tuning, we don’t have the time for all of the follow-up responsibilities we will have to commit to in the future. The analogy I like to use is, “Ignorance is bliss, or at least less painful than not being ignorant.”
But PowerShell plus Performance Monitor (PerfMon, also known as System Monitor and, most recently, Windows Reliability and Performance Monitor) offers you a way to mitigate the obstacles mentioned above.
First, and in answer to the tyranny of the urgent, when you get a handle on performance tuning, urgent issues crop up less often—a lot less often. When you don’t know how to collect and interpret performance issues, every snag seems like an emergency. But many times, those red flags turn out to be red herrings instead. Instead of going into emergency firefighter mode every now and then, you’re forced to put on your fire hat every time the phone rings. Too much stress! Second, and in response to acquiring mad skilz, Scripting Guy Ed Wilson and I have teamed up to make performance tuning a “no-skilz required” activity. Why learn a whole new set of skills when you can learn just 20 percent of the content but get 80 percent of the productivity boost of being an expert? Invoke the power of the 80/20 rule by reusing these lessons freely.
What’s our objective?
We’re going to teach you how to:
- Perform a quick health check of SQL Server.
- Find the best performance counters for SQL Server performance tuning.
- Develop robust monitoring solutions.
Once you’ve followed our guidance here, performance tuning on SQL Server is no longer a black art. That’s because you’ll be able to shine the bright light of good instrumentation on your Windows and SQL Server instances.
Step 1: Perform a quick health check of SQL Server
Knowing which PerfMon counters to keep an eye on is half the battle. Maybe even more than half the battle. I’m going to start off by teaching you the handful of PerfMon counters, as well as their usage and values, that enable you to perform a quick health check of Windows and SQL Server.
A little later in the article, I’ll also show you where you can get a huge best practices collection for SQL Server PerfMon counters if you’re interested in looking at more than a handful of counters or are ready to go to the next step of hardcore performance tuning.
As a practical recommendation from a long-time SQL Server expert: SQL Server usually encounters problems in IO, memory, and then CPU (in order of commonality). The reason that SQL Server consumes an inordinate amount of IO, memory, and CPU is usually poor-quality SQL code, poor database design, or inadequate hardware (in order of commonality). Notice that poor code and poor design are usually the problem. That means that if you usually fix performance issues by throwing more hardware at the problem, you'll eventually end up where you started—with an underperforming application. This is because poor code and poor design will always use more resources than they have available.
So here’s your quick list of PerfMon counters to determine SQL Server health relating to IO, memory, and CPU. These three sections are analogous to every visit to the medical clinic that starts with pulse, blood pressure, and temperature checks. If one of these is out of whack, something is definitely wrong. We haven’t necessarily diagnosed an illness, but we know there’s definitely a problem.
IO
When assessing the top level of IO health, latency is your best quick health check. Latency means the amount of time measured between the initiation of an IO operation and its completion. Though these numbers can vary widely due to all sorts of variations in the underlying storage systems, here’s a good place to start:
Physical Disk: Avg. Disk sec/Read
Physical Disk: Avg. Disk sec/Write
The lower these values, the better. Microsoft recommends that a well-tuned IO subsystem should deliver IOs at 5 milliseconds or below on the disks holding the transaction log files, and at 20 milliseconds or less on the disks holding the data files. (The transaction logs and database files are on separate disks, right?) In real-world systems, I've seen applications perform well when reads and writes average a bit higher than these white-paper recommended values, but they are great rules of thumb. Numbers a lot higher than the recommendations mean that the IO subsystem is under stress.
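A quick spot check of those latency counters with Get-Counter might look like this (a sketch that assumes the English counter names; values are reported in seconds, so 0.005 corresponds to 5 milliseconds):

```powershell
# Sample average disk latency (in seconds) for every physical disk instance
Get-Counter '\PhysicalDisk(*)\Avg. Disk sec/Read',
            '\PhysicalDisk(*)\Avg. Disk sec/Write' |
    Select-Object -ExpandProperty CounterSamples |
    Select-Object InstanceName, Path, CookedValue
```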
Memory
A lot of IT pros new to SQL Server get quite alarmed when they see that SQL Server is gobbling up all of a server’s available memory. Not to worry—that’s by design. By default, SQL Server is configured to grab all of a server’s available physical memory (even if it doesn’t immediately use it all), and then give it back to other Windows processes whenever they ask for it. That way, SQL Server can optimize large blocks of memory for queries and major processes, such as big reporting jobs and backup processes.
When doing a quick check of SQL Server memory, I like to corroborate the findings by checking more than one indicator. In this case, these two counters provide an excellent quick indicator of memory pressure inside SQL Server:
SQL Server: Memory Manager >> Free List Stalls/sec
This counter monitors the number of requests per second where data requests stall because no buffers are available. Any value greater than 2 indicates that SQL Server needs more memory:
SQL Server: Memory Manager >> Memory Grants Pending
This counter shows the total number of processes per second waiting for a workspace memory grant. Numbers greater than 0 indicate a lack of memory.
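Both memory indicators can be checked in one call (a sketch that assumes a default SQL Server instance, where the counter object is named SQLServer:Memory Manager; named instances use a different object name):

```powershell
# Check both SQL Server memory-pressure indicators on a default instance
Get-Counter '\SQLServer:Memory Manager\Free List Stalls/sec',
            '\SQLServer:Memory Manager\Memory Grants Pending' |
    Select-Object -ExpandProperty CounterSamples |
    Select-Object Path, CookedValue
```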
CPU
It's not very common for SQL Server, when running OLTP workloads, to use up major amounts of CPU. Even a busy SQL Server will usually use between 25 and 45 percent of CPU when responding to a transaction-heavy workload. SQL Server will use more CPU for BI applications, but even then it usually consumes only an added 20 to 30 percent of CPU. To find out the total amount of CPU being used and the proportion being used by SQL Server, check these counters:
Processor(_total): % Processor Time
This is the percentage of elapsed time the processor spends executing work (in other words, nonidle threads). On a box dedicated to SQL Server, I raise a red flag if this is frequently above 80 percent:
Process (sqlservr): % Processor Time
This shows the percentage of processor time spent exclusively on SQL Server process threads. Combining this value with the Processor: % Processor Time value will show you conclusively how much CPU the server is using overall compared with how much of it SQL Server is using.
As I pointed out earlier in the memory section, it's sometimes very useful to corroborate a finding with other measures to ensure you have a clear picture of the issue. A quick and easy check of CPU pressure to add to the mix follows:
System: Processor Queue Length
It's sometimes a bit of a hassle trying to figure out CPU utilization based on percentages when you factor in multiple cores, hyperthreading, and virtualization. So it's often valuable to check not a percentage-based metric but the raw number of threads waiting for access to the resource. In this case, the processor queue length is a great resource because it represents the number of threads waiting for CPU: anything above 12 per CPU is a red flag; values from 9 to 12 per CPU are fair; 5 to 8 is better; and 4 or less is best.
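The three CPU-health checks above can be gathered together in a single sketch (the sqlservr process instance assumes a default SQL Server installation):

```powershell
# Gather the three CPU-health counters in a single call
$cpu = Get-Counter '\Processor(_Total)\% Processor Time',
                   '\Process(sqlservr)\% Processor Time',
                   '\System\Processor Queue Length'

# Print each counter path with its sampled value
$cpu.CounterSamples | ForEach-Object {
    '{0} = {1:N2}' -f $_.Path, $_.CookedValue
}
```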
Step 2: Find the best performance counters for SQL Server performance tuning
In the previous section, I pointed out a couple of PerfMon counters that quickly assess the health of your SQL Server's IO, memory, and CPU. These quick checks basically show whether your SQL Server is overextended or not. But they don't reveal the root cause that is contributing to overconsumption of IO, memory, or CPU. For example, you might have memory issues on a SQL Server because stored procedures running on the server are constantly recompiling and are never able to stay in the cache for long. Similarly, a SQL Server might be showing high IO consumption when, at the root of the problem, the transactions running on the server are constantly blocking one another and preventing one another from completing quickly. There are PerfMon counters for that, too.
I’ve spent a lot of time building the ultimate list of SQL Server PerfMon counters and the troubleshooting scenarios when they’re most useful. Rather than run through all of them here, please take a look at my website to take a deeper dive.
Step 3: Develop robust monitoring solutions
Now that you know what to look for, it’s time to use Windows PowerShell to regularly poll your SQL Servers for these PerfMon values. I strongly encourage you to use Windows PowerShell not just for an occasional spot check of these values but to build a regular monitoring solution that runs at a rather frequent polling interval—say, 5 or 15 minutes, saving the data as you go along. That way you can save long-term performance information and look for trends and problem areas in your system.
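A minimal polling sketch, using Get-Counter's built-in sampling parameters (the counter list is just a representative subset of the health checks above):

```powershell
# Take one sample of the quick-health counters every 300 seconds (5 minutes);
# 12 samples covers an hour of monitoring
$counters = '\PhysicalDisk(*)\Avg. Disk sec/Read',
            '\SQLServer:Memory Manager\Memory Grants Pending',
            '\Processor(_Total)\% Processor Time'
$samples = Get-Counter -Counter $counters -SampleInterval 300 -MaxSamples 12
```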
To get PerfMon information by using Windows PowerShell, we'll use the Get-Counter cmdlet. Without parameters, this cmdlet retrieves a handy set of summary information about the local computer. However, we want to retrieve performance information from specific computers. We do that by providing the name of the server in the counter path, as shown here (Get-Counter also accepts a -ComputerName parameter):
PS > $computer = $ENV:Computername
PS > Get-Counter "\\$computer\processor(_total)\% processor time"
Timestamp CounterSamples
--------- --------------
6/11/2011 11:16:44 AM \\...\processor(_total)\% processor time :
25.4520932356424
Don’t know the path to the performance counter you want? Then use the -ListSet parameter to search for just the right counter or set of counters. To see all counter sets, use an asterisk as the parameter value.
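For example, a short sketch of discovering counters with -ListSet:

```powershell
# List every counter set available on the local computer
Get-Counter -ListSet * | Select-Object CounterSetName

# Expand the counter paths of a likely-looking set
(Get-Counter -ListSet Memory).Paths
```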
When building your database of monitored values, export the retrieved counter information using the Export-Counter cmdlet to save your data in a format that other tools can read, such as the .blg format used by Windows Performance Monitor. That way you can use the nice graphic tools in Windows Performance Monitor for easy graphic analysis of performance and problems.
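That export step might be sketched like this (the log path is hypothetical; .blg files open directly in Windows Performance Monitor):

```powershell
# Capture ten samples of a counter and save them as a .blg file
Get-Counter '\Processor(_Total)\% Processor Time' -MaxSamples 10 |
    Export-Counter -Path 'C:\PerfLogs\cpu.blg' -FileFormat blg
```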
I hope you find this quick introduction to SQL Server performance monitor counters and Windows PowerShell to be useful. Please look me up on Twitter and on my blog at http://KevinEKline.com.
Thanks, Kevin, that was exactly what I was looking for.
I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.
Ed Wilson, Microsoft Scripting Guy
Import Counters from a Perfmon Chart into PowerShell
Summary: Learn how to automatically import performance counters from a Perfmon chart into Windows PowerShell for ease of analysis.
Hey, Scripting Guy! I have several custom Perfmon charts set up. Like many network administrators, I spent a decent amount of time setting up this chart. I would like to see if there is a way I can automatically import those counters into a Windows PowerShell script. It may sound like a goofy request, but if you saw the actual number of charts and counters, this would save me a huge amount of time. I am in the same hemisphere as you, which means it is summer: I would rather spend my time playing golf than copying Perfmon counters into Windows PowerShell scripts. My golf handicap needs your assistance. Please!
—YM
Hello YM,
Microsoft Scripting Guy Ed Wilson here. As someone who used to be a scratch golfer (at least when playing a golf simulator on my computer), I sympathize with your predicament. Actually, as someone who absolutely despises rework, I really really sympathize with your situation. I would almost be willing to say, dude (or dudette as the case may be), but you will gain so much more in productivity if you convert your Perfmon charts to Windows PowerShell scripts that it will actually be worth the effort.
Before I get too carried away, I need to tell you a little bit about the Performance Monitor tool. Perfmon is one of my favorite Microsoft tools. I fell in love with it when I was writing a chapter for the MCSE for Dummies book for the NT 4 in the Enterprise exam. I was so excited about the tool that I thought about writing an entire book on the subject; instead, I wrote a book called Network Monitoring and Analysis: A Protocol Approach to Troubleshooting.
Unfortunately, there is no really straightforward way to export counters from Perfmon. There are lots of ways to export your settings and various configurations, but no easy way to export only the counters. I will use an example to make this clear.
I have created a custom performance counter set, as is shown in the following figure.
I can right-click the graph portion and choose Save Counters As from the shortcut menu. The Web Page option is probably the easiest to work with.
As an aside, I can open the web page in Internet Explorer, click the Unfreeze Display button, and start a new real-time trace. It is really cool, and is shown in the following figure.
To create a new data collector set from a Performance Monitor graph, right-click Performance Monitor in the left pane, and click New Data Collector Set in the shortcut menu. All the counters from the graph will automatically appear in the new Data Collector Set.
When I have a custom Data Collector Set, I can start the data collection in the log file that was configured during the creation of the Data Collector Set. I prefer the binary file type (.blg) because it imports easily via the Import-Counter cmdlet.
These steps are not necessarily something that every network administrator will need to accomplish. In fact, if you already have custom Performance Monitor charts and Data Collector Sets, there are probably already trace log files that have been created. All I need to pull the counters is a single snapshot.
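As an aside, the Import-Counter cmdlet can also list the counter sets contained in a trace log without importing every sample, which is a handy way to check what a log holds before committing to a full import (the file path here is illustrative):

```powershell
# Show which counter sets, and their counter paths, are stored in the log
Import-Counter -Path 'C:\PerfLogs\System Monitor Log.blg' -ListSet * |
    Select-Object CounterSetName, Paths
```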
Now for the fun part.
After I have a .blg file, I can use the Import-Counter cmdlet to import the log. I can then use normal Windows PowerShell techniques to parse the file and analyze the data. Windows PowerShell makes it easy to look through massive amounts of data and search for anomalies or patterns. But the specific task at hand is to use Windows PowerShell to query the same counters that are defined in the custom Perfmon trace. To do this, I provide the path to the .blg file, and I store the returned performance data in a variable. This technique is shown here:
$counters = Import-Counter -Path "C:\Users\edwilson\Perf\System Monitor Log.blg"
After I have the data stored in a variable, I can explore the contents as illustrated in the following figure.
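In place of the figure, a few quick commands can explore the imported data; Import-Counter returns a collection of sample-set objects, each with a Timestamp property and a CounterSamples property:

```powershell
$counters = Import-Counter -Path "C:\Users\edwilson\Perf\System Monitor Log.blg"

$counters.Count            # number of sample sets in the log
$counters[0] | Get-Member  # shows the Timestamp and CounterSamples properties
$counters[0].Timestamp     # when the first sample set was taken
```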
It is obvious I am dealing with a collection, so I am able to index directly into the collection by using a square bracket and a number. Each slice will contain the same types of data (of course, the actual values will vary). As a result, I can simply index into the first element by using a zero. This technique is shown here (with truncated output):
PS C:\> $counters[0].countersamples
Path                               InstanceName       CookedValue
----                               ------------       -----------
\\edwils1\tcpv4\connection fai...                               0
\\edwils1\tcpv4\segments/sec                                    0
<output is truncated … bigtime …>
The property I want is the path to each counter. I can use the ForEach-Object cmdlet (aliased as %) to retrieve the paths and store them in a variable. This technique is shown here:
$paths = $counters[0].countersamples | % {$_.path}
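An equivalent and slightly more idiomatic way to pull out just the path values is Select-Object with its ExpandProperty parameter, which returns the values as plain strings:

```powershell
# Same result as the ForEach-Object (%) version above
$paths = $counters[0].CounterSamples |
    Select-Object -ExpandProperty Path
```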
Now that I have a collection of paths, I might be inclined to examine them. I can do this by displaying the value of the $paths variable. This is shown here with truncated output:
PS C:\> $paths
\\edwils1\tcpv4\connection failures
\\edwils1\tcpv4\segments/sec
\\edwils1\tcpv4\connections established
\\edwils1\tcpv4\connections reset
\\edwils1\tcpv4\segments received/sec
How many paths do I have? I can use the Measure-Object cmdlet to find:
PS C:\> $paths | Measure-Object
Count : 81
Average :
Sum :
Maximum :
Minimum :
Property :
The cool thing about the Get-Counter cmdlet is I can easily pipe to it. The command and associated output are shown here (output is truncated):
PS C:\> Get-Counter -Counter $paths
Timestamp CounterSamples
7/20/2011 7:34:06 PM \\edwils1\network interface(intel[r] 82566mm gigabit network connection)\
bytes total/sec :
0
\\edwils1\network interface(local area connection* 9)\bytes total/sec :
0
\\edwils1\network interface(6to4 adapter)\bytes total/sec :
0
The complete sequence of commands required to import the performance monitor log, store the results in a variable, choose the path from the first instance of the counter samples, store the resultant custom object in a variable, and pipe the variable to the Get-Counter cmdlet to perform a query is shown here:
$counters = Import-Counter -Path "C:\Users\edwils\Desktop\HSG-New\System Monitor Log.blg"
$paths = $counters[0].countersamples | % {$_.path}
$counterData = Get-Counter -Counter $paths
I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.
Ed Wilson, Microsoft Scripting Guy
Capture Performance Counter Data and Write to SQL Server
Summary: Learn how to use Windows PowerShell to capture performance counter information and write the saved data to a SQL Server database.
Hey, Scripting Guy! I have been enjoying your PoshMon articles this week, but I have found them to be of limited value. I would love to see how I could store this performance data in a SQL database. I could then use SQL Reporting Services to parse the data and create reports.
—RS
Hello RS,
Microsoft Scripting Guy Ed Wilson here. I am in the middle of a two-week road trip. The engagement with the Central Ohio Windows PowerShell Users group was great, and so was SQLSaturday in Wheeling, West Virginia. I had never been to Wheeling—it is actually a pretty cool place and is relatively near both Columbus, Ohio, and Pittsburgh, Pennsylvania. Now I am in Seattle, Washington, for the internal-to-Microsoft TechReady conference, and it has been a great experience. I have gotten to see many of my friends from other locations, so it has been like a reunion in one respect. I have also learned a lot both in terms of perceived customer needs and technical aspects.
RS, I am going to use the technique from yesterday’s blog post wherein I automatically glean the performance counters from a binary performance trace file. The reason for doing this is that it greatly simplifies the task of specifying the performance counters. For that post, I carefully selected a nice collection of counters by using the graphical selector tool in the Performance Monitor tool, and I do not want to repeat that task. The original code is shown here:
$counters = Import-Counter -Path "C:\Users\edwils\Desktop\HSG-New\System Monitor Log.blg"
$paths = $counters[0].countersamples | % {$_.path}
Get-Counter -Counter $paths
For a more permanent solution, it would be trivial to export the counter paths to a text file. You could then use the Get-Content cmdlet to read the text file and populate the $paths variable when needed, saving you from parsing the .blg file each time. The code to do this is shown here:
$paths | Out-File -FilePath c:\fso\ExportPaths.txt -Encoding ascii -Append
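Reading the saved paths back in later is a one-liner. (Note that the -Append switch above means repeated runs will add duplicate paths to the file, so you may prefer to omit it.)

```powershell
# Rehydrate the counter paths from the text file and query them
$paths = Get-Content -Path C:\fso\ExportPaths.txt
Get-Counter -Counter $paths
```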
In fact, I like the idea of writing the counters to a data file, so I decided to modify the original code to use the Tee-Object cmdlet to send the output to both a variable and a text file at the same time. This is not what Tee-Object is typically used for, but because Tee-Object writes to the screen when it is the last command in the pipeline, I capture its output in a variable instead. This gives me both a file and a variable at the same time. The code to write to both a variable and a file is shown here (the % symbol is an alias for the ForEach-Object cmdlet):
$paths = $counters[0].countersamples | % {$_.path} | Tee-Object -FilePath c:\fso\testpaths.txt
The advantage of using Tee-Object is the code is still three lines long, but I now get both the variable populated and the text file created all at once. The modified code is shown here:
$counters = Import-Counter -Path "C:\Users\edwils\Desktop\HSG-New\System Monitor Log.blg"
$paths = $counters[0].countersamples | % {$_.path} | Tee-Object -FilePath c:\fso\counterpaths.txt
Get-Counter -Counter $paths
The text file that contains the counter paths is shown here.
After I have my counters in a text file, I can use the Get-Content cmdlet to read the counter text file for the counter paths. I specify 20 samples and a sample interval of 6 seconds. This will give me 20 readings over a two-minute period. When I have completed gathering my data, I use the Export-Counter cmdlet to export my performance data to a CSV file. It is much better to use this cmdlet than to attempt manually creating the CSV data.
These four lines of code are really two logical lines of code. I use the line continuation character at the end of the first and third lines to break the code to the next lines to make the code easier to read on the blog. The code to read counter paths from a text file, pass the paths to the Get-Counter cmdlet to retrieve 20 samples at 6-second intervals, store the results in a variable, and export the data to a CSV file is shown here:
$counterData = Get-Counter -Counter (Get-Content C:\fso\counterPaths.txt) `
-MaxSamples 20 -SampleInterval 6
Export-Counter -Path c:\fso\counterData.csv -FileFormat csv `
-InputObject $counterData
The script and associated output (there is no output) are shown here in a screen shot of the Windows PowerShell ISE.
The data that is collected outputs to a CSV file. By default, CSV files associate with Microsoft Excel (if installed), and the CSV file created by running the script and exporting the counter information are shown in the following figure.
After I have verified that I have captured the data in an acceptable format, I am ready to import it into a database. One note: if anything changes between creating the .blg file, generating the counter path file, and running the capture script, you might receive an error about an “invalid” counter path. This can happen, for example, if a counter targets the wired network connection on a laptop, and you later run the script on a wireless connection with the wired network adapter disabled. All the other data is still captured; only the invalid Perfmon counters are mentioned in the error message, and no data is captured for those instances. A sampling of such errors is shown in the following figure.
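One way to let a capture run past invalid counter paths is to downgrade the errors and record them for review afterward; here is a sketch:

```powershell
# -ErrorAction SilentlyContinue keeps the valid counters collecting;
# -ErrorVariable saves the failures so they can be reviewed later
$counterData = Get-Counter -Counter (Get-Content C:\fso\counterPaths.txt) `
    -MaxSamples 20 -SampleInterval 6 `
    -ErrorAction SilentlyContinue -ErrorVariable counterErrors

# List the counter paths that could not be queried
$counterErrors | ForEach-Object { $_.Exception.Message }
```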
To import my newly created CSV file into my database, I am going to use the SQL Server Import and Export Wizard. I created my database earlier by using Microsoft SQL Server Management Studio, but I could have used the SQLPSX cmdlets from CodePlex. I am using SQL Server 2008 R2 Express Edition, which is a free download. One disadvantage of using the Express Edition is that I cannot save my import package.
After I have chosen my data source, selected the file, and specified that the column names are in the first data row, I go to the next page where I can view the way the data will be imported. If it does not look acceptable, I can go back and make advanced changes. In this case, however, everything looks fine, as is shown in the following figure.
Now I need to choose which database I am going to use. I am going to use my PoshMon database on my SQL Server Express Edition, and connect via Windows Authentication. This is shown in the following figure.
Now I need to map the data to a particular table. I am going to wimp out and allow the wizard to automatically create a new table for me. The table will be called counterData, which was the name from the spreadsheet. The table mapping is shown in the following figure.
It is time to click a few more times and allow the import to run. The results appear on the final screen. If a problem occurs, a report will be available under the Message column. The successful conclusion page is shown here.
After the import has completed, I like to run a query from inside the SQL Server Management Studio. I can right-click the newly created table and choose Select Top 1000 Rows from the shortcut menu. The generated query and associated data are shown in the following figure.
That is it for querying performance counters and writing them to a SQL database. It can be lots of fun. Download SQL Express today, and give it a whirl.
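If you would rather script the import than click through the wizard, here is a hedged sketch using the Write-SqlTableData cmdlet; it assumes the SqlServer module is installed and that the PoshMon database already exists on the local Express instance:

```powershell
# Requires: Install-Module SqlServer
Import-Module SqlServer

# Read the exported counter data and bulk-write it to the database;
# -Force creates the dbo.counterData table if it does not exist yet.
$rows = Import-Csv -Path C:\fso\counterData.csv
Write-SqlTableData -ServerInstance '.\SQLEXPRESS' -DatabaseName PoshMon `
    -SchemaName dbo -TableName counterData -InputData $rows -Force
```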
I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.
Ed Wilson, Microsoft Scripting Guy
Use Performance Counter Sets and PowerShell to Ease Baselining
Summary: Microsoft Scripting Guy Ed Wilson teaches how to use performance counter sets with Windows PowerShell to simplify profiling a system.
Hey, Scripting Guy! I am wondering if there is an easier way to work with performance counters? For example, rather than having to pick out a whole bunch of counters, are there groups of counters that I can use? If not, that is ok, but I feel like I had to ask. By the way, the Scripting Guys rock. I just had to say it.
—BB
Hello BB,
Microsoft Scripting Guy Ed Wilson here. I want to thank you for your vote of confidence. One of the things I enjoy so much about speaking in person to users groups (like I did in Columbus, Ohio) or appearing at events (like SQLSaturday in Wheeling, West Virginia) is getting to meet people who come up to me and say, “I have been reading your stuff for years, and you have saved me so many times.” And though it is fun to meet “groupies,” it is more fulfilling to meet people who have read my material and have been able to reduce their workload.
Anyway, BB, if you don’t ask, I cannot help you. Yes, you can query groups of counters! To see a list of all the counter sets, use the ListSet parameter and use a wildcard character. The ListSet parameter returns detailed information about counter sets. To limit the output to only the counter set names, select only the counterset property. The commands that will return counter set information or limit to only counter set names are shown here:
#Produce listing of all counter sets
Get-Counter -ListSet *
#to see only the counter set names
Get-Counter -ListSet * | select countersetname
Using the wildcard character trick, it is possible to search for specific counter sets that relate to a particular technology. A few sample searches and their associated output are shown here.
PS C:\Users\edwilson> Get-Counter -ListSet *disk*| select countersetname
CounterSetName
LogicalDisk
PhysicalDisk
PS C:\Users\edwilson> Get-Counter -ListSet *processor* |select countersetname
CounterSetName
Processor Information
Per Processor Network Activity Cycles
Per Processor Network Interface Card Activity
Processor
PS C:\Users\edwilson> Get-Counter -ListSet *memory* | select countersetname
CounterSetName
.NET CLR Memory
MSSQL$SQLEXPRESS:Memory Manager
Memory
.NET Memory Cache 4.0
If I use a wildcard character pattern that matches a single countersetname value, and I do not select only the countersetname with the Select-Object cmdlet (as I was doing earlier), I receive detailed information about the counter set. The properties and values associated with the LogicalDisk counter set (I use the wildcard pattern l*disk) are shown in the output here:
PS C:\> Get-Counter -ListSet l*disk
CounterSetName : LogicalDisk
MachineName : .
CounterSetType : MultiInstance
Description : The Logical Disk performance object consists of counters that monitor logical partitions of hard or fixed disk drives. Performance Monitor identifies logical disks by their drive letter, such as C.
Paths              : {\LogicalDisk(*)\% Free Space, \LogicalDisk(*)\Free Megabytes, \LogicalDisk(*)\Current Disk Queue Length, \LogicalDisk(*)\% Disk Time...}
PathsWithInstances : {\LogicalDisk(HarddiskVolume1)\% Free Space, \LogicalDisk(C:)\% Free Space, \LogicalDisk(_Total)\% Free Space, \LogicalDisk(HarddiskVolume1)\Free Megabytes...}
Counter            : {\LogicalDisk(*)\% Free Space, \LogicalDisk(*)\Free Megabytes, \LogicalDisk(*)\Current Disk Queue Length, \LogicalDisk(*)\% Disk Time...}
Two properties are of particular interest: the paths property and the pathsWithInstances property. The counter paths in the paths property use a wildcard character mapping, and do not map to specific instances of the resource. The command and associated output are shown here:
PS C:\Users\edwils> (Get-Counter -ListSet l*disk).paths
\LogicalDisk(*)\% Free Space
\LogicalDisk(*)\Free Megabytes
\LogicalDisk(*)\Current Disk Queue Length
\LogicalDisk(*)\% Disk Time
\LogicalDisk(*)\Avg. Disk Queue Length
\LogicalDisk(*)\% Disk Read Time
\LogicalDisk(*)\Avg. Disk Read Queue Length
\LogicalDisk(*)\% Disk Write Time
\LogicalDisk(*)\Avg. Disk Write Queue Length
\LogicalDisk(*)\Avg. Disk sec/Transfer
\LogicalDisk(*)\Avg. Disk sec/Read
\LogicalDisk(*)\Avg. Disk sec/Write
\LogicalDisk(*)\Disk Transfers/sec
\LogicalDisk(*)\Disk Reads/sec
\LogicalDisk(*)\Disk Writes/sec
\LogicalDisk(*)\Disk Bytes/sec
\LogicalDisk(*)\Disk Read Bytes/sec
\LogicalDisk(*)\Disk Write Bytes/sec
\LogicalDisk(*)\Avg. Disk Bytes/Transfer
\LogicalDisk(*)\Avg. Disk Bytes/Read
\LogicalDisk(*)\Avg. Disk Bytes/Write
\LogicalDisk(*)\% Idle Time
\LogicalDisk(*)\Split IO/Sec
If I use the pathsWithInstances property, it will triple the number of counters (on my laptop anyway). The pathsWithInstances property returns the counter path, and then a path for each instance. On my Hyper-V server with seven drives, the pathsWithInstances property will return an instance for each logical drive, as well as an instance for the _Total instance. Compare the truncated output here with the first few lines of output from the previous command:
PS C:\Users\edwils> (Get-Counter -ListSet l*disk).PathsWithInstances
\LogicalDisk(HarddiskVolume1)\% Free Space
\LogicalDisk(C:)\% Free Space
\LogicalDisk(_Total)\% Free Space
\LogicalDisk(HarddiskVolume1)\Free Megabytes
\LogicalDisk(C:)\Free Megabytes
\LogicalDisk(_Total)\Free Megabytes
To verify the number of paths, I piped the Get-Counter command to the Measure-Object cmdlet and returned only the count property. The two commands I used are shown here:
PS C:\Users\edwils> ((Get-Counter -ListSet l*disk).Paths | Measure-Object).count
23
PS C:\Users\edwils> ((Get-Counter -ListSet l*disk).PathsWithInstances | Measure-Object).count
69
After I have picked out the counter set, I can use either the paths property or the pathsWithInstances property to query. All I need to do is supply the counter paths directly to the counter parameter of the Get-Counter cmdlet. I like to pipe the results to more so that I can page through the output (the pager does not work in the output pane of the Windows PowerShell ISE). The code to do this is shown here:
Get-Counter -Counter (Get-Counter -ListSet l*disk).Paths | more
The command and associated output are shown in the following figure.
I can use the Get-Counter cmdlet to return processor information, memory utilization, and logical and physical disk activity. The following commands accomplish this task:
Get-Counter -Counter (Get-Counter -ListSet "Processor Information").paths
Get-Counter -counter (Get-Counter -listSet "memory").paths
Get-Counter -counter (Get-Counter -listSet "LogicalDisk").paths
Get-Counter -counter (Get-Counter -listSet "PhysicalDisk").paths
Because the counter property accepts an array (in fact, each command above supplies an array), I can create my own array of counter paths and make a single query with the Get-Counter cmdlet. In the following command, I get all of the paths from the above four commands and store them in a single variable.
$a = (Get-Counter -ListSet "Processor Information").paths
$a += (Get-Counter -listSet "memory").paths
$a += (Get-Counter -listSet "LogicalDisk").paths
$a += (Get-Counter -listSet "PhysicalDisk").paths
The $a variable contains 99 paths on my laptop. I found this by using the following command:
($a | Measure-Object).count
I can now query for performance information from all 99 counters at once:
Get-Counter -Counter $a
What about performance? On my laptop, the first command takes about four seconds. This command is shown here:
PS C:\Users\edwils> Measure-command -expression `
{
Get-Counter -Counter (Get-Counter -ListSet "Processor Information").paths
Get-Counter -counter (Get-Counter -listSet "memory").paths
Get-Counter -counter (Get-Counter -listSet "LogicalDisk").paths
Get-Counter -counter (Get-Counter -listSet "PhysicalDisk").paths
}
Days : 0
Hours : 0
Minutes : 0
Seconds : 4
Milliseconds : 96
Ticks : 40963324
TotalDays : 4.74112546296296E-05
TotalHours : 0.00113787011111111
TotalMinutes : 0.0682722066666667
TotalSeconds : 4.0963324
TotalMilliseconds : 4096.3324
After I have put all of the paths in a variable, the command takes about a second, as shown here:
PS C:\Users\edwils> Measure-command -Expression { Get-Counter -Counter $a }
Days : 0
Hours : 0
Minutes : 0
Seconds : 1
Milliseconds : 69
Ticks : 10690628
TotalDays : 1.2373412037037E-05
TotalHours : 0.000296961888888889
TotalMinutes : 0.0178177133333333
TotalSeconds : 1.0690628
TotalMilliseconds : 1069.0628
Okay, that is cheating a bit because I did not time obtaining the paths first. But when I time the command that picks up all the paths and then performs the query, I still see a huge improvement in performance. (I would need to do some additional testing, such as rebooting the machine between queries, to rule out any caching that might be taking place, but this looks like a great way to speed up querying lots of performance counters.)
PS C:\Users\edwils> Measure-command `
{
$a = (Get-Counter -ListSet "Processor Information").paths
$a += (Get-Counter -listSet "memory").paths
$a += (Get-Counter -listSet "LogicalDisk").paths
$a += (Get-Counter -listSet "PhysicalDisk").paths
Get-Counter -Counter $a
}
Days : 0
Hours : 0
Minutes : 0
Seconds : 1
Milliseconds : 102
Ticks : 11021647
TotalDays : 1.27565358796296E-05
TotalHours : 0.000306156861111111
TotalMinutes : 0.0183694116666667
TotalSeconds : 1.1021647
TotalMilliseconds : 1102.1647
Well, BB, this ends another Hey, Scripting Guy! Blog post. I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.
Ed Wilson, Microsoft Scripting Guy