I have been working on various scripts that populate information into a SQL table. The other day I wanted to see how long some of these tables took to populate. I could have updated the PowerShell script to include some Measure-Command calls, but then I’d have to re-execute my tasks just to measure them. Since all of the information was already in a table, and each row had a time stamp of when the data was entered, I could use a simple T-SQL SELECT statement to get the run-time data I wanted.
SELECT CAST(MAX(dataCollected) - MIN(dataCollected) AS time) AS TotalRunTime,
       COUNT(*) AS RecCount
FROM tblTestInserts
WHERE collectionID = (SELECT MAX(collectionID) FROM tblTestInserts)

(Note that an aggregate like MAX(collectionID) can’t be used directly in a WHERE clause, so the most recent run is selected with a subquery.)
The above example looks at two columns in my table. The first column (dataCollected) is of the datetime type and has a default binding of GETDATE(), so it is automatically populated with SQL Server’s date and time every time a record is inserted. The second column (collectionID) is an integer that is calculated each time I execute the script: inside PowerShell I select the maximum value of this column and then add 1. This provides a way to know how many times the task has been executed and which data is the most current.
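The collectionID increment described above can be sketched like this. This is only a minimal illustration, not my production script; the server, database, and table names are placeholders, and it assumes the SqlServer module’s Invoke-Sqlcmd is available:

```powershell
# Ask SQL Server for the highest collectionID so far.
# ISNULL handles the very first run, when the table is still empty.
$result = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDatabase' `
    -Query 'SELECT ISNULL(MAX(collectionID), 0) AS MaxID FROM tblTestInserts'

# Add 1 so every row inserted during this run shares a new, unique run number.
$collectionID = $result.MaxID + 1
```

Every insert performed during the run then stamps its rows with $collectionID, while the dataCollected column defaults to GETDATE() on its own.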
This gave me exactly the information I was looking for without needing to modify my PowerShell script and re-run the collection. It also lets me compare the current run to previous runs by changing the collectionID value in the WHERE clause. I can even be more specific: by adding an AND to the WHERE clause, I can see how long one particular part of the collection took to run.
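As an illustration of that narrowing, here is a variant of the query pinned to an earlier run. The taskName column and its value are hypothetical stand-ins for whatever column distinguishes the parts of your collection:

```sql
-- Time only one part of run number 5.
-- taskName is a placeholder for whatever column identifies the sub-task.
SELECT CAST(MAX(dataCollected) - MIN(dataCollected) AS time) AS TotalRunTime,
       COUNT(*) AS RecCount
FROM tblTestInserts
WHERE collectionID = 5
  AND taskName = 'UserCollection'
```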
In a future post I plan to share how I took a VBScript that needed 8 hours to complete and condensed it into a PowerShell script with a 28-minute execution time. (Here is a preview of the magic sauce: Group-Object -Property DN -AsHashTable -AsString.)
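As a rough preview of why that one-liner matters (the details will wait for the future post): grouping a collection into a hashtable keyed on a property turns repeated linear Where-Object scans into constant-time lookups. A minimal sketch, assuming $allUsers is some collection of objects that each have a DN property:

```powershell
# Build the hashtable once: keys are the string value of each object's DN,
# values are the matching object(s). This is a single pass over the data.
$byDN = $allUsers | Group-Object -Property DN -AsHashTable -AsString

# Each lookup is now a hash lookup instead of re-scanning the whole
# collection with Where-Object for every DN you need to find.
$match = $byDN['CN=Some User,OU=Users,DC=example,DC=com']
```

Done thousands of times in a loop, that difference between a scan per lookup and a single up-front pass is exactly the kind of change that collapses hours into minutes.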