PowerShell (My) Best Practices
A few Best Practices that work well for the majority of my Scripts.
PowerShell is one of my favorite scripting languages, and there are many reasons for that. The first is that it's the language I know best, so it gets points for familiarity. I like the modules you can import, how it works with Task Scheduler for automated runs, and the fact that I can replicate most of a module's functionality natively by writing my own code. I've written literally thousands of scripts over the years, mostly to take humans out of tedious processes; automating standard operations reduces the likelihood of human error. The only downsides are multithreading and some specific security and network-based operations that require Python, or are at least easier and better documented there. PowerShell is built for linear execution, not multithreading. I have only seen a need to multithread a few times for Infrastructure Management, and you can do it in PowerShell, but it gets complicated fast.
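For the rare case where parallel work is worth it, here is a minimal sketch using PowerShell 7's ForEach-Object -Parallel; the server names and throttle limit are placeholders, not anything from a real environment.

# Minimal parallel sketch (PowerShell 7+); server names are placeholders
$Servers = "SERVER01", "SERVER02", "SERVER03"
$Results = $Servers | ForEach-Object -Parallel {
    [PSCustomObject]@{
        Server = $_
        Online = Test-Connection -TargetName $_ -Count 1 -Quiet
    }
} -ThrottleLimit 5
$Results | Format-Table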
PowerShell (My) Best Practices
- Import Modules at the top of the script.
  - This lets you know which modules, and therefore which cmdlets, are available to the rest of the script.
- Variables should be placed under the modules so that you can reference them as needed within the script.
  - Having them at the top makes them easier to change without having to go through all of your code just to change a target or setting.
  - I rarely define variables that will regularly change in the body of a script; those go at the top, while dynamic variables go in the body.
    - Things like pulling a list of values from a .csv or .json file (see the sketch after this list).
- Create Functions under the Variables where appropriate. This allows you to reuse code throughout your script without having to duplicate it several times; you can call the function and feed the variables into it using the parameters that have been defined.
- When you need to connect to services such as M365 or PowerShell Remoting, put the code used to connect to the remote system next.
  - This lets you get all the logons done before the script starts its real work, minimizing interruptions asking for authentication while it runs.
  - This works for most scripts; occasionally you may have to change this workflow if you run into issues, but it is my go-to.
- The body of the script should use variables for anything that will change; since they are at the top of the script, you can modify them more easily and in a single location. I have a few suggestions about how to craft the script.
  - Create Functions where possible. This will cut down on coding time, they can be copied into other scripts, and they make your code easier to read.
    - They are also nice if you need to change something: you update the function once and it is updated throughout your script.
  - Script as if you will not be the person running it or understanding what it does (a sketch of comment-based help follows this list).
    - This cuts down on the need to document the script sections separately and makes it easier to modify in the future.
  - Less is More
    - While you can create an extremely elegant script, most of the time you just need it to do its function well.
    - If the code needs to be used by others, or by another process, make it clean.
  - Error Checking for production scripts.
    - While I don't waste time on Error Checking for scripts that only I use, if another user or process will run it, I will error check and log to a database or a .csv (a database sketch follows the layout example) to make sure I know where the script failed.
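As an example of the dynamic variables mentioned above, Import-Csv and ConvertFrom-Json are the usual way to pull values from a file at run time. This is only a sketch; the file paths and property names are made up.

# Pull dynamic values from a .csv or .json file (paths and property names are placeholders)
$Servers  = Import-Csv -Path "C:\Input\Servers.csv"
$Settings = Get-Content -Path "C:\Input\Settings.json" -Raw | ConvertFrom-Json

foreach ($Server in $Servers) {
    Write-Output "Processing $($Server.Name) with timeout $($Settings.TimeoutSeconds)"
}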
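For scripting as if someone else will run it, PowerShell's comment-based help is worth the few extra lines, since Get-Help picks it up automatically. The wording below is only an illustration of the format.

<#
.SYNOPSIS
    Disables stale user accounts.
.DESCRIPTION
    Pulls accounts that have not logged on in 90 days, disables them,
    and logs each change to a .csv for review.
.NOTES
    Intended to run as a scheduled task under an account with rights to disable users.
#>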
Code Examples: I will be adding usable code and examples in future Blog Posts. The layout below is meant as a guide; the code doesn't do anything useful and is more of an example of what was discussed above.
## Modules
Import-Module WhateverModuleYouWant
## Variables
$TestVar1 = "Testing1"
$TestVar2 = "Testing2"
$outputFile = "C:\Errors.csv"
## Functions
function TestFunction {
    param (
        $Test,
        $Test123 = "Testing 123"
    )
    # Placeholder body - just echoes the parameters it was given
    Write-Output "$Test - $Test123"
}
## Connect to Services
Connect-ExchangeOnline   # prompts for M365 authentication once, before the body runs
## Start Script Body
$errorList = @()   # collects any failures for the CSV export below

foreach ($item in $TestVar1, $TestVar2) {
    try {
        $Testing = TestFunction -Test $item
    } catch {
        # Capture error details
        $errorDetails = [PSCustomObject]@{
            TimeStamp = Get-Date
            Item      = $item
            Message   = $_.Exception.Message
        }
        $errorList += $errorDetails
    }
}
# Export the errors to a CSV file
if ($errorList.Count -gt 0) {
    $errorList | Export-Csv -Path $outputFile -NoTypeInformation -Force
    Write-Output "Errors have been logged to $outputFile"
} else {
    Write-Output "No errors occurred."
}
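For the database side of that error logging, one option is the SqlServer module's Invoke-Sqlcmd. This is a sketch only; the server, database, and ErrorLog table are assumptions, and a production script would use parameterized inserts rather than string building.

# Hypothetical: write each captured error to a SQL table instead of (or alongside) the CSV
Import-Module SqlServer
foreach ($err in $errorList) {
    $query = "INSERT INTO dbo.ErrorLog (TimeStamp, Item, Message) VALUES ('{0}', '{1}', '{2}')" -f $err.TimeStamp.ToString("s"), $err.Item, ($err.Message -replace "'", "''")
    Invoke-Sqlcmd -ServerInstance "SQL01" -Database "Automation" -Query $query
}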