Yesterday we ended with a developerish PowerShell script:
It loops over all objects in the $pst array: for each item, it generates a new file name and copies the item to another folder under that new name.
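The original listing did not survive, but based on the description above it looked something like this. A minimal sketch, assuming $pst holds file objects from Get-ChildItem and hypothetical source, destination and naming choices:

```powershell
# Assumption: $pst holds FileInfo objects for the .pst files we want to back up
$pst = Get-ChildItem -Path C:\pst -Filter *.pst

foreach ($item in $pst) {
    # Generate a new file name, here by prefixing the last write date
    $newName = "{0:yyyyMMdd}_{1}" -f $item.LastWriteTime, $item.Name
    # Copy the item to another folder under the new name
    Copy-Item -Path $item.FullName -Destination (Join-Path C:\backup $newName)
}
```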
There is another way to accomplish the same thing: use the pipeline and, instead of the foreach statement, the ForEach-Object cmdlet.
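This listing is missing as well; the same operation written with the pipeline and ForEach-Object might look like this (same assumed paths and naming scheme as before):

```powershell
# Pipe the files straight into ForEach-Object; $_ is the current pipeline object
Get-ChildItem -Path C:\pst -Filter *.pst | ForEach-Object {
    $newName = "{0:yyyyMMdd}_{1}" -f $_.LastWriteTime, $_.Name
    Copy-Item -Path $_.FullName -Destination (Join-Path C:\backup $newName)
}
```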
Both methods yield the same result. ForEach-Object is a bit slower, though, and uses the $_ notation that is typical of PowerShell. I prefer the first method because it feels more natural to me, but that does not mean I will not use the pipeline.
What about the pipeline?
In PowerShell you can pass an array of objects as input to another function (cmdlet). This is really cool. With DOS, VBScript and other scripting languages, there are only three interfaces: standard input, standard output and standard error, which accept a text stream as input and as output. PowerShell accepts objects as input and output. Using objects as input and executing stages within the PowerShell runtime eliminates the need to serialize data structures, or to extract them by explicitly parsing text output.
These objects expose a set of properties and methods, so you can do cool stuff with them. Most of the time, we also want to treat a collection of objects like a database table and query it. For this purpose, there are cmdlets like Where-Object, Select-Object, Sort-Object and Group-Object.
So I can create an array of file system objects like this:
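The snippet is lost here as well; a one-line sketch, assuming the Dropbox folder lives under the user profile:

```powershell
# Recursively collect FileInfo/DirectoryInfo objects for everything in Dropbox
$files = Get-ChildItem -Path "$env:USERPROFILE\Dropbox" -Recurse
```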
This returns an array of objects for all my Dropbox data. Suppose I want to filter the .docx files and copy them all to c:\temp:
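One way to write that filter-and-copy, again assuming the Dropbox path from above:

```powershell
# Keep only .docx files, then copy each one to c:\temp
Get-ChildItem -Path "$env:USERPROFILE\Dropbox" -Recurse |
    Where-Object { $_.Extension -eq '.docx' } |
    Copy-Item -Destination C:\temp
```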
Of course, you can build all kinds of smart collections, e.g. only copy the documents with a LastAccessTime before 2008 or so (remember, you can discover all properties and methods with Get-Member).
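That LastAccessTime variant is just one more Where-Object clause; a sketch under the same assumed paths:

```powershell
# Only copy files that have not been accessed since the start of 2008
Get-ChildItem -Path "$env:USERPROFILE\Dropbox" -Recurse |
    Where-Object { $_.LastAccessTime -lt (Get-Date '2008-01-01') } |
    Copy-Item -Destination C:\temp

# List the properties you can filter on
Get-ChildItem | Get-Member -MemberType Property
```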
So this article should explain the PowerShell pipeline and why it is cool. Next, we will learn about objects and how you can create custom ones for your system reports.