 

How can I process the content of a CSV file as Pipeline input in Powershell cmdlet

I want to use a CSV file to feed the parameters of a PowerShell cmdlet:

Role, email, fname, lname
Admin, [email protected], John, Smith

I want to process a cmdlet as follows:

import-csv myFile| mycmdlet | export-csv myresults

I also want to be able to call the cmdlet like this

mycmdlet -role x -email [email protected] -fname John -lname Smith

and see a result as an object like:

lname: "Smith"
fname: "John"
email: "[email protected]"
role: "X"
ResultData: "something else"

I don't want to have to do this:

import-csv X.txt | foreach-object { mycmdlet -email $_.email } 

In PowerShell, I want to do something like this:

function global:Test-Pipeline {
    param(
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String[]]$role,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String[]]$email,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String[]]$fname,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String[]]$lname
    )

    Begin {
        $result = New-Object psobject
    }

    Process {
        foreach ($address in $email) {
            # Do something with each role, email, fname, lname
            # Add output to $result
        }
    }

    End {
        return $result
    }
}

I'm sure this must be possible. How do I do it? Can it be done without having to process the CSV in the cmdlet?

James Moore asked Sep 05 '25 02:09


1 Answer

Yes, you almost have it right. Your parameters should not use ValueFromPipeline, only ValueFromPipelineByPropertyName. They should be [String], not [String[]], because each "pass" of the Process block receives a single input object, and therefore a single set of parameter values.

You also don't need the End{} block here; it should all be done in Process{}.

function Test-Pipeline{
    param(  
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$role, 
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$email, 
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$fname, 
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$lname 
    ) 


    Process {
        New-Object PSObject -Property @{
            role = "~$role~"
            email = "mailto:$email"
            fname = [cultureinfo]::CurrentCulture.TextInfo.ToTitleCase($fname)
            lname = [cultureinfo]::CurrentCulture.TextInfo.ToTitleCase($lname)
        }    
    }
}

Usage:

Import-Csv myData.csv | Test-Pipeline | Export-Csv transformedData.csv -NoTypeInformation

Import-Csv myData.csv | Test-Pipeline -role "Override Role" | ConvertTo-Json

What's going on?

Import-Csv gives you one object for every row in the CSV, and each object has a property for each column in the CSV.
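To see this concretely, here's a small sketch using ConvertFrom-Csv, which parses CSV text the same way Import-Csv parses a file, so no file on disk is needed:

```powershell
# One object per data row; the header row supplies the property names.
$rows = @"
Role,email,fname,lname
Admin,[email protected],John,Smith
"@ | ConvertFrom-Csv

$rows.email   # -> [email protected]
$rows.Role    # -> Admin
```

Each row object is a PSCustomObject whose properties (Role, email, fname, lname) are exactly what pipeline-by-property-name binding matches against.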

When you pipe that into another command, including your function, each individual object gets sent through one at a time.

You could accept the whole object as a single parameter and process its properties inside the command.
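That alternative (not what the answer above uses) binds the entire input object with ValueFromPipeline; the function name here is made up for illustration:

```powershell
function Test-PipelineWholeObject {
    param(
        # Bind the entire incoming object rather than its individual properties.
        [Parameter(ValueFromPipeline=$true)]$InputObject
    )
    Process {
        # Read the CSV columns off the object manually.
        "{0} {1} <{2}>" -f $InputObject.fname, $InputObject.lname, $InputObject.email
    }
}

"Role,email,fname,lname`nAdmin,[email protected],John,Smith" |
    ConvertFrom-Csv |
    Test-PipelineWholeObject
# -> John Smith <[email protected]>
```

The trade-off is that binding per property (as in the answer) also lets callers pass -role, -email, etc. directly, while a single $InputObject parameter does not.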

What you have here, using ValueFromPipelineByPropertyName, looks at the input object (if there is one), and if it has a property whose name matches the parameter (or one of its aliases), the value of that property is bound to the parameter. An argument you pass explicitly when calling overrides the value from the input object.
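A minimal sketch of that override behavior, using a throwaway function (the name is made up for illustration):

```powershell
function Show-Role {
    param(
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$role
    )
    Process { $role }
}

$row = [pscustomobject]@{ role = 'Admin' }

$row | Show-Role                    # binds from the object's role property -> Admin
$row | Show-Role -role 'Override'   # an explicit argument wins -> Override
```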

Since you want objects again as a result, you create a new object in the Process block, which will be passed out through the pipeline to the next command.
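As a side note, on PowerShell 3 and later the [pscustomobject] accelerator is a terser equivalent of New-Object PSObject -Property, and it also preserves property order. A sketch with a hypothetical function name, emitting the extra ResultData field from the question:

```powershell
function Add-ResultData {
    param(
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$email
    )
    Process {
        # Emit one result object per input object; it flows straight
        # down the pipeline to Export-Csv, ConvertTo-Json, etc.
        [pscustomobject]@{
            email      = $email
            ResultData = 'processed'
        }
    }
}

$out = [pscustomobject]@{ email = '[email protected]' } | Add-ResultData
$out.ResultData   # -> processed
```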

briantist answered Sep 07 '25 19:09