user898205 - 3 months ago
PowerShell Question

Replace headers on export-csv from selectable list using powershell

Fairly new to PowerShell, and I have set myself a bit of a challenge which I think should be possible; I'm just not sure of the best way to go about it.

We have a user who receives CSV files with a large number of columns (anywhere from 20 to 50) and between 1 and 10,000 rows. The data is, say, ClientName,Address1,Address2,Postcode etc. (although the headers can vary wildly depending on the source of the data - external companies). This needs importing into a system using a pre-built routine which reads the file and expects the database column headers as the CSV headers, so say ClientDisplay,Ad_Line1,Ad_Line2,PCode etc.

I was thinking along the lines of a generic PowerShell 'mapping' form which could read the headers from ExternalSource.csv and from a DatabaseHeaders.csv (or a direct SQL lookup), display them as two columns in a form, and let you highlight one entry from each column and click a 'link' button. Once you have been through all the columns in ExternalSource.csv, a 'generate csv' button would take the mapped headers and append the correct data columns from ExternalSource.csv.

Am I barking up the wrong tree completely in trying to use PowerShell? At the moment it's a very time-consuming process, so I'm just trying to make life easier for the users.

Any advice appreciated.

Thanks,

Jon

Answer

You can use the Select-Object cmdlet with dynamic columns to shape the data into the form you need.

Something like:

Import-Csv -Path 'source.csv' |
    Select-Object Id, Name, @{ Name = 'Ad_Line1'; Expression = { $_.Address1 } } |
    Export-Csv -Path 'target.csv' -NoTypeInformation

In this example, `@{ Name='Ad_Line1'; Expression={ $_.Address1 } }` is a dynamic column (a calculated property) that creates a column named `Ad_Line1` with the value of `Address1`.
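To make the effect concrete, the same dynamic column can be demonstrated without any files by piping literal CSV text through ConvertFrom-Csv (the sample values here are made up):

```powershell
# Build a sample record in memory instead of reading source.csv
'Id,Name,Address1', '1,Acme,10 High St' |
    ConvertFrom-Csv |
    Select-Object Id, Name, @{ Name = 'Ad_Line1'; Expression = { $_.Address1 } }
# The resulting object has the properties Id, Name and Ad_Line1,
# so Export-Csv would write the header "Id","Name","Ad_Line1".
```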

It is also possible to read the column mappings from a file; you will have to write some code to read the file, select the properties, and build the calculated properties.

A very simple solution could be to read the Select-Object part from another script file, so you can differentiate that part for each import.
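As a sketch of that idea (untested; the file name SelectColumns.ps1 and the variable name $columns are illustrative): the per-import script defines only the property list, and the main script dot-sources it and hands the list to Select-Object -Property:

```powershell
# SelectColumns.ps1 -- one of these per import source; defines only the mapping
$columns = @(
    'Id'                                                          # keep as-is
    @{ Name = 'ClientDisplay'; Expression = { $_.ClientName } }   # rename
    @{ Name = 'Ad_Line1';      Expression = { $_.Address1 } }
)

# main script -- dot-source the mapping for this import, then apply it
. .\SelectColumns.ps1
Import-Csv -Path 'source.csv' |
    Select-Object -Property $columns |
    Export-Csv -Path 'target.csv' -NoTypeInformation
```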

A (simple, naive, low performant) solution could look like this (untested code):

# read the input file
# (note: $input is an automatic variable in PowerShell, so use another name)
$data = Import-Csv -Path $inputFile

# read source and target column names from the mapping file
$mappings = Import-Csv -Path $mappingFile | Select-Object Source, Target

# apply transformations
$transformed = $data
foreach ($mapping in $mappings) {
    # add an extra calculated column for each mapping
    $transformed = $transformed | Select-Object *, @{ Name = $mapping.Target; Expression = { $_.$($mapping.Source) } }
}

# export the transformed data
$transformed | Export-Csv -Path $outputFile -NoTypeInformation
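For that script, the mapping file is assumed to be a CSV with Source and Target columns, for example (values illustrative):

```
Source,Target
ClientName,ClientDisplay
Address1,Ad_Line1
Address2,Ad_Line2
Postcode,PCode
```

Note that `Select-Object *, ...` keeps the original columns alongside the new ones; if the import routine rejects extra columns, select only the Target names when exporting.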

Alternatively, it is possible to convert the data into XML with `Import-Csv | Export-CliXml`, apply an XSLT template to the XML to perform the transformation, and save the XML objects back to CSV with `Import-CliXml | Export-Csv`.

See this blog post by Scott Hanselman on how you can use XSLT with PowerShell.
