I have a big delimited flat file with 170 columns. I have loaded this flat file into SQL Server using the Import/Export Wizard.
When I load this flat file, all columns are stored as varchar. How can I convert the columns to the proper datatypes?
You can do it in T-SQL, and as someone who does ETL via SSIS, T-SQL, C#, etc., I will say this task is strongly preferred in T-SQL over SSIS! In SSIS you would have to create a data conversion transformation for all 170 columns and specify the type on the source as well as the destination.
Do you already know the datatypes you need for the columns? If so, and you already have the target table built, it gets easier still: just write a really long INSERT statement with a SELECT from your original table, and in the SELECT cast each column from varchar to the desired datatype. The conversion is the tricky part, as the INSERT will fail if a conversion fails due to bad data in a column. You can use sys.columns and sys.types to generate the cast list dynamically, so you don't have to type out all 170 columns a few times.
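For illustration, the INSERT ... SELECT pattern looks like this. Table and column names here are made up (your staging table is whatever the wizard created); only three of the 170 columns are shown:

```sql
-- Hypothetical names: StagingTable holds the imported varchar columns,
-- TypedTable is the pre-built table with the correct datatypes.
INSERT INTO dbo.TypedTable (OrderID, OrderDate, Amount)
SELECT
    CAST(s.OrderID   AS INT),
    CAST(s.OrderDate AS DATE),
    CAST(s.Amount    AS DECIMAL(10, 2))
FROM dbo.StagingTable AS s;
```

Repeat the CAST for each remaining column; the generated list from the query below can be pasted straight into the SELECT.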
This should get you started if you don't want to type out all of the datatypes yourself. It still needs work, but it's a guide.
;WITH cteColumns AS (
    SELECT
        ColumnName = c.name,
        DataType = CASE
            -- max_length = -1 means (MAX); nchar/nvarchar store 2 bytes per character
            WHEN ty.name LIKE '%char%' AND c.max_length = -1
                THEN ty.name + '(MAX)'
            WHEN ty.name IN ('nchar', 'nvarchar')
                THEN ty.name + '(' + CAST(c.max_length / 2 AS VARCHAR(10)) + ')'
            WHEN ty.name IN ('char', 'varchar')
                THEN ty.name + '(' + CAST(c.max_length AS VARCHAR(10)) + ')'
            WHEN ty.name IN ('decimal', 'numeric')
                THEN ty.name + '(' + CAST(c.precision AS VARCHAR(10)) + ',' + CAST(c.scale AS VARCHAR(10)) + ')'
            ELSE ty.name
        END
    FROM sys.columns c
    -- join on user_type_id rather than system_type_id, otherwise alias
    -- types such as sysname produce duplicate rows per column
    INNER JOIN sys.types ty
        ON c.user_type_id = ty.user_type_id
    WHERE c.object_id = OBJECT_ID('TableName')
)
SELECT ColumnName + ' = CAST(source.' + ColumnName + ' AS ' + DataType + ')'
FROM cteColumns;
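Since the INSERT dies on the first value that won't convert, it can save time to hunt down the offending rows first. One way (a sketch, using made-up column names and assuming SQL Server 2012 or later) is TRY_CAST, which returns NULL instead of raising an error:

```sql
-- Find rows where a varchar column will not convert cleanly to the target type.
-- Table and column names are illustrative.
SELECT s.OrderID, s.OrderDate
FROM dbo.StagingTable AS s
WHERE TRY_CAST(s.OrderDate AS DATE) IS NULL
  AND s.OrderDate IS NOT NULL;  -- ignore rows that were genuinely NULL
```

Run one of these per suspect column, clean up the data in the staging table, and then the big INSERT ... SELECT should go through in one shot.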