
Skip bad records in Redshift data load

I am trying to load data into AWS Redshift using the following command:

copy venue from 's3://mybucket/venue'
credentials 'aws_access_key_id=;aws_secret_access_key='
delimiter '\t';

but the data load is failing. When I checked the Queries section for that specific load, I noticed it failed with "Bad UTF8 hex sequence: a4 (error 3)".

Is there a way to skip bad records when loading data into Redshift?


Yes, you can use the MAXERROR parameter. This example allows up to 250 bad records to be skipped before the load fails (the skipped rows are written to the stl_load_errors system table):

copy venue from 's3://mybucket/venue'
credentials 'aws_access_key_id=;aws_secret_access_key='
delimiter '\t'
maxerror as 250;
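
After the load completes, you can check which rows were skipped and why by querying stl_load_errors. A minimal sketch using standard columns of that system table:

-- show the most recent load errors, newest first
select starttime, filename, line_number, colname, err_code, err_reason
from stl_load_errors
order by starttime desc
limit 10;

For the UTF8 error above, err_reason should report the bad hex sequence and line_number/filename will point at the offending row in the source file.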