
Perl: many INSERT statements into a MySQL database

I currently have Perl code that looks like this:

    # indices of the CSV columns whose headers appear in %column_mapping
    my @valid = grep { defined $column_mapping{ $headers[$_] } } 0 .. $#headers;

    # build an INSERT naming only the mapped columns, one placeholder each
    my $sql = sprintf 'INSERT INTO tablename ( %s ) VALUES ( %s )',
        join( ',', map { $column_mapping{$_} } @headers[@valid] ),
        join( ',', ('?') x scalar @valid );
    my $sth = $dbh->prepare($sql);

    # read one CSV line and insert only its mapped fields
    chomp( my $line = <INPUT> );
    my @row = split /,/, $line;
    $sth->execute( @row[@valid] );

(Taken from mob's answer to a previous question.)

That dynamically builds a SQL INSERT statement from the CSV data, picking only the CSV columns whose headers appear in my column mapping.
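
For example, if the header line were id,name,email and each of those mapped to a column of the same name (hypothetical values; your %column_mapping supplies the real ones), the prepared statement would be:

    INSERT INTO tablename ( id, name, email ) VALUES ( ?, ?, ? )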

I have been looking for examples of how to do an INSERT statement with multiple rows of data at once.

My Perl script needs to run a few hundred million INSERT statements, and doing them one at a time seems really slow, especially since the server I am running it on only has 6 GB of RAM and a slowish internet connection.

Is there a way I can upload more than one row of data at a time, so that one INSERT statement uploads maybe 50 or 100 rows at once? I can't figure out how to do it with Perl DBI.

You can batch rows by generating one placeholder group per row in a single statement; note that the groups have to be comma-separated:

    my $sql_values = join( ', ', ('(?, ?, ?)') x scalar(@array) );

As mentioned before, you then just flatten every row's values into one list and pass that to execute().
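
Putting the two together, here is a rough, untested sketch of the batching approach. It reuses $dbh, %column_mapping, tablename, and the INPUT filehandle from the question ($batch_size, $columns, $group, and @queue are names I made up), and like the original it splits on bare commas, so real CSV with quoted fields would want Text::CSV:

    # Assumes $dbh, %column_mapping, and the INPUT filehandle are already
    # set up as in the question.
    my $batch_size = 100;    # rows per INSERT statement

    # Map the CSV header line to database columns, as before.
    chomp( my $header_line = <INPUT> );
    my @headers = split /,/, $header_line;
    my @valid   = grep { defined $column_mapping{ $headers[$_] } } 0 .. $#headers;

    my $columns = join ',', map { $column_mapping{$_} } @headers[@valid];
    my $group   = '(' . join( ',', ('?') x scalar @valid ) . ')';

    # One reusable statement handle for full batches of $batch_size rows.
    my $batch_sth = $dbh->prepare(
        sprintf 'INSERT INTO tablename ( %s ) VALUES %s',
            $columns, join( ', ', ($group) x $batch_size )
    );

    my @queue;    # flattened values of the rows queued so far
    while ( my $line = <INPUT> ) {
        chomp $line;
        my @row = split /,/, $line;
        push @queue, @row[@valid];
        if ( @queue == $batch_size * @valid ) {
            $batch_sth->execute(@queue);    # one round trip per 100 rows
            @queue = ();
        }
    }

    # Leftover rows get a one-off statement with a matching group count.
    if (@queue) {
        $dbh->prepare(
            sprintf 'INSERT INTO tablename ( %s ) VALUES %s',
                $columns, join( ', ', ($group) x ( @queue / @valid ) )
        )->execute(@queue);
    }

Each execute() now costs one network round trip per 100 rows instead of one per row. Wrapping the load in transactions ($dbh->begin_work ... $dbh->commit) usually helps too, and the batch size just has to keep each statement under MySQL's max_allowed_packet.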