MySQL Question

Copying a table from one database to another (Sybase to MySQL) in Java

I have written code that connects to a Sybase database and a MySQL database and copies one table from Sybase to MySQL. The program works and produces the result I want, but not in an acceptable time: the Sybase table I am copying has around 10,000 rows, and the copy takes about 4 minutes.
Can you suggest any improvement that would reduce the copying time?
Following is my code:

package jdbcexmple;

import java.sql.*;

public class Jdbcexmple {

    static final String JDBC_DRIVER = "com.mysql.jdbc.Driver";
    static final String DB_URL = "jdbc:mysql://localhost:3306/alarm";
    static final String JDBC_DRIVER_SECOND = "net.sourceforge.jtds.jdbc.Driver";
    static final String DB_URL_SECOND = "jdbc:jtds:sybase://11.158.251.19:4100/fmdb";

    static final String USER = "root";
    static final String PASS = "abc";
    static final String USER_SECOND = "your";
    static final String PASS_SECOND = "xyz";

    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) {
        String a;
        String b;
        String c;
        String d;
        Connection conn = null;
        Connection conn_2 = null;
        PreparedStatement stmt = null;

        try {
            // connect to MySQL
            Class.forName(JDBC_DRIVER);
            System.out.println("connecting to database mysql");
            conn = DriverManager.getConnection(DB_URL, USER, PASS);
            System.out.println("connected to database successfully");

            // connect to Sybase via jTDS
            Class.forName(JDBC_DRIVER_SECOND);
            System.out.println("connecting to database SYBASE");
            conn_2 = DriverManager.getConnection(DB_URL_SECOND, USER_SECOND, PASS_SECOND);
            System.out.println("connected to database successfully");

            // create the target table in MySQL
            System.out.println("creating table in given database");
            String sql = "CREATE TABLE newtable (CSN VARCHAR(255), IsCleared VARCHAR(255), ID VARCHAR(255), IP VARCHAR(255), PRIMARY KEY ( ID ))";
            stmt = conn.prepareStatement(sql);
            stmt.executeUpdate();
            System.out.println("created table in database");

            // read the uncleared alarms from the Sybase table
            Statement stmt_1 = conn_2.createStatement();
            String sql_1 = "select tbl_alm_log_2000000000.Csn, tbl_alm_log_2000000000.IsCleared, tbl_alm_log_2000000000.Id From fmdb.dbo.tbl_alm_log_2000000000 Where IsCleared = 0";
            ResultSet rs = stmt_1.executeQuery(sql_1);

            // the loop below is what takes about 4 minutes:
            // a new INSERT statement is built, prepared and executed for every single row
            while (rs.next()) {
                a = rs.getString(1);
                b = rs.getString(2);
                c = rs.getString(3);
                d = rs.getString(4);
                sql = "INSERT INTO newtable values (\"" + a + "\",\"" + b + "\",\"" + c + "\",\"" + d + "\")";
                stmt = conn.prepareStatement(sql);
                stmt.executeUpdate();
                System.out.println(a + " " + b + " " + c + " " + d);
            }
        } catch (SQLException se) {
            se.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                if (stmt != null)
                    stmt.close();
            } catch (SQLException se) {
                // ignore
            }
            try {
                if (conn != null)
                    conn.close();
                if (conn_2 != null)
                    conn_2.close();
            } catch (SQLException se) {
                se.printStackTrace();
            }
        }
    }
}

Answer

Use batch execution to insert the data into MySQL instead of executing one INSERT at a time. You have already used PreparedStatement; that is fine.

There are two solutions:

Solution 1:

String sql = "INSERT INTO newtable (col1, col2, col3) VALUES (?, ?, ?)";
Connection connection = DriverManager.getConnection(DB_URL, USER, PASS); // your MySQL connection
connection.setAutoCommit(false);       // one transaction for the whole copy
PreparedStatement ps = connection.prepareStatement(sql);

final int batchSize = 1000;
int count = 0;

while (rs.next()) {                    // rs is the ResultSet read from Sybase

  ps.setString(1, rs.getString(1));
  ps.setString(2, rs.getString(2));
  ps.setString(3, rs.getString(3));
  ps.addBatch();                       // queue the row instead of executing it immediately

  if (++count % batchSize == 0) {
    ps.executeBatch();                 // send a full batch to MySQL
  }
}
ps.executeBatch();                     // insert remaining records
connection.commit();
ps.close();
connection.close();

The inserts get a further speed-up from the transaction handling (connection.setAutoCommit(false); and connection.commit();), because the whole copy is committed once instead of after every row.

http://docs.oracle.com/javase/8/docs/api/java/sql/PreparedStatement.html#addBatch--

http://docs.oracle.com/javase/8/docs/api/java/sql/Statement.html#executeBatch--

http://viralpatel.net/blogs/batch-insert-in-java-jdbc/

Solution 2:

rewriteBatchedStatements can be set on the DB_URL like this:

jdbc:mysql://localhost:3306/alarm?rewriteBatchedStatements=true

With this option the MySQL driver rewrites each batch into a single bulk INSERT, so the table is locked once and the indexes are updated once. This makes the batch insert even faster.
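
A minimal sketch of how this combines with Solution 1, assuming the same MySQL database, credentials and newtable columns as in the question; only the connection URL changes, the batch loop stays exactly as in Solution 1:

// same database and credentials as the question's DB_URL / USER / PASS, plus the URL option
String url = "jdbc:mysql://localhost:3306/alarm?rewriteBatchedStatements=true";
Connection connection = DriverManager.getConnection(url, "root", "abc");
connection.setAutoCommit(false);

PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO newtable (CSN, IsCleared, ID) VALUES (?, ?, ?)");
// ... addBatch()/executeBatch() loop from Solution 1, then connection.commit() ...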
