I am trying to import data from a large Excel file (about 80 MB, roughly 2 million rows) into a SQL Server table using VB.NET.
I used the SqlBulkCopy class to copy the data into the SQL Server table, but it throws an OutOfMemoryException while reading the large Excel file. Is there a better approach for loading large amounts of data from Excel that avoids this issue?
I am also declaring every column as nvarchar(max). Is there a workaround so I can pick an appropriate data type for each column when creating the table dynamically?
Here is the code snippet:
Public Function BulkInsertDataTable(ByVal connectionString As String, ByVal tableName As String, ByVal table As DataTable, ByVal con1 As SqlConnection, ByVal tran As SqlTransaction) As Boolean
    ''IN USE - DO NOT REMOVE
    Dim cmd1 As SqlCommand
    Dim qry1 As String

    ' Make sure the connection is open before creating the table.
    If con1.State <> ConnectionState.Open Then
        con1.Open()
    End If

    ' Build the CREATE TABLE statement dynamically from the DataTable columns.
    qry1 = "CREATE TABLE [" & tableName & "] ("
    For j As Integer = 0 To table.Columns.Count - 1
        If j <> table.Columns.Count - 1 Then
            qry1 &= "[" & table.Columns(j).ColumnName & "] nvarchar(max),"
        Else
            qry1 &= "[" & table.Columns(j).ColumnName & "] nvarchar(max))"
        End If
    Next

    cmd1 = New SqlCommand(qry1, con1)
    cmd1.Transaction = tran
    cmd1.ExecuteNonQuery()

    ' Copy the in-memory rows into the newly created table.
    Using bulkCopy As New SqlBulkCopy(con1, SqlBulkCopyOptions.Default, tran)
        bulkCopy.DestinationTableName = "[" & tableName & "]"
        bulkCopy.BatchSize = 10000 ' commit rows in batches rather than one huge block
        bulkCopy.WriteToServer(table)
    End Using

    BulkInsertDataTable = True
End Function
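To avoid declaring every column as nvarchar(max), the CREATE TABLE statement can map each DataColumn's .NET type to a narrower SQL Server type. Here is a minimal sketch of such a mapping; the helper name and the specific type choices (e.g. decimal precision) are my own assumptions, not part of the original code:

```vb
Private Function SqlTypeFor(ByVal col As DataColumn) As String
    ' Map common .NET column types to narrower SQL Server types;
    ' anything unrecognised falls back to nvarchar(max).
    If col.DataType Is GetType(Integer) Then Return "int"
    If col.DataType Is GetType(Long) Then Return "bigint"
    If col.DataType Is GetType(Decimal) Then Return "decimal(18, 4)" ' precision assumed
    If col.DataType Is GetType(Double) Then Return "float"
    If col.DataType Is GetType(DateTime) Then Return "datetime2"
    If col.DataType Is GetType(Boolean) Then Return "bit"
    Return "nvarchar(max)"
End Function
```

Inside the column loop above, the hard-coded "nvarchar(max)" would then become SqlTypeFor(table.Columns(j)). Note this only helps if the DataTable columns were typed when the Excel data was read; if everything was read as String, the mapping will still fall back to nvarchar(max).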
For ETL (Extract, Transform & Load) programs, the amount of data being loaded may require processing in smaller chunks rather than reading everything in at once. This reduces the program's overall memory footprint, though it requires extra logic to handle the transition from one chunk to the next: the program needs a way to determine what data still remains to be loaded after completing a partial load. Done this way, the program should avoid the OutOfMemoryException that was encountered.
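One way to apply that idea here is to skip the intermediate DataTable entirely and stream the Excel rows straight into SqlBulkCopy through an OleDbDataReader, so only a small window of rows is in memory at any time. A hedged sketch, assuming the Microsoft ACE OLE DB provider is installed and the data lives on a sheet named Sheet1 (both assumptions, adjust as needed):

```vb
Imports System.Data.OleDb
Imports System.Data.SqlClient

Public Sub StreamExcelToSql(ByVal excelPath As String, ByVal sqlConnStr As String, ByVal tableName As String)
    ' The ACE provider reads .xlsx without loading the whole file into memory.
    Dim excelConnStr As String =
        "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & excelPath &
        ";Extended Properties=""Excel 12.0 Xml;HDR=YES"""

    Using srcCon As New OleDbConnection(excelConnStr)
        srcCon.Open()
        Dim cmd As New OleDbCommand("SELECT * FROM [Sheet1$]", srcCon) ' sheet name assumed
        Using reader As OleDbDataReader = cmd.ExecuteReader()
            Using bulkCopy As New SqlBulkCopy(sqlConnStr)
                bulkCopy.DestinationTableName = "[" & tableName & "]"
                bulkCopy.BatchSize = 10000       ' send rows to SQL Server in chunks
                bulkCopy.EnableStreaming = True  ' don't buffer the whole source in memory
                bulkCopy.BulkCopyTimeout = 0     ' no timeout for a long-running load
                bulkCopy.WriteToServer(reader)   ' rows flow reader -> server as they are read
            End Using
        End Using
    End Using
End Sub
```

WriteToServer accepts an IDataReader, so rows are pulled from Excel and pushed to SQL Server as a stream; the destination table must already exist with matching columns before this is called.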