Efficient way to do batch INSERTS with JDBC
In my app I need to do a lot of INSERTs. It's a Java app and I am using plain JDBC to execute the queries; the database is Oracle. I have enabled batching, which saves me the network round trips of executing each query separately, but the queries still execute serially as separate INSERTs:
insert into some_table (col1, col2) values (val1, val2)
insert into some_table (col1, col2) values (val3, val4)
insert into some_table (col1, col2) values (val5, val6)
I was wondering if the following form of INSERT might be more efficient:
insert into some_table (col1, col2) values (val1, val2), (val3, val4), (val5, val6)
i.e. collapsing multiple INSERTs into one.
Any other tips for making batch INSERTs faster?
java sql performance jdbc
WOW! I tested your "collapse multiple inserts into one" while inserting into SQL Server, and I went from 107 rows/second to 3333 rows/second!
– Wouter
Nov 7 '16 at 16:03
That's a stunning 31x increase.
– gaurav
Aug 13 '18 at 11:06
asked Sep 24 '10 at 4:29 by Aayush Puri; edited Sep 24 '10 at 10:11 by Bozho
9 Answers
This is a mix of the two previous answers:
PreparedStatement ps = c.prepareStatement("INSERT INTO employees VALUES (?, ?)");
ps.setString(1, "John");
ps.setString(2, "Doe");
ps.addBatch();
ps.clearParameters();
ps.setString(1, "Dave");
ps.setString(2, "Smith");
ps.addBatch();
ps.clearParameters();
int[] results = ps.executeBatch();
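Building on this pattern: for very large data sets it is common to call executeBatch() every N rows instead of queuing everything, since the driver otherwise buffers the whole batch in memory. Below is a minimal sketch of just the flush-point logic, with the JDBC calls themselves left out; the class name and the batch size of 1000 are my own illustration, not part of the answer:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchChunks {
    // Returns the 1-based row indices after which executeBatch() would be
    // called when flushing every `batchSize` rows, plus a final flush for
    // any remainder.
    static List<Integer> flushPoints(int totalRows, int batchSize) {
        List<Integer> points = new ArrayList<>();
        for (int row = 1; row <= totalRows; row++) {
            if (row % batchSize == 0 || row == totalRows) {
                points.add(row);
            }
        }
        return points;
    }

    public static void main(String[] args) {
        // 2500 rows flushed every 1000 -> flushes after rows 1000, 2000, 2500
        System.out.println(flushPoints(2500, 1000));
    }
}
```

In the real loop you would call ps.executeBatch() (and possibly commit) at each of these row indices, then keep adding to the same PreparedStatement.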
This is a perfect solution, as the statement is prepared (parsed) only once.
– Ashish Patil
Sep 24 '10 at 12:06
The ps.clearParameters(); is unnecessary in this particular case.
– BalusC
Sep 24 '10 at 12:21
Be sure to measure it. Depending on the JDBC driver's implementation this might be the expected one roundtrip per batch but can also end up being one roundtrip per statement.
– stracktracer
Jun 20 '14 at 0:00
prepareStatement/setXXX - that's the way it should be!
– msciwoj
Jul 8 '16 at 12:49
Though the question asks about inserting efficiently into Oracle using JDBC, I'm currently working with DB2 (on an IBM mainframe); conceptually, insertion is similar, so it might be helpful to see my metrics comparing:
inserting one record at a time
inserting a batch of records (very efficient)
Here are the metrics:
1) Inserting one record at a time
public void writeWithCompileQuery(int records) {
PreparedStatement statement;
try {
Connection connection = getDatabaseConnection();
connection.setAutoCommit(true);
String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
" VALUES" + "(?, ?, ?, ?, ?)";
statement = connection.prepareStatement(compiledQuery);
long start = System.currentTimeMillis();
for(int index = 1; index <= records; index++) {
statement.setInt(1, index);
statement.setString(2, "emp number-"+index);
statement.setInt(3, index);
statement.setInt(4, index);
statement.setString(5, "username");
long startInternal = System.currentTimeMillis();
statement.executeUpdate();
System.out.println("each transaction time taken = " + (System.currentTimeMillis() - startInternal) + " ms");
}
long end = System.currentTimeMillis();
System.out.println("total time taken = " + (end - start) + " ms");
System.out.println("avg total time taken = " + (end - start)/ records + " ms");
statement.close();
connection.close();
} catch (SQLException ex) {
System.err.println("SQLException information");
while (ex != null) {
System.err.println("Error msg: " + ex.getMessage());
ex = ex.getNextException();
}
}
}
The metrics for 100 transactions:
each transaction time taken = 123 ms
each transaction time taken = 53 ms
each transaction time taken = 48 ms
each transaction time taken = 48 ms
each transaction time taken = 49 ms
each transaction time taken = 49 ms
...
..
.
each transaction time taken = 49 ms
each transaction time taken = 49 ms
total time taken = 4935 ms
avg total time taken = 49 ms
The first transaction takes around 120-150 ms, which covers the query parse and then the execution; the subsequent transactions take only around 50 ms each. (That is still high, but my database is on a different server; I need to troubleshoot the network.)
2) Inserting in a batch (the efficient one), achieved with preparedStatement.executeBatch()
public int writeInABatchWithCompiledQuery(int records) {
PreparedStatement preparedStatement;
try {
Connection connection = getDatabaseConnection();
connection.setAutoCommit(true);
String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
" VALUES" + "(?, ?, ?, ?, ?)";
preparedStatement = connection.prepareStatement(compiledQuery);
for(int index = 1; index <= records; index++) {
preparedStatement.setInt(1, index);
preparedStatement.setString(2, "empo number-"+index);
preparedStatement.setInt(3, index+100);
preparedStatement.setInt(4, index+200);
preparedStatement.setString(5, "usernames");
preparedStatement.addBatch();
}
long start = System.currentTimeMillis();
int[] inserted = preparedStatement.executeBatch();
long end = System.currentTimeMillis();
System.out.println("total time taken to insert the batch = " + (end - start) + " ms");
System.out.println("avg time taken per record = " + (end - start)/records + " ms");
preparedStatement.close();
connection.close();
return inserted.length;
} catch (SQLException ex) {
System.err.println("SQLException information");
while (ex != null) {
System.err.println("Error msg: " + ex.getMessage());
ex = ex.getNextException();
}
throw new RuntimeException("Error");
}
}
The metrics for a batch of 100 records:
total time taken to insert the batch = 127 ms
and for a batch of 1000 records:
total time taken to insert the batch = 341 ms
So, 100 inserts that take ~5000 ms one transaction at a time drop to ~150 ms as a single batch of 100 records.
NOTE: ignore the absolute values (my network is super slow); it is the relative difference that matters.
Hi. Does the length of the record play a role in the insert time? I have 3 VARCHAR columns with URIs as their values, and inserting 8555 rows as a batch still takes ~3.5 min!
– Prathamesh dhanawade
Oct 30 '17 at 16:47
As I understand it, record size may matter during data transfer from your application server to the database server, but it does not affect insertion time much. I tried a local Oracle database with 3 columns of 125 bytes each: a batch of 10,000 records takes around 145 to 300 ms, while 10,000 individual transactions take 20 seconds.
– prayagupd
Dec 16 '18 at 20:41
The Statement class gives you the following option:
Statement stmt = con.createStatement();
stmt.addBatch("INSERT INTO employees VALUES (1000, 'Joe Jones')");
stmt.addBatch("INSERT INTO departments VALUES (260, 'Shoe')");
stmt.addBatch("INSERT INTO emp_dept VALUES (1000, 260)");
// submit a batch of update commands for execution
int[] updateCounts = stmt.executeBatch();
While the end result is the same, with this method multiple statements are parsed, which is much slower for bulk work; in fact it is not much more efficient than executing each statement individually. Also, please use PreparedStatement whenever possible for repeated queries, as it performs much better.
– Ashish Patil
Sep 24 '10 at 12:09
@AshishPatil: do you have any benchmarks for testing with and without PreparedStatement?
– gaurav
Aug 13 '18 at 11:10
Whoa! After 8 years. Nevertheless, @prayagupd has given detailed stats in his answer which is much more recent. stackoverflow.com/a/42756134/372055
– Ashish Patil
Aug 13 '18 at 14:34
You'll have to benchmark, obviously, but over JDBC issuing multiple inserts will be much faster if you use a PreparedStatement rather than a Statement.
How about using the INSERT ALL statement?
INSERT ALL
INTO table_name VALUES ()
INTO table_name VALUES ()
...
SELECT Statement;
I remember that the trailing SELECT statement is mandatory for this to succeed, though I don't remember why.
You might consider using a PreparedStatement as well; it has lots of advantages!
Farid
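For illustration, here is a sketch of how such a statement could be assembled in Java. The helper class and its names are mine, not part of the answer; SELECT * FROM dual uses Oracle's conventional one-row dummy table, which satisfies the mandatory trailing subquery:

```java
import java.util.Collections;

public class InsertAllBuilder {
    // Builds an Oracle-style INSERT ALL statement with bind placeholders,
    // one INTO clause per row, terminated by the required source subquery.
    static String insertAll(String table, String[] cols, int rows) {
        String colList = String.join(", ", cols);
        String marks = String.join(", ", Collections.nCopies(cols.length, "?"));
        StringBuilder sb = new StringBuilder("INSERT ALL\n");
        for (int r = 0; r < rows; r++) {
            sb.append("  INTO ").append(table)
              .append(" (").append(colList).append(")")
              .append(" VALUES (").append(marks).append(")\n");
        }
        sb.append("SELECT * FROM dual"); // dual: Oracle's one-row dummy table
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(insertAll("some_table", new String[]{"col1", "col2"}, 2));
    }
}
```

The resulting string can be passed to prepareStatement and the placeholders bound row by row; note that the total number of bind parameters grows with the row count, so the statement text changes whenever the batch size does.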
You can use addBatch and executeBatch for batch inserts in Java. See this example: Batch Insert In Java
In my code I have no direct access to the PreparedStatement, so I cannot use batching; I just pass the query and a list of parameters. The trick, however, is to create a variable-length insert statement and a LinkedList of parameters. The effect is the same as the top example, but with variable-length parameter input. See below (error checking omitted).
Assuming 'myTable' has 3 updatable fields: f1, f2 and f3
String[] args = {"A","B","C", "X","Y","Z"}; // etc., input list of triplets
final String QUERY = "INSERT INTO [myTable] (f1,f2,f3) values ";
LinkedList<String> params = new LinkedList<>();
String comma = "";
StringBuilder q = new StringBuilder(QUERY);
for (int nl = 0; nl < args.length; nl += 3) { // args is a list of value triplets
    params.add(args[nl]);
    params.add(args[nl+1]);
    params.add(args[nl+2]);
    q.append(comma).append("(?,?,?)");
    comma = ",";
}
int nr = insertIntoDB(q.toString(), params);
in my DBInterface class I have:
int insertIntoDB(String query, LinkedList<String> params) throws SQLException {
preparedUPDStmt = connectionSQL.prepareStatement(query);
int n=1;
for(String x:params) {
preparedUPDStmt.setString(n++, x);
}
int updates=preparedUPDStmt.executeUpdate();
return updates;
}
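The placeholder-list construction above can be factored into a small standalone helper. This is a sketch with hypothetical names, generalizing the triplet case to any column and row count:

```java
import java.util.StringJoiner;

public class MultiRowSql {
    // Builds e.g. "INSERT INTO t (a,b) values (?,?),(?,?)" for cols=2, rows=2.
    static String multiRowInsert(String prefix, int cols, int rows) {
        StringJoiner rowJoiner = new StringJoiner(",");
        for (int r = 0; r < rows; r++) {
            StringJoiner colJoiner = new StringJoiner(",", "(", ")");
            for (int c = 0; c < cols; c++) {
                colJoiner.add("?");
            }
            rowJoiner.add(colJoiner.toString());
        }
        return prefix + rowJoiner;
    }

    public static void main(String[] args) {
        System.out.println(multiRowInsert("INSERT INTO [myTable] (f1,f2,f3) values ", 3, 2));
    }
}
```

Keeping the generation in one place makes it easy to cap the number of rows per statement, since databases limit the total number of bind parameters per query.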
Using a PreparedStatement will be MUCH slower than a Statement if you have few iterations. To gain a performance benefit from a PreparedStatement over a Statement, you need to use it in a loop with at least 50 iterations or so.
No, it won't, ever. A normal Statement (not PreparedStatement) object has to do ALL the same things a PreparedStatement does, and is in fact a wrapper around PreparedStatement that performs the prepare step as well. The difference between the two is that a Statement object silently prepares and validates the statement every time you execute it, whereas a prepared statement does that only once and can then be executed multiple times to process each item in the batch.
– David
Oct 12 '15 at 19:29
is this answer valid at all??
– prayagupd
Mar 13 '17 at 2:42
Batch insert using a plain Statement:
int a = 100;
try {
    for (int i = 0; i < 10; i++) {
        String insert = "insert into usermaster"
                + "("
                + "userid"
                + ")"
                + "values("
                + "'" + a + "'"
                + ")";
        statement.addBatch(insert);
        System.out.println(insert);
        a++;
    }
    statement.executeBatch(); // addBatch only queues; this actually sends the batch
    dbConnection.commit();
} catch (SQLException e) {
    System.out.println(" Insert Failed");
    System.out.println(e.getMessage());
} finally {
    if (statement != null) {
        statement.close();
    }
    if (dbConnection != null) {
        dbConnection.close();
    }
}
Oh, Yes. Little Bobby Tables...
– Daniel Dinnyes
Sep 2 '13 at 14:42
Dynamic statements are almost always a bad idea, both for security (although injection is not exploitable in this very simple example) and for performance.
– frroland
May 18 '15 at 12:51
9 Answers
9
active
oldest
votes
9 Answers
9
active
oldest
votes
active
oldest
votes
active
oldest
votes
This is a mix of the two previous answers:
PreparedStatement ps = c.prepareStatement("INSERT INTO employees VALUES (?, ?)");
ps.setString(1, "John");
ps.setString(2,"Doe");
ps.addBatch();
ps.clearParameters();
ps.setString(1, "Dave");
ps.setString(2,"Smith");
ps.addBatch();
ps.clearParameters();
int results = ps.executeBatch();
3
This is perfect solutions as statement is prepared (parsed) only once.
– Ashish Patil
Sep 24 '10 at 12:06
30
Theps.clearParameters();
is unnecessary in this particular case.
– BalusC
Sep 24 '10 at 12:21
Be sure to measure it. Depending on the JDBC driver's implementation this might be the expected one roundtrip per batch but can also end up being one roundtrip per statement.
– stracktracer
Jun 20 '14 at 0:00
prepareStatement/setXXX - that's the way it should be!
– msciwoj
Jul 8 '16 at 12:49
add a comment |
This is a mix of the two previous answers:
PreparedStatement ps = c.prepareStatement("INSERT INTO employees VALUES (?, ?)");
ps.setString(1, "John");
ps.setString(2,"Doe");
ps.addBatch();
ps.clearParameters();
ps.setString(1, "Dave");
ps.setString(2,"Smith");
ps.addBatch();
ps.clearParameters();
int results = ps.executeBatch();
3
This is perfect solutions as statement is prepared (parsed) only once.
– Ashish Patil
Sep 24 '10 at 12:06
30
Theps.clearParameters();
is unnecessary in this particular case.
– BalusC
Sep 24 '10 at 12:21
Be sure to measure it. Depending on the JDBC driver's implementation this might be the expected one roundtrip per batch but can also end up being one roundtrip per statement.
– stracktracer
Jun 20 '14 at 0:00
prepareStatement/setXXX - that's the way it should be!
– msciwoj
Jul 8 '16 at 12:49
add a comment |
This is a mix of the two previous answers:
PreparedStatement ps = c.prepareStatement("INSERT INTO employees VALUES (?, ?)");
ps.setString(1, "John");
ps.setString(2,"Doe");
ps.addBatch();
ps.clearParameters();
ps.setString(1, "Dave");
ps.setString(2,"Smith");
ps.addBatch();
ps.clearParameters();
int results = ps.executeBatch();
This is a mix of the two previous answers:
PreparedStatement ps = c.prepareStatement("INSERT INTO employees VALUES (?, ?)");
ps.setString(1, "John");
ps.setString(2,"Doe");
ps.addBatch();
ps.clearParameters();
ps.setString(1, "Dave");
ps.setString(2,"Smith");
ps.addBatch();
ps.clearParameters();
int results = ps.executeBatch();
edited Oct 1 '13 at 8:12
Matthias
9,75023676
9,75023676
answered Sep 24 '10 at 10:26
TuscTusc
1,227187
1,227187
3
This is perfect solutions as statement is prepared (parsed) only once.
– Ashish Patil
Sep 24 '10 at 12:06
30
Theps.clearParameters();
is unnecessary in this particular case.
– BalusC
Sep 24 '10 at 12:21
Be sure to measure it. Depending on the JDBC driver's implementation this might be the expected one roundtrip per batch but can also end up being one roundtrip per statement.
– stracktracer
Jun 20 '14 at 0:00
prepareStatement/setXXX - that's the way it should be!
– msciwoj
Jul 8 '16 at 12:49
add a comment |
3
This is perfect solutions as statement is prepared (parsed) only once.
– Ashish Patil
Sep 24 '10 at 12:06
30
Theps.clearParameters();
is unnecessary in this particular case.
– BalusC
Sep 24 '10 at 12:21
Be sure to measure it. Depending on the JDBC driver's implementation this might be the expected one roundtrip per batch but can also end up being one roundtrip per statement.
– stracktracer
Jun 20 '14 at 0:00
prepareStatement/setXXX - that's the way it should be!
– msciwoj
Jul 8 '16 at 12:49
3
3
This is perfect solutions as statement is prepared (parsed) only once.
– Ashish Patil
Sep 24 '10 at 12:06
This is perfect solutions as statement is prepared (parsed) only once.
– Ashish Patil
Sep 24 '10 at 12:06
30
30
The
ps.clearParameters();
is unnecessary in this particular case.– BalusC
Sep 24 '10 at 12:21
The
ps.clearParameters();
is unnecessary in this particular case.– BalusC
Sep 24 '10 at 12:21
Be sure to measure it. Depending on the JDBC driver's implementation this might be the expected one roundtrip per batch but can also end up being one roundtrip per statement.
– stracktracer
Jun 20 '14 at 0:00
Be sure to measure it. Depending on the JDBC driver's implementation this might be the expected one roundtrip per batch but can also end up being one roundtrip per statement.
– stracktracer
Jun 20 '14 at 0:00
prepareStatement/setXXX - that's the way it should be!
– msciwoj
Jul 8 '16 at 12:49
prepareStatement/setXXX - that's the way it should be!
– msciwoj
Jul 8 '16 at 12:49
add a comment |
Though the question asks inserting efficiently to Oracle using JDBC, I'm currently playing with DB2 (On IBM mainframe), conceptually inserting would be similar so thought it might be helpful to see my metrics between
inserting one record at a time
inserting a batch of records (very efficient)
Here go the metrics
1) Inserting one record at a time
public void writeWithCompileQuery(int records) {
PreparedStatement statement;
try {
Connection connection = getDatabaseConnection();
connection.setAutoCommit(true);
String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
" VALUES" + "(?, ?, ?, ?, ?)";
statement = connection.prepareStatement(compiledQuery);
long start = System.currentTimeMillis();
for(int index = 1; index < records; index++) {
statement.setInt(1, index);
statement.setString(2, "emp number-"+index);
statement.setInt(3, index);
statement.setInt(4, index);
statement.setString(5, "username");
long startInternal = System.currentTimeMillis();
statement.executeUpdate();
System.out.println("each transaction time taken = " + (System.currentTimeMillis() - startInternal) + " ms");
}
long end = System.currentTimeMillis();
System.out.println("total time taken = " + (end - start) + " ms");
System.out.println("avg total time taken = " + (end - start)/ records + " ms");
statement.close();
connection.close();
} catch (SQLException ex) {
System.err.println("SQLException information");
while (ex != null) {
System.err.println("Error msg: " + ex.getMessage());
ex = ex.getNextException();
}
}
}
The metrics for 100 transactions :
each transaction time taken = 123 ms
each transaction time taken = 53 ms
each transaction time taken = 48 ms
each transaction time taken = 48 ms
each transaction time taken = 49 ms
each transaction time taken = 49 ms
...
..
.
each transaction time taken = 49 ms
each transaction time taken = 49 ms
total time taken = 4935 ms
avg total time taken = 49 ms
The first transaction is taking around 120-150ms
which is for the query parse and then execution, the subsequent transactions are only taking around 50ms
. (Which is still high, but my database is on a different server(I need to troubleshoot the network))
2) With insertion in a batch (efficient one) - achieved by preparedStatement.executeBatch()
public int writeInABatchWithCompiledQuery(int records) {
PreparedStatement preparedStatement;
try {
Connection connection = getDatabaseConnection();
connection.setAutoCommit(true);
String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
" VALUES" + "(?, ?, ?, ?, ?)";
preparedStatement = connection.prepareStatement(compiledQuery);
for(int index = 1; index <= records; index++) {
preparedStatement.setInt(1, index);
preparedStatement.setString(2, "empo number-"+index);
preparedStatement.setInt(3, index+100);
preparedStatement.setInt(4, index+200);
preparedStatement.setString(5, "usernames");
preparedStatement.addBatch();
}
long start = System.currentTimeMillis();
int inserted = preparedStatement.executeBatch();
long end = System.currentTimeMillis();
System.out.println("total time taken to insert the batch = " + (end - start) + " ms");
System.out.println("total time taken = " + (end - start)/records + " s");
preparedStatement.close();
connection.close();
return inserted;
} catch (SQLException ex) {
System.err.println("SQLException information");
while (ex != null) {
System.err.println("Error msg: " + ex.getMessage());
ex = ex.getNextException();
}
throw new RuntimeException("Error");
}
}
The metrics for a batch of 100 transactions is
total time taken to insert the batch = 127 ms
and for 1000 transactions
total time taken to insert the batch = 341 ms
So, making 100 transactions in ~5000ms
(with one trxn at a time) is decreased to ~150ms
(with a batch of 100 records).
NOTE - Ignore my network which is super slow, but the metrics values would be relative.
1
Hi. Does the length of the record play a role in the time to insert ?? I have 3 Varchar columns with URIs as their values and inserting 8555 as a batch still takind ~3.5 min to insert !!
– Prathamesh dhanawade
Oct 30 '17 at 16:47
As per my understanding record size might matter during data transfer to database server from your application server but insertion time does not affect much. I tried in local oracle database with 3 columns of size 125 bytes and takes around (145 to 300) ms for batch of 10,000 records. Code here. While multiple transactions for 10,000 records takes 20seconds.
– prayagupd
Dec 16 '18 at 20:41
add a comment |
Though the question asks inserting efficiently to Oracle using JDBC, I'm currently playing with DB2 (On IBM mainframe), conceptually inserting would be similar so thought it might be helpful to see my metrics between
inserting one record at a time
inserting a batch of records (very efficient)
Here go the metrics
1) Inserting one record at a time
public void writeWithCompileQuery(int records) {
PreparedStatement statement;
try {
Connection connection = getDatabaseConnection();
connection.setAutoCommit(true);
String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
" VALUES" + "(?, ?, ?, ?, ?)";
statement = connection.prepareStatement(compiledQuery);
long start = System.currentTimeMillis();
for(int index = 1; index < records; index++) {
statement.setInt(1, index);
statement.setString(2, "emp number-"+index);
statement.setInt(3, index);
statement.setInt(4, index);
statement.setString(5, "username");
long startInternal = System.currentTimeMillis();
statement.executeUpdate();
System.out.println("each transaction time taken = " + (System.currentTimeMillis() - startInternal) + " ms");
}
long end = System.currentTimeMillis();
System.out.println("total time taken = " + (end - start) + " ms");
System.out.println("avg total time taken = " + (end - start)/ records + " ms");
statement.close();
connection.close();
} catch (SQLException ex) {
System.err.println("SQLException information");
while (ex != null) {
System.err.println("Error msg: " + ex.getMessage());
ex = ex.getNextException();
}
}
}
The metrics for 100 transactions :
each transaction time taken = 123 ms
each transaction time taken = 53 ms
each transaction time taken = 48 ms
each transaction time taken = 48 ms
each transaction time taken = 49 ms
each transaction time taken = 49 ms
...
..
.
each transaction time taken = 49 ms
each transaction time taken = 49 ms
total time taken = 4935 ms
avg total time taken = 49 ms
The first transaction is taking around 120-150ms
which is for the query parse and then execution, the subsequent transactions are only taking around 50ms
. (Which is still high, but my database is on a different server(I need to troubleshoot the network))
2) With insertion in a batch (efficient one) - achieved by preparedStatement.executeBatch()
public int writeInABatchWithCompiledQuery(int records) {
PreparedStatement preparedStatement;
try {
Connection connection = getDatabaseConnection();
connection.setAutoCommit(true);
String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
" VALUES" + "(?, ?, ?, ?, ?)";
preparedStatement = connection.prepareStatement(compiledQuery);
for(int index = 1; index <= records; index++) {
preparedStatement.setInt(1, index);
preparedStatement.setString(2, "empo number-"+index);
preparedStatement.setInt(3, index+100);
preparedStatement.setInt(4, index+200);
preparedStatement.setString(5, "usernames");
preparedStatement.addBatch();
}
long start = System.currentTimeMillis();
int inserted = preparedStatement.executeBatch();
long end = System.currentTimeMillis();
System.out.println("total time taken to insert the batch = " + (end - start) + " ms");
System.out.println("total time taken = " + (end - start)/records + " s");
preparedStatement.close();
connection.close();
return inserted;
} catch (SQLException ex) {
System.err.println("SQLException information");
while (ex != null) {
System.err.println("Error msg: " + ex.getMessage());
ex = ex.getNextException();
}
throw new RuntimeException("Error");
}
}
The metrics for a batch of 100 transactions is
total time taken to insert the batch = 127 ms
and for 1000 transactions
total time taken to insert the batch = 341 ms
So, making 100 transactions in ~5000ms
(with one trxn at a time) is decreased to ~150ms
(with a batch of 100 records).
NOTE - Ignore my network which is super slow, but the metrics values would be relative.
1
Hi. Does the length of the record play a role in the time to insert ?? I have 3 Varchar columns with URIs as their values and inserting 8555 as a batch still takind ~3.5 min to insert !!
– Prathamesh dhanawade
Oct 30 '17 at 16:47
As per my understanding record size might matter during data transfer to database server from your application server but insertion time does not affect much. I tried in local oracle database with 3 columns of size 125 bytes and takes around (145 to 300) ms for batch of 10,000 records. Code here. While multiple transactions for 10,000 records takes 20seconds.
– prayagupd
Dec 16 '18 at 20:41
add a comment |
Though the question asks inserting efficiently to Oracle using JDBC, I'm currently playing with DB2 (On IBM mainframe), conceptually inserting would be similar so thought it might be helpful to see my metrics between
inserting one record at a time
inserting a batch of records (very efficient)
Here go the metrics
1) Inserting one record at a time
public void writeWithCompileQuery(int records) {
PreparedStatement statement;
try {
Though the question asks about inserting efficiently into Oracle using JDBC, I'm currently playing with DB2 (on an IBM mainframe). Conceptually, inserting is similar, so I thought it might be helpful to show my metrics for:
inserting one record at a time
inserting a batch of records (very efficient)
Here go the metrics
1) Inserting one record at a time
public void writeWithCompileQuery(int records) {
    PreparedStatement statement;
    try {
        Connection connection = getDatabaseConnection();
        connection.setAutoCommit(true);
        String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
                " VALUES" + "(?, ?, ?, ?, ?)";
        statement = connection.prepareStatement(compiledQuery);
        long start = System.currentTimeMillis();
        for (int index = 1; index <= records; index++) { // <= so that exactly `records` rows are inserted
            statement.setInt(1, index);
            statement.setString(2, "emp number-" + index);
            statement.setInt(3, index);
            statement.setInt(4, index);
            statement.setString(5, "username");
            long startInternal = System.currentTimeMillis();
            statement.executeUpdate(); // one network round trip per row
            System.out.println("each transaction time taken = " + (System.currentTimeMillis() - startInternal) + " ms");
        }
        long end = System.currentTimeMillis();
        System.out.println("total time taken = " + (end - start) + " ms");
        System.out.println("avg total time taken = " + (end - start) / records + " ms");
        statement.close();
        connection.close();
    } catch (SQLException ex) {
        System.err.println("SQLException information");
        while (ex != null) {
            System.err.println("Error msg: " + ex.getMessage());
            ex = ex.getNextException();
        }
    }
}
The metrics for 100 transactions :
each transaction time taken = 123 ms
each transaction time taken = 53 ms
each transaction time taken = 48 ms
each transaction time taken = 48 ms
each transaction time taken = 49 ms
each transaction time taken = 49 ms
...
..
.
each transaction time taken = 49 ms
each transaction time taken = 49 ms
total time taken = 4935 ms
avg total time taken = 49 ms
The first transaction takes around 120-150 ms, which covers query parsing plus execution; the subsequent transactions take only around 50 ms each. (That is still high, but my database is on a different server and I need to troubleshoot the network.)
2) Inserting in a batch (the efficient one), achieved with preparedStatement.executeBatch()
public int writeInABatchWithCompiledQuery(int records) {
    PreparedStatement preparedStatement;
    try {
        Connection connection = getDatabaseConnection();
        connection.setAutoCommit(true);
        String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
                " VALUES" + "(?, ?, ?, ?, ?)";
        preparedStatement = connection.prepareStatement(compiledQuery);
        for (int index = 1; index <= records; index++) {
            preparedStatement.setInt(1, index);
            preparedStatement.setString(2, "emp number-" + index);
            preparedStatement.setInt(3, index + 100);
            preparedStatement.setInt(4, index + 200);
            preparedStatement.setString(5, "usernames");
            preparedStatement.addBatch(); // queue the row; nothing is sent yet
        }
        long start = System.currentTimeMillis();
        int[] updateCounts = preparedStatement.executeBatch(); // executeBatch() returns int[], one count per queued row
        long end = System.currentTimeMillis();
        System.out.println("total time taken to insert the batch = " + (end - start) + " ms");
        System.out.println("avg time taken per record = " + (end - start) / records + " ms");
        preparedStatement.close();
        connection.close();
        return updateCounts.length;
    } catch (SQLException ex) {
        System.err.println("SQLException information");
        while (ex != null) {
            System.err.println("Error msg: " + ex.getMessage());
            ex = ex.getNextException();
        }
        throw new RuntimeException("Error");
    }
}
The metrics for a batch of 100 transactions is
total time taken to insert the batch = 127 ms
and for 1000 transactions
total time taken to insert the batch = 341 ms
So, 100 inserts that take ~5000 ms one transaction at a time drop to ~150 ms when sent as a single batch of 100 records.
NOTE - Ignore my network, which is super slow; the metric values are relative anyway.
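One refinement worth adding here (my addition, not part of the original measurements): the driver buffers the whole batch in memory, so for very large inputs it is common to call executeBatch() every N rows rather than accumulating everything. The flush arithmetic can be sketched independently of JDBC; the batch size of 1000 is an arbitrary assumption:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchFlushPoints {
    // Returns the 1-based record indices at which executeBatch() would be
    // called when flushing every batchSize rows, plus a final partial flush.
    static List<Integer> flushPoints(int records, int batchSize) {
        List<Integer> points = new ArrayList<>();
        for (int index = 1; index <= records; index++) {
            if (index % batchSize == 0 || index == records) {
                points.add(index);
            }
        }
        return points;
    }

    public static void main(String[] args) {
        // 2500 records flushed every 1000 rows -> flush after 1000, 2000, 2500
        System.out.println(flushPoints(2500, 1000));
    }
}
```

Inside the insert loop above, this would mean calling `preparedStatement.executeBatch()` whenever `index` hits one of these points, instead of once at the end.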
edited Dec 16 '18 at 19:40
answered Mar 13 '17 at 3:20
prayagupd
Hi. Does the length of the record play a role in the time to insert? I have 3 varchar columns with URIs as their values, and inserting 8555 rows as a batch still takes ~3.5 min!
– Prathamesh dhanawade
Oct 30 '17 at 16:47
As per my understanding, record size might matter while transferring data from the application server to the database server, but it does not affect insertion time much. I tried a local Oracle database with 3 columns of 125 bytes each: a batch of 10,000 records takes around 145 to 300 ms, while 10,000 individual transactions take about 20 seconds. Code here.
– prayagupd
Dec 16 '18 at 20:41
The Statement class gives you the following option:
Statement stmt = con.createStatement();
stmt.addBatch("INSERT INTO employees VALUES (1000, 'Joe Jones')");
stmt.addBatch("INSERT INTO departments VALUES (260, 'Shoe')");
stmt.addBatch("INSERT INTO emp_dept VALUES (1000, 260)");
// submit a batch of update commands for execution
int updateCounts = stmt.executeBatch();
While the end result is the same, with this method multiple statements are parsed, which is much slower for bulk inserts; in fact, it is not much more efficient than executing each statement individually. Also, please use PreparedStatement whenever possible for repeated queries, as they perform much better.
– Ashish Patil
Sep 24 '10 at 12:09
@AshishPatil: do you have any benchmarks for testing with and without PreparedStatement?
– gaurav
Aug 13 '18 at 11:10
Whoa! After 8 years. Nevertheless, @prayagupd has given detailed stats in his answer which is much more recent. stackoverflow.com/a/42756134/372055
– Ashish Patil
Aug 13 '18 at 14:34
answered Sep 24 '10 at 10:10
Bozho
You'll have to benchmark, obviously, but over JDBC issuing multiple inserts will be much faster if you use a PreparedStatement rather than a Statement.
answered Sep 24 '10 at 4:33
Burleigh Bear
How about using the INSERT ALL statement?
INSERT ALL
INTO table_name VALUES ()
INTO table_name VALUES ()
...
SELECT * FROM dual;
The trailing SELECT is mandatory: INSERT ALL is Oracle's multi-table insert, which sources its rows from a subquery, and SELECT * FROM dual is the conventional one-row placeholder.
You might consider using a PreparedStatement as well; it has lots of advantages!
Farid
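To make this concrete, here is a sketch that generates such a statement with bind placeholders for an arbitrary table (the table and column names below are mine, not from the answer), so the result can be fed to a PreparedStatement:

```java
public class InsertAllBuilder {
    // Builds an Oracle INSERT ALL statement with one INTO clause per row,
    // using ? placeholders, and the conventional trailing SELECT * FROM dual.
    static String buildInsertAll(String table, String[] columns, int rows) {
        StringBuilder sql = new StringBuilder("INSERT ALL");
        String cols = String.join(", ", columns);
        String marks = "?, ".repeat(columns.length - 1) + "?"; // "?, ?, ..., ?"
        for (int i = 0; i < rows; i++) {
            sql.append("\n  INTO ").append(table)
               .append(" (").append(cols).append(") VALUES (").append(marks).append(")");
        }
        sql.append("\nSELECT * FROM dual");
        return sql.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildInsertAll("some_table", new String[]{"col1", "col2"}, 3));
    }
}
```

Note that Oracle limits how large a single statement can get, so for big inputs you would still generate the statement in chunks.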
answered Sep 24 '10 at 10:27
Farid
You can use addBatch and executeBatch for batch inserts in Java. See the example: Batch Insert In Java
answered Sep 18 '13 at 9:29
user1454294
In my code I have no direct access to the PreparedStatement, so I cannot use batching; I just pass a query and a list of parameters. The trick, however, is to create a variable-length insert statement and a LinkedList of parameters. The effect is the same as the example at the top, with variable-length parameter input. See below (error checking omitted).
Assuming 'myTable' has 3 updatable fields: f1, f2 and f3
String[] args = {"A", "B", "C", "X", "Y", "Z"}; // etc., input list of triplets
final String QUERY = "INSERT INTO [myTable] (f1,f2,f3) values ";
LinkedList<String> params = new LinkedList<>();
String comma = "";
StringBuilder q = new StringBuilder(QUERY); // a String cannot be appended to; use a StringBuilder
for (int nl = 0; nl < args.length; nl += 3) { // args is a list of value triplets
    params.add(args[nl]);
    params.add(args[nl + 1]);
    params.add(args[nl + 2]);
    q.append(comma).append("(?,?,?)");
    comma = ",";
}
int nr = insertIntoDB(q.toString(), params);
in my DBInterface class I have:
int insertIntoDB(String query, LinkedList<String> params) throws SQLException {
    preparedUPDStmt = connectionSQL.prepareStatement(query);
    int n = 1;
    for (String x : params) {
        preparedUPDStmt.setString(n++, x);
    }
    int updates = preparedUPDStmt.executeUpdate();
    return updates;
}
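The string-assembly part of this approach is easy to get wrong (stray commas, wrong placeholder counts), and it can be sanity-checked without a database. A minimal sketch of the same pattern, with illustrative names of my own:

```java
public class VariableInsertQuery {
    // Builds "INSERT INTO myTable (f1,f2,f3) values (?,?,?),(?,?,?),..."
    // with one (?,?,?) group per triplet of input values.
    static String buildQuery(int valueCount) {
        StringBuilder q = new StringBuilder("INSERT INTO myTable (f1,f2,f3) values ");
        String comma = "";
        for (int nl = 0; nl < valueCount; nl += 3) {
            q.append(comma).append("(?,?,?)");
            comma = ",";
        }
        return q.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildQuery(6)); // two triplets -> two placeholder groups
    }
}
```

A quick check like this catches placeholder-count mismatches before they surface as SQLException at runtime.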
answered Nov 14 '18 at 12:34
user3211098
Using PreparedStatements will be MUCH slower than Statements if you have few iterations. To gain a performance benefit from a PreparedStatement over a Statement, you need to use it in a loop with at least 50 iterations.
No, it won't, ever. A normal Statement (not PreparedStatement) object has to do ALL the same things a PreparedStatement does, and is in fact a wrapper around PreparedStatement that performs the prepare step as well. The difference between the two is that a Statement object silently prepares and validates the statement every time you execute it, whereas a prepared statement does that only once and can then be executed multiple times to process each item in the batch.
– David
Oct 12 '15 at 19:29
is this answer valid at all??
– prayagupd
Mar 13 '17 at 2:42
answered Sep 30 '10 at 13:32
Mickey
Batch insert using Statement
int a = 100;
try {
    for (int i = 0; i < 10; i++) {
        String insert = "insert into usermaster"
                + "("
                + "userid"
                + ")"
                + "values("
                + "'" + a + "'"
                + ")"; // no trailing semicolon inside the SQL string for JDBC
        statement.addBatch(insert);
        System.out.println(insert);
        a++;
    }
    statement.executeBatch(); // actually send the batch; addBatch() alone only queues it
    dbConnection.commit();
} catch (SQLException e) {
    System.out.println(" Insert Failed");
    System.out.println(e.getMessage());
} finally {
    if (statement != null) {
        statement.close();
    }
    if (dbConnection != null) {
        dbConnection.close();
    }
}
Oh, Yes. Little Bobby Tables...
– Daniel Dinnyes
Sep 2 '13 at 14:42
Dynamic statements are almost always a bad idea. Both for security (although it is not the case in this very simple example) and performance.
– frroland
May 18 '15 at 12:51
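To illustrate the warning in these comments with a contrived demonstration of my own: concatenating an attacker-controlled value into the SQL string changes the statement itself, which is exactly what bind parameters prevent:

```java
public class InjectionDemo {
    // Mimics the answer's string concatenation with an arbitrary "userid" value.
    static String buildInsert(String userid) {
        return "insert into usermaster(userid) values('" + userid + "');";
    }

    public static void main(String[] args) {
        String hostile = "x'); DROP TABLE usermaster; --";
        // The resulting text now smuggles in a second, destructive statement.
        System.out.println(buildInsert(hostile));
    }
}
```

With a PreparedStatement and `values(?)`, the hostile string would be stored as a plain value instead of being parsed as SQL.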
answered Jun 12 '13 at 9:46
PD Shah