
Qlik Talend Data Integration: How to use the Bulk Copy API for batch insert operations for tJDBCOutput in Spark Job for SQLServer

wei_guo

Last Update: Apr 17, 2024 6:24:25 AM
Updated By: Sonja_Bauernfeind
Created date: Apr 17, 2024 6:24:25 AM

To take advantage of the performance boost that the Batch Size setting provides on SQL Server, make sure you are using SQL Server JDBC driver version 9.2 or later.

Microsoft JDBC Driver for SQL Server version 9.2 and above supports using the Bulk Copy API for batch insert operations. When this feature is enabled, the driver performs Bulk Copy operations under the hood while executing batch inserts, inserting the same data as a regular batch insert but with improved performance. The driver parses the user's SQL query and uses the Bulk Copy API instead of the usual batch insert operation. Below are the ways to enable the Bulk Copy API for batch inserts, along with its limitations. This page also contains a small code sample that demonstrates its usage and the performance increase.
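One way to enable the feature is to set the connection property useBulkCopyForBatchInsert=true on the JDBC URL supplied to the component. A minimal sketch of building such a URL (the host, port, and database names below are illustrative, not from the original article):

```java
public class BulkCopyUrlDemo {

    // Builds a SQL Server JDBC URL that enables the Bulk Copy API for
    // batch inserts. Requires mssql-jdbc version 9.2 or later at runtime.
    static String buildUrl(String host, int port, String database) {
        return "jdbc:sqlserver://" + host + ":" + port
             + ";databaseName=" + database
             + ";useBulkCopyForBatchInsert=true";
    }

    public static void main(String[] args) {
        // Example URL a tJDBCOutput component could be pointed at:
        System.out.println(buildUrl("localhost", 1433, "demo"));
    }
}
```

The same property can instead be passed through a java.util.Properties object to DriverManager.getConnection; setting it on the URL is simply the most convenient option when the component only exposes a URL field.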


This feature only applies to the executeBatch() and executeLargeBatch() APIs of PreparedStatement and CallableStatement. The tJDBCOutput component's "Use Batch" option takes advantage of executeBatch().
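The addBatch()/executeBatch() pattern that "Use Batch" relies on can be sketched as follows. This is an illustrative sketch, not the component's generated code; the table and column names are hypothetical, and the insertInBatches method is only shown for shape (it needs a live connection). The pure helper below it mirrors the flush-count logic so it can be checked without a SQL Server:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertSketch {

    // Rows are queued with addBatch() and flushed with executeBatch()
    // every batchSize rows. With useBulkCopyForBatchInsert=true on the
    // connection, each executeBatch() call is executed via the Bulk Copy
    // API by the driver. Returns the number of flushes performed.
    static int insertInBatches(Connection conn, String[] names, int batchSize)
            throws SQLException {
        int flushes = 0;
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO dbo.people (name) VALUES (?)")) {
            int pending = 0;
            for (String name : names) {
                ps.setString(1, name);
                ps.addBatch();
                if (++pending == batchSize) {
                    ps.executeBatch();   // bulk-copied when the driver flag is set
                    flushes++;
                    pending = 0;
                }
            }
            if (pending > 0) {           // flush the final partial batch
                ps.executeBatch();
                flushes++;
            }
        }
        return flushes;
    }

    // Pure helper: how many executeBatch() flushes a given row count and
    // batch size produce (ceiling division).
    static int expectedFlushes(int rows, int batchSize) {
        return (rows + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        System.out.println(expectedFlushes(10000, 1000)); // 10
        System.out.println(expectedFlushes(10001, 1000)); // 11
    }
}
```

A larger Batch Size means fewer executeBatch() round trips, which is where the Bulk Copy API's performance advantage compounds.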

component mssql-jdbc.png

Job sqlSpark.png

 

Related Content

Using bulk copy API for batch insert operation

 

Environment

Talend Data Integration 
Talend Studio 
