Mehdi_Mhedhbi2
Contributor III

tOracleInput, tOracleOutput, Oracle database, undo tablespace


Hello, I'm using Talend Enterprise Edition to load a huge amount of data from an Oracle database into a data warehouse that is also Oracle.
In my job I use a SELECT query to read 100 million rows from a table and then load them into the data warehouse using tOracleInput and tOracleOutput.
I'm facing an error: ORA-30036: unable to extend segment in the undo tablespace. I know this is related to the Oracle undo tablespace, because the 100 million rows have to be held in undo until a commit or rollback is performed. Is there a way to solve this in Talend, for example by committing per batch, or by some other means?

1 Solution

Accepted Solutions
Anonymous
Not applicable

Hi,

 

    There are multiple courses of action you can take.

 

a) The easiest path is to increase the size of the undo tablespace, but DBAs generally won't prefer this method unless it's really required.

 

http://www.dba-oracle.com/sf_ora_30036_unable_to_extend_segment_by_string_in_undo_tablespace_string....
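For reference, here is a minimal sketch of what "increasing the undo tablespace" can look like in practice, run as a DBA user. The connection details, tablespace name and datafile paths below are placeholders, not values from this thread, and the Oracle JDBC driver (ojdbc) is assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Sketch only: run as a DBA user; tablespace name, datafile paths and
// connection details are placeholders, not values from this thread.
public class ExtendUndoTablespace {
    public static void main(String[] args) throws Exception {
        try (Connection dba = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dwh-host:1521/DWHPDB", "system", "system_pwd");
             Statement stmt = dba.createStatement()) {

            // Let the existing undo datafile grow automatically, up to a cap
            stmt.execute("ALTER DATABASE DATAFILE '/u01/oradata/DWHPDB/undotbs01.dbf' "
                       + "AUTOEXTEND ON NEXT 1G MAXSIZE 32G");

            // Or add another datafile to the undo tablespace
            stmt.execute("ALTER TABLESPACE UNDOTBS1 "
                       + "ADD DATAFILE '/u01/oradata/DWHPDB/undotbs02.dbf' SIZE 10G");
        }
    }
}
```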

 

b) You can specify the commit size and batch size in the tOracleOutput component, under Advanced settings.

[screenshot: tOracleOutput Advanced settings]

 

   Again, the flip side of this method is that the component is not meant for heavy-duty data movement; it is intended for cases where the incoming data volume is low.
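To make the behaviour concrete, here is a minimal plain-JDBC sketch of commit-per-batch loading, in the spirit of the commit/batch settings mentioned above. This is not the code Talend generates; the connection URLs, credentials, table and column names are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch: stream rows from a source Oracle DB into a target Oracle DB,
// sending them in JDBC batches and committing every N rows so a single
// transaction never has to hold all 100 million rows in undo.
public class BatchCommitLoad {
    public static void main(String[] args) throws Exception {
        final int batchSize = 10_000;   // rows per executeBatch() round trip
        final int commitEvery = 50_000; // rows per COMMIT, bounds undo usage

        try (Connection src = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//src-host:1521/SRCPDB", "src_user", "src_pwd");
             Connection tgt = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dwh-host:1521/DWHPDB", "dwh_user", "dwh_pwd")) {

            tgt.setAutoCommit(false);

            try (Statement read = src.createStatement();
                 PreparedStatement ins = tgt.prepareStatement(
                         "INSERT INTO dwh_target_table (id, label) VALUES (?, ?)")) {

                read.setFetchSize(10_000); // avoid pulling the whole result set at once
                try (ResultSet rs = read.executeQuery(
                        "SELECT id, label FROM big_source_table")) {

                    long rows = 0;
                    while (rs.next()) {
                        ins.setLong(1, rs.getLong(1));
                        ins.setString(2, rs.getString(2));
                        ins.addBatch();
                        rows++;

                        if (rows % batchSize == 0) {
                            ins.executeBatch();   // flush the JDBC batch
                        }
                        if (rows % commitEvery == 0) {
                            tgt.commit();         // release undo for committed rows
                        }
                    }
                    ins.executeBatch();           // flush the remaining rows
                    tgt.commit();
                }
            }
        }
    }
}
```

Committing every commitEvery rows bounds how much undo a single transaction can consume, which is what avoids ORA-30036 on a long-running load.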

 

c) Coming to the recommended approach: it is always ideal to use the Talend DB bulk components for heavy-duty data migration into Oracle (or into any DB, for that matter). The Oracle bulk components generate SQL*Loader (SQLLDR) scripts behind the scenes, and you can specify the bulk-loading parameters in the Advanced settings of the component.

 

[screenshot: Oracle bulk component Advanced settings]

 

Bulk data processing is a two-stage approach: a file is first created from the incoming data, and that file is then loaded into the target DB. You can do bulk processing in two forms (a small sketch of the two-stage form follows the list below).

 

a) Write the file first using the tOracleOutputBulk component and then load that file into the DB using the tOracleBulkExec component (an explicit two-stage process).

b) Combine both stages into one component using tOracleOutputBulkExec (as shown above).
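As a rough illustration of form a) outside of Talend, the sketch below writes a delimited data file plus a SQL*Loader control file and then invokes sqlldr, which is essentially what the bulk components automate. It assumes sqlldr is on the PATH; the table, columns, file names and credentials are invented for the example, and inside Talend you would simply use the components themselves.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Stage 1: write the data file and a SQL*Loader control file.
// Stage 2: run sqlldr against them. All names, paths and credentials
// are placeholders for illustration only.
public class TwoStageBulkLoad {
    public static void main(String[] args) throws IOException, InterruptedException {
        Path dataFile = Path.of("target_table.csv");
        Path ctlFile  = Path.of("target_table.ctl");

        // Stage 1: the data file (in a real job this comes from the source query)
        Files.write(dataFile, List.of(
                "1,first row",
                "2,second row"));

        // Control file telling SQL*Loader how to interpret the data file
        Files.writeString(ctlFile, String.join(System.lineSeparator(),
                "LOAD DATA",
                "INFILE 'target_table.csv'",
                "APPEND INTO TABLE dwh_target_table",
                "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'",
                "(id, label)"));

        // Stage 2: invoke SQL*Loader; direct=true requests a direct-path load
        Process sqlldr = new ProcessBuilder(
                "sqlldr",
                "userid=dwh_user/dwh_pwd@DWHPDB",
                "control=" + ctlFile,
                "log=target_table.log",
                "direct=true")
                .inheritIO()
                .start();
        System.out.println("sqlldr exit code: " + sqlldr.waitFor());
    }
}
```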

 

If you are a beginner, use the two-component option so that you can more easily pinpoint where a problem occurs if you run into any issues. Hope I answered your query. Please spare a second to mark the topic as resolved 🙂

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved


2 Replies

Mehdi_Mhedhbi2
Contributor III
Author

Thank you for responding. I am familiar with bulk loading, but my question now is: will it actually solve my problem? After all, won't the data still have to pass through the undo tablespace?