Pranita123
Partner - Contributor III

Hadoop as target

Hello Community,

We're having trouble connecting to HDFS (a Hadoop-managed endpoint) on Linux.

The error message says: "Failed to connect to HDFS path '/'" and "Failed to curl_easy_perform: curl status = 56, Failure when receiving data from the peer (CONNECT tunnel failed, response 503)."
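The "CONNECT tunnel failed, response 503" part of the curl error usually means the request is being routed through an HTTP proxy that refuses to open the tunnel. A rough way to check this from the Linux host (the NameNode hostname and port are placeholders, assuming WebHDFS on the default port 9870):

```shell
# Check whether a proxy is configured in the environment (curl honors these)
env | grep -i _proxy

# Try the WebHDFS endpoint directly, bypassing any proxy.
# "namenode.example.com:9870" is a placeholder for your NameNode host/port.
curl -v --noproxy '*' "http://namenode.example.com:9870/webhdfs/v1/?op=LISTSTATUS"

# Repeat through the proxy to see whether the proxy is the one returning 503
curl -v "http://namenode.example.com:9870/webhdfs/v1/?op=LISTSTATUS"
```

If the direct request succeeds but the proxied one fails with 503, a proxy (or a recent change to proxy settings on the server) is the likely culprit.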

Note that we were initially able to establish a connection to the target and load data into it successfully; the error appeared later.

Your assistance in this situation is greatly valued.

 

Regards,

Pranita

 

1 Reply
john_wang
Support

Hello @Pranita123 ,

Thanks for reaching out to Qlik Community!

There are many reasons that may lead to this error, including:

  • Hive ODBC version
    - You might disable Hive ODBC access to check whether the error is caused by it
  • Timeout issues
    - Set the internal parameter executetimeout in the target endpoint and increase it to a higher value, e.g. 6000
  • Whether the account has enough permission to access the HDFS folder "/"
    - In general, the data is written to a sub-folder under "/"
  • We have also seen this error caused by a bug on the Cloudera side
    - Did you upgrade Hadoop shortly before the error appeared?
  • Set the logging level of the components below to Trace or Verbose to get more detailed information in the task log file
    - COMMUNICATION / FILE_FACTORY / TARGET_LOAD / TARGET_APPLY
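For the permission check above, a quick way to verify the account's access from the Linux host, assuming the Hadoop client is installed (`hdfs` on the PATH) and run as the same account the endpoint uses:

```shell
# List the HDFS root to confirm read access to "/"
hdfs dfs -ls /

# Create and remove a test sub-folder to confirm write access.
# The path below is a placeholder; use your actual target folder.
hdfs dfs -mkdir -p /tmp/replicate_access_test
hdfs dfs -rm -r /tmp/replicate_access_test
```

If either command fails with a permission error, fix the HDFS ACLs or point the endpoint at a sub-folder the account owns.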

Hope this helps.

John.
