Data loading issue from SFTP | Community
New Participant
April 27, 2022
Solved

  • 1 reply
  • 1594 views

Hi,

I am trying to load a .csv file with 1.4M records from SFTP into an AEP XDM schema, but I am getting the error below:

 

ERROR:

CONNECTOR-1001-500: Error in copying the data from Source


Error code 2200: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because the connected host has failed to respond

 

Note: when I tried loading the same file with only 20-30 records, it loaded successfully.

 

Can anyone please help me resolve this?

 

Thanks

Gouri

This post is no longer active and is closed to new replies.
Best answer by jitendrakhatwani


1 reply

jitendrakhatwani · Accepted solution
New Participant
May 13, 2022

Hi Gouri, 

 

You can try breaking the file into smaller chunks, or use the File Collector.

Below is the relevant statement from Adobe's KB article:

Uploading huge files to the server using the file-upload functionality is not recommended. The best approach is to place the file on SFTP and then use the File Collector.

If the file is enormous, it will take some time to process. The only way to reduce that time is to use multiple workflows and break the file into smaller chunks. Use each workflow to process one piece; this provides some parallelism, although eventually the data is inserted into the same table/database. Still, there will be a performance improvement from the concurrent workflows.

There is no file size restriction when uploading data via SFTP (with the File Collector).
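The "break the file into smaller chunks" step can be sketched as a small script. This is an illustrative standalone sketch, not an Adobe tool; the chunk size, file-name prefix, and `split_csv` helper are my own assumptions. Each chunk repeats the header row so it remains a valid CSV for ingestion on its own.

```python
# Sketch: split a large CSV into smaller chunk files, repeating the
# header row in every chunk, so each chunk can be placed on SFTP and
# processed by its own workflow. All names here are illustrative.
import csv

def split_csv(src_path, rows_per_chunk=100_000, prefix="chunk"):
    """Write src_path out as chunk_000.csv, chunk_001.csv, ...
    and return the list of chunk file paths."""
    chunk_paths = []
    out, writer, written = None, None, 0
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header for every chunk
        for row in reader:
            if written % rows_per_chunk == 0:
                if out:
                    out.close()
                path = f"{prefix}_{len(chunk_paths):03d}.csv"
                chunk_paths.append(path)
                out = open(path, "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
            written += 1
    if out:
        out.close()
    return chunk_paths
```

For a 1.4M-record file, `split_csv("export.csv", rows_per_chunk=100_000)` would produce 14 chunk files, each of which could then be uploaded to SFTP and picked up by a separate workflow.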

 

Here is the link: https://experienceleague.adobe.com/docs/experience-cloud-kcs/kbarticles/KA-19430.html?lang=nl

 

Regards

Jitendra