Details
- Type: Bug
- Status: Closed
- Priority: Critical
- Resolution: Fixed
- None
- All
- All
- Sprints: Sprint 14 TDP, Sprint 17 TDP, Sprint 21 TDP, Sprint 22 TDP, Sprint 23 TDP (Feb 26), Sprint 24 TDP (Mar 19), Sprint 25 TDP (Apr 12)
- Small
- Waiting on Talend Verification
Description
Steps to reproduce
- Perform a fresh install on Windows using the June 23rd Installer (20170623_1246)
- Create an S3 dataset
- Do 1 preparation step
- Try to export to S3 (CSV file, default record and field delimiters)
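After running these steps, one way to double-check whether the export actually wrote anything to the target bucket, independently of the task history, is to list the destination prefix with boto3. This is only a minimal sketch: the bucket name and prefix below are placeholders, not values taken from this report.

import boto3

s3 = boto3.client("s3")

# List the export destination (placeholder bucket/prefix); an empty listing
# after a "successful" run confirms that nothing actually reached S3.
response = s3.list_objects_v2(Bucket="my-export-bucket", Prefix="dataprep/exports/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"], obj["LastModified"])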
Current behavior
- The export fails, with only an unhelpful error message in the task history
- Logs are attached (Data Prep & TCOMP)
Expected behavior
- The export succeeds: I can export my preparation to S3
Notes
- Interestingly, I don't always get exactly the same error in the logs. I made 4 attempts (all visible in the logs):
- The first attempt, at 8:59 pm, was on a dataset containing exactly 10k rows
- The next three attempts were on a dataset containing 100k rows. For each of these, I tried different record and field delimiters and got slightly different errors in the logs.
- Exporting to S3 fails as well if you select Avro or Parquet
- Exporting to S3 fails even if the source dataset is not on S3 (I tried with a local CSV file and got the same "output type not supported" error)
- I've attached the source datasets (both were uploaded to S3 via the S3 management console)
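For reference, the same source-dataset upload can also be done programmatically rather than through the S3 management console. This is a minimal sketch only; the file, bucket, and key names are placeholders, not the actual ones used for this report.

import boto3

s3 = boto3.client("s3")
# Upload a local CSV as the source dataset (placeholder names), equivalent to
# uploading it via the S3 management console.
s3.upload_file("source_100k.csv", "my-source-bucket", "dataprep/sources/source_100k.csv")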
Issue Links
- depends on: TDI-41193 OutOfMemoryError occurs when reading a large file from S3 (Done)
- duplicates: TDP-4461 S3 fullrun fails on dev-ee/cloud with large datasets (Closed)
- is duplicated by: TDP-4115 Export to S3 displayed and logged "successful" even if the export has failed in reality (Closed)
- is parent of: TCOMP-595 Failed to locate the winutils binary in the hadoop binary path when running S3 component on Windows (Closed)
- is related to: TDI-41193 OutOfMemoryError occurs when reading a large file from S3 (Done)
- prerequisite of: TDP-3644 S3 self-service-connector - output - with encryption (Closed)
- this issue is linked by: TDP-4977 Exporting to S3 fails (at least when using the Data Prep runtime) - Sustainable fix (Closed)
1. "KMS customer master key" is always mandatory (Closed, Unassigned)
2. S3 fullrun fails on dev-ee/cloud with large datasets (Closed, Unassigned)