Details
- Work Item
- Status: closed
- Major
- Resolution: Done
- All
- DQ18-FR-E
- Small
Description
Aim
- A job with the Spark tDataprepRun component created in the 7.0.1 studio must still run without any modification (a scheduled job in the TAC, for instance) when new versions of Dataprep and the DQ dictionary are deployed in the Cloud.
Ideas
Current idea:
Add a new Map<String, String> attribute to DqCategoryForValidation to store additional parameters (a PII flag, for instance).
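A minimal sketch of that idea, assuming DqCategoryForValidation keeps its existing fields and simply gains a generic map; the field and accessor names here are illustrative, not the real class:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: DqCategoryForValidation with a generic map for forward-compatible
// extra parameters (e.g. a "PII" flag), so an older studio can carry keys
// it does not know about without breaking. Names are hypothetical.
class DqCategoryForValidation {
    private String id;
    private String name;
    // New attribute: additional parameters added by future server versions.
    private Map<String, String> additionalParameters = new HashMap<>();

    public Map<String, String> getAdditionalParameters() {
        return additionalParameters;
    }

    public void setAdditionalParameter(String key, String value) {
        additionalParameters.put(key, value);
    }
    // id/name getters and setters omitted for brevity
}
```

Unknown keys simply sit in the map; only code that looks for a given key (e.g. "PII") ever interprets it.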
Tests:
- create a job with the Spark version of tDataprepRun against the old (7.0Beta) version of DqCategoryForValidation
- patch a Dataprep server with the new version of DqCategoryForValidation
- confirm that the old job still runs
Another idea:
- manually deserialize the JSON payload of the DQ category in tDataprepRun and instantiate only what we need.
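A sketch of that manual-deserialization route, assuming Jackson is available on the classpath; the payload shape and field names below are illustrative:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

// Read only the fields the component needs from the DQ category payload,
// ignoring anything a newer server version may have added.
class DqCategoryReader {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    static String[] readIdAndName(String json) throws Exception {
        JsonNode root = MAPPER.readTree(json);
        // path() returns a "missing" node instead of throwing, so an
        // unexpected layout degrades gracefully rather than failing the job.
        return new String[] { root.path("id").asText(), root.path("name").asText() };
    }
}
```

A hypothetical newer payload such as `{"id":"42","name":"COUNTRY","pii":true}` deserializes the same way: the extra `pii` property is simply never read.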
Another idea:
- use @JsonIgnoreProperties(ignoreUnknown = true)
https://stackoverflow.com/questions/5455014/ignoring-new-fields-on-json-objects-using-jackson
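A sketch of the @JsonIgnoreProperties route; the class and sample payload are illustrative, but the annotation behavior is standard Jackson:

```java
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;

// With ignoreUnknown = true, Jackson silently drops JSON properties that
// have no matching field, instead of throwing UnrecognizedPropertyException.
// So a 7.0.1 job keeps deserializing payloads from a newer server.
@JsonIgnoreProperties(ignoreUnknown = true)
class DqCategory {
    public String id;
    public String name;
}

class Demo {
    public static void main(String[] args) throws Exception {
        // Hypothetical newer payload with an extra "pii" property.
        String json = "{\"id\":\"42\",\"name\":\"COUNTRY\",\"pii\":true}";
        DqCategory c = new ObjectMapper().readValue(json, DqCategory.class);
        System.out.println(c.id + " " + c.name); // "pii" is ignored
    }
}
```

Compared to the manual-deserialization idea, this keeps the existing POJO binding and only changes one annotation.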
Issue Links
- is related to TDP-5244 "Compatibility issue with Studio and Dataprep server when using tDataprepRun Spark" (closed)