Talend Component Kit / TCOMP-1181

tacokit does not pass long type fields from the UI correctly


    • Type: Bug
    • Resolution: Fixed
    • Priority: Blocker
    • Fix Version/s: 1.1.3
    • None
    • None
    • All
    • Small

      I hit this issue when integrating the fileio components into the datastreams platform and running the pipeline:

      org.apache.xbean.recipe.ConstructionException: Unable to convert property value from java.lang.String to long for injection public void org.talend.components.fileio.hdfs.SimpleFileIODataSet.setFooterLine4EXCEL(long)
       at org.apache.xbean.recipe.ObjectRecipe.setProperty(ObjectRecipe.java:510)
       at org.apache.xbean.recipe.ObjectRecipe.setProperties(ObjectRecipe.java:378)
       at org.apache.xbean.recipe.ObjectRecipe.internalCreate(ObjectRecipe.java:289)
       at org.apache.xbean.recipe.AbstractRecipe.create(AbstractRecipe.java:96)
       at org.apache.xbean.recipe.AbstractRecipe.create(AbstractRecipe.java:61)
       at org.talend.sdk.component.runtime.manager.reflect.ReflectionService.createObject(ReflectionService.java:599)
       at org.talend.sdk.component.runtime.manager.reflect.ReflectionService.lambda$null$30(ReflectionService.java:347)
       at org.talend.sdk.component.runtime.manager.reflect.ReflectionService.lambda$createContextualSupplier$12(ReflectionService.java:192)
       at org.talend.sdk.component.runtime.manager.reflect.ReflectionService.lambda$createObjectFactory$31(ReflectionService.java:347)
       at org.talend.sdk.component.runtime.manager.reflect.ReflectionService.lambda$null$2(ReflectionService.java:117)
       at org.talend.sdk.component.runtime.manager.reflect.ReflectionService.lambda$null$9(ReflectionService.java:182)
       at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
       at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
       at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
       at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
       at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
       at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
       at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
       at org.talend.sdk.component.runtime.manager.reflect.ReflectionService.lambda$parameterFactory$11(ReflectionService.java:182)
       at org.talend.sdk.component.runtime.manager.ComponentManager$ComponentMetaBuilder.lambda$null$1(ComponentManager.java:1458)
       at org.talend.sdk.component.runtime.manager.ComponentManager.executeInContainer(ComponentManager.java:832)
       at org.talend.sdk.component.runtime.manager.ComponentManager.access$800(ComponentManager.java:171)
       at org.talend.sdk.component.runtime.manager.ComponentManager$ComponentMetaBuilder.lambda$onPartitionMapper$2(ComponentManager.java:1454)
       at org.talend.sdk.component.runtime.manager.ComponentManager.lambda$null$14(ComponentManager.java:696)
       at java.util.Optional.map(Optional.java:215)
       at org.talend.sdk.component.runtime.manager.ComponentManager.lambda$createComponent$15(ComponentManager.java:694)
       at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
       at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
       at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
       at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
       at java.util.stream.Streams$StreamBuilderImpl.forEachRemaining(Streams.java:419)
       at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
       at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:270)
       at java.util.concurrent.ConcurrentHashMap$ValueSpliterator.tryAdvance(ConcurrentHashMap.java:3574)
       at java.util.stream.ReferencePipeline.forEachWithCancel(ReferencePipeline.java:126)
       at java.util.stream.AbstractPipeline.copyIntoWithCancel(AbstractPipeline.java:498)
       at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:485)
       at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
       at java.util.stream.FindOps$FindOp.evaluateSequential(FindOps.java:152)
       at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
       at java.util.stream.ReferencePipeline.findFirst(ReferencePipeline.java:464)
      
      at org.talend.sdk.component.runtime.manager.ComponentManager.createComponent(ComponentManager.java:698)
       at org.talend.datastreams.beam.compiler.util.ComponentsUtil.getRuntime(ComponentsUtil.java:67)
       at org.talend.datastreams.beam.compiler.BeamCompiler.compileComponent(BeamCompiler.java:101)
       at org.talend.datastreams.beam.compiler.runtimeflow.RuntimeFlowBeamCompiler.compile(RuntimeFlowBeamCompiler.java:82)
       at org.talend.datastreams.beam.compiler.runtimeflow.RuntimeFlowBeamCompiler.compile(RuntimeFlowBeamCompiler.java:29)
       at org.talend.datastreams.streamsjob.FullRunJob$.runJob(FullRunJob.scala:79)
       at org.talend.datastreams.streamsjob.FullRunJob$.main(FullRunJob.scala:63)
       at org.talend.datastreams.streamsjob.FullRunJob.main(FullRunJob.scala)
       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
       at java.lang.reflect.Method.invoke(Method.java:498)
       at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
       at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:162)
       at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:160)
       at java.security.AccessController.doPrivileged(Native Method)
       at javax.security.auth.Subject.doAs(Subject.java:422)
       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
       at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:160)
       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      Caused by: org.apache.xbean.propertyeditor.PropertyEditorException: java.lang.NumberFormatException: For input string: "1.0"
       at org.apache.xbean.propertyeditor.LongEditor.toObjectImpl(LongEditor.java:31)
       at org.apache.xbean.propertyeditor.AbstractConverter.toObject(AbstractConverter.java:86)
       at org.apache.xbean.propertyeditor.PropertyEditorRegistry.getValue(PropertyEditorRegistry.java:212)
       at org.apache.xbean.recipe.RecipeHelper.convert(RecipeHelper.java:165)
       at org.apache.xbean.recipe.ObjectRecipe.setProperty(ObjectRecipe.java:504)
       ... 62 more
      Caused by: java.lang.NumberFormatException: For input string: "1.0"
       at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
       at java.lang.Long.parseLong(Long.java:589)
       at java.lang.Long.valueOf(Long.java:803)
       at org.apache.xbean.propertyeditor.LongEditor.toObjectImpl(LongEditor.java:29)
       ... 66 more
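
      For reference, the Caused by at the bottom is plain JDK behaviour and can be reproduced in isolation (the snippet below is only an illustration, it is not part of the component code):

      // Minimal JDK-only reproduction of the Caused by: Long.parseLong rejects a decimal string.
      public class LongParseRepro {
          public static void main(String[] args) {
              System.out.println(Long.parseLong("1"));   // prints 1
              System.out.println(Long.parseLong("1.0")); // throws NumberFormatException: For input string: "1.0"
          }
      }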
      
      It seems the UI passes the string "1.0", so the long parser fails, even though I only entered 1 in the UI field (the field is not even visible). You can reproduce it with this branch:

      wwang-talend/TDI-40706 in connectors-ee, then integrate it into datastreams.
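
      A minimal sketch of the same failure outside the platform, using xbean's ObjectRecipe directly and assuming xbean-reflect is on the classpath (FakeDataSet is a hypothetical stand-in for SimpleFileIODataSet, only the long setter matters). It also shows that normalizing the UI value to a real long before injection, for example through BigDecimal, makes the conversion pass; the actual fix in the component kit may of course be implemented differently:

      import java.math.BigDecimal;
      import org.apache.xbean.recipe.ObjectRecipe;

      public class LongInjectionRepro {

          // Hypothetical stand-in for SimpleFileIODataSet: a config bean exposing a long setter.
          public static class FakeDataSet {
              private long footerLine4EXCEL;

              public void setFooterLine4EXCEL(final long footerLine4EXCEL) {
                  this.footerLine4EXCEL = footerLine4EXCEL;
              }

              public long getFooterLine4EXCEL() {
                  return footerLine4EXCEL;
              }
          }

          public static void main(final String[] args) {
              // 1) What the UI currently sends: the string "1.0" -> same conversion failure as in the trace.
              final ObjectRecipe failing = new ObjectRecipe(FakeDataSet.class);
              failing.setProperty("footerLine4EXCEL", "1.0");
              try {
                  failing.create();
              } catch (final Exception e) {
                  System.out.println("fails as in the ticket: " + e);
              }

              // 2) Value normalized to a real long before injection -> conversion succeeds.
              //    (Illustration only, not the actual tacokit fix.)
              final long normalized = new BigDecimal("1.0").longValueExact();
              final ObjectRecipe working = new ObjectRecipe(FakeDataSet.class);
              working.setProperty("footerLine4EXCEL", normalized);
              final FakeDataSet ds = (FakeDataSet) working.create();
              System.out.println("injected value: " + ds.getFooterLine4EXCEL()); // 1
          }
      }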

            Assignee: Romain Manni-Bucau (rmannibucau)
            Reporter: Wei Wang (wwang)
            Votes: 0
            Watchers: 2
