Talend Component Kit / TCOMP-413

JDBCInput Exception on large datasets


Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major
    • Fix Version/s: 0.18.0
    • Affects Version/s: None
    • Component/s: None
    • Labels: None
    • Environment: All
    • Estimated Complexity: Small

    Description

      Large datasets read with JDBCInput cause the following exception:

      "java.lang.RuntimeException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.util.NoSuchElementException: No schema found for Lazy0",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:91)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:20)",
      "at spark.jobserver.SparkJobBase$class.runJob(SparkJob.scala:31)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:20)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:20)",
      "at spark.jobserver.JobManagerActor$$anonfun$getJobFuture$4.apply(JobManagerActor.scala:307)",
      "at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)",
      "at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)",
      "at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)",
      "at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)",
      "at java.lang.Thread.run(Thread.java:745)",
      "Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.util.NoSuchElementException: No schema found for Lazy0",
      "at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:72)",
      "at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:112)",
      "at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:101)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:93)",
      "... 10 more",
      "Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.util.NoSuchElementException: No schema found for Lazy0",
      "at org.talend.components.adapter.beam.coders.LazyAvroCoder.getSchema(LazyAvroCoder.java:69)",
      "at org.talend.components.adapter.beam.coders.LazyAvroCoder.decode(LazyAvroCoder.java:91)",
      "at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:670)",
      "at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:603)",
      "at org.apache.beam.runners.spark.coders.CoderHelpers.fromByteArray(CoderHelpers.java:85)",
      "at org.apache.beam.runners.spark.coders.CoderHelpers$5$1.apply(CoderHelpers.java:182)",
      "at org.apache.beam.runners.spark.coders.CoderHelpers$5$1.apply(CoderHelpers.java:179)",
      "at com.google.common.collect.Iterators$8.next(Iterators.java:812)",
      "at org.apache.beam.sdk.repackaged.com.google.common.collect.Iterators$6.next(Iterators.java:649)",
      "at org.apache.beam.sdk.repackaged.com.google.common.collect.Iterators$6.next(Iterators.java:635)",
      "at org.apache.beam.runners.core.GroupAlsoByWindowsViaOutputBufferDoFn.processElement(GroupAlsoByWindowsViaOutputBufferDoFn.java:83)",
      "at org.apache.beam.runners.spark.translation.SparkProcessContext$ProcCtxtIterator.invokeProcessElement(SparkProcessContext.java:372)",
      "at org.apache.beam.runners.spark.translation.SparkProcessContext$ProcCtxtIterator.computeNext(SparkProcessContext.java:335)",
      "at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)",
      "at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)",
      "at org.apache.beam.runners.spark.translation.SparkProcessContext.callWithCtxt(SparkProcessContext.java:91)",
      "at org.apache.beam.runners.spark.translation.DoFnFunction.call(DoFnFunction.java:75)",
      "at org.apache.beam.runners.spark.translation.DoFnFunction.call(DoFnFunction.java:43)",
      "at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:159)",
      "at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:159)",
      "at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)",
      "at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)",
      "at org.apache.spark.scheduler.Task.run(Task.scala:89)",
      "at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)",
      "... 3 more",
      "Caused by: java.util.NoSuchElementException: No schema found for Lazy0",
      "... 46 more",
      "[2017-01-12 13:46:29,962] ERROR .jobserver.JobManagerActor [] [akka://JobServer/user/jobManager-2d-9259-2d7492c633aa] - Exception from job 0d23f147-1c7e-4de7-9acb-f36f44794ae8:",
      "java.lang.RuntimeException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.util.NoSuchElementException: No schema found for Lazy0",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:91)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:20)",
      "at spark.jobserver.SparkJobBase$class.runJob(SparkJob.scala:31)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:20)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:20)",
      "at spark.jobserver.JobManagerActor$$anonfun$getJobFuture$4.apply(JobManagerActor.scala:307)",
      "at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)",
      "at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)",
      "at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)",
      "at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)",
      "at java.lang.Thread.run(Thread.java:745)",
      "Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.util.NoSuchElementException: No schema found for Lazy0",
      "at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:72)",
      "at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:112)",
      "at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:101)",
      "at org.talend.datastreams.sjs.jobs.DatastreamsJob$.runJob(DatastreamsJob.scala:93)",
      "... 10 more",
      "Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.util.NoSuchElementException: No schema found for Lazy0",
      "at org.talend.components.adapter.beam.coders.LazyAvroCoder.getSchema(LazyAvroCoder.java:69)",
      "at org.talend.components.adapter.beam.coders.LazyAvroCoder.decode(LazyAvroCoder.java:91)",
      "at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:670)",
      "at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:603)",
      "at org.apache.beam.runners.spark.coders.CoderHelpers.fromByteArray(CoderHelpers.java:85)",
      "at org.apache.beam.runners.spark.coders.CoderHelpers$5$1.apply(CoderHelpers.java:182)",
      "at org.apache.beam.runners.spark.coders.CoderHelpers$5$1.apply(CoderHelpers.java:179)",
      "at com.google.common.collect.Iterators$8.next(Iterators.java:812)",
      "at org.apache.beam.sdk.repackaged.com.google.common.collect.Iterators$6.next(Iterators.java:649)",
      "at org.apache.beam.sdk.repackaged.com.google.common.collect.Iterators$6.next(Iterators.java:635)",
      "at org.apache.beam.runners.core.GroupAlsoByWindowsViaOutputBufferDoFn.processElement(GroupAlsoByWindowsViaOutputBufferDoFn.java:83)",
      "at org.apache.beam.runners.spark.translation.SparkProcessContext$ProcCtxtIterator.invokeProcessElement(SparkProcessContext.java:372)",
      "at org.apache.beam.runners.spark.translation.SparkProcessContext$ProcCtxtIterator.computeNext(SparkProcessContext.java:335)",
      "at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)",
      "at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)",
      "at org.apache.beam.runners.spark.translation.SparkProcessContext.callWithCtxt(SparkProcessContext.java:91)",
      "at org.apache.beam.runners.spark.translation.DoFnFunction.call(DoFnFunction.java:75)",
      "at org.apache.beam.runners.spark.translation.DoFnFunction.call(DoFnFunction.java:43)",
      "at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:159)",
      "at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:159)",
      "at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)",
      "at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
      "at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)",
      "at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)",
      "at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)",
      "at org.apache.spark.scheduler.Task.run(Task.scala:89)",
      "at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)",
      "... 3 more",
      "Caused by: java.util.NoSuchElementException: No schema found for Lazy0",
      "... 46 more",

          People

            Thomas Fion (tfion)
            Jonathan Lamiel (jlamiel)
            Ryan Skraba
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: