chrajeshbabu opened a new issue, #14407:
URL: https://github.com/apache/pinot/issues/14407

   When S3 is enabled as the deep store and data is loaded through Spark or the standalone ingestion job, the segment push fails with the following error (a sketch of the kind of job spec involved follows the stack trace below).
   
   2024/11/07 10:43:41.262 INFO [HttpClient] [Executor task launch worker for task 0.0 in stage 1.0 (TID 31)] Sending request: http://host1:8556/v2/segments to controller: host1.visa.com, version: Unknown
   2024/11/07 10:43:41.263 WARN [SegmentPushUtils] [Executor task launch worker for task 0.0 in stage 1.0 (TID 31)] Caught temporary exception while pushing table: airlineStats segment uri: s3a://projects/data/pinot/examples/output/airlineStats/segments/2014/01/01/airlineStats_batch_2014-01-01_2014-01-01.tar.gz to http://host1:8556, will retry
   org.apache.pinot.common.exception.HttpErrorStatusException: Got error status code: 500 (Internal Server Error) with reason: "Could not find paramName tableName in path or query params of the API: http://host1:8556/v2/segments"; while sending request: http://host1:8556/v2/segments to controller: host1.visa.com, version: Unknown
        at org.apache.pinot.common.utils.http.HttpClient.wrapAndThrowHttpException(HttpClient.java:448) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.pinot.common.utils.FileUploadDownloadClient.sendSegmentUri(FileUploadDownloadClient.java:982) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.pinot.segment.local.utils.SegmentPushUtils.lambda$sendSegmentUris$1(SegmentPushUtils.java:234) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.pinot.spi.utils.retry.BaseRetryPolicy.attempt(BaseRetryPolicy.java:50) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.pinot.segment.local.utils.SegmentPushUtils.sendSegmentUris(SegmentPushUtils.java:231) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.pinot.segment.local.utils.SegmentPushUtils.sendSegmentUris(SegmentPushUtils.java:115) ~[pinot-all-1.2.0-jar-with-dependencies.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.pinot.plugin.ingestion.batch.spark3.SparkSegmentUriPushJobRunner$1.call(SparkSegmentUriPushJobRunner.java:128) ~[pinot-batch-ingestion-spark-3-1.2.0-shaded.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.pinot.plugin.ingestion.batch.spark3.SparkSegmentUriPushJobRunner$1.call(SparkSegmentUriPushJobRunner.java:118) ~[pinot-batch-ingestion-spark-3-1.2.0-shaded.jar:1.2.0-cc33ac502a02e2fe830fe21e556234ee99351a7a]
        at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreach$1(JavaRDDLike.scala:352) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreach$1$adapted(JavaRDDLike.scala:352) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.17.jar:?]
        at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.17.jar:?]
        at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.rdd.RDD.$anonfun$foreach$2(RDD.scala:1002) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.rdd.RDD.$anonfun$foreach$2$adapted(RDD.scala:1002) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2303) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.scheduler.Task.run(Task.scala:139) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$runWithUgi$3(Executor.scala:589) ~[spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1540) [spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.executor.Executor$TaskRunner.runWithUgi(Executor.scala:592) [spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:518) [spark-core_2.12-3.4.1.3.3.6.4-2.jar:3.4.1.3.3.6.4-2]
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
        at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
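
   For context, a SegmentUriPush job like this is normally driven by a batch ingestion job spec along the lines of the sketch below. This is a minimal, hypothetical reconstruction assuming the standard Pinot job spec format: the table name, output path, and controller URI are taken from the log above, while every other field (scheme mapping, retry settings, etc.) is an illustrative assumption, not the spec actually used here.

   # Hypothetical job spec sketch -- only tableName, outputDirURI, and
   # controllerURI come from the log above; the rest is assumed.
   executionFrameworkSpec:
     name: 'spark'
     segmentUriPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.spark3.SparkSegmentUriPushJobRunner'
   jobType: SegmentUriPush
   outputDirURI: 's3a://projects/data/pinot/examples/output/airlineStats/segments'
   pinotFSSpecs:
     - scheme: s3a   # assumed: s3a URIs mapped to the S3 PinotFS plugin
       className: 'org.apache.pinot.plugin.filesystem.S3PinotFS'
   tableSpec:
     tableName: 'airlineStats'
   pinotClusterSpecs:
     - controllerURI: 'http://host1:8556'
   pushJobSpec:
     pushAttempts: 2                  # assumed retry settings
     pushRetryIntervalMillis: 1000

   Note that tableName is present in tableSpec, so the 500 above suggests the push client is not forwarding it as a query parameter on the /v2/segments request, which is what the controller-side check complains about.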
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@pinot.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

