bajiaolong commented on issue #15637:
URL: 
https://github.com/apache/dolphinscheduler/issues/15637#issuecomment-2513366654

   > > > @bajiaolong Just FYI, I encountered the same problem and resolved it by adding an execution environment for the task and configuring `export HADOOP_USER_NAME=<your spark user>`.
   > > 
   > > 
   > > Good trick! I changed some configuration to make DS use the worker's deploy user, which avoids this problem.
   > 
   > I encountered the same problem with a FLINK task. Could you share what configuration you changed? Thank you!
   
   There is a tenant option on the task run page where you can specify a particular tenant. Please make sure that tenant's environment variables are set correctly.
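   For reference, a minimal sketch of the kind of environment configuration described above. The user name, paths, and file location are placeholders and will differ per deployment; in DolphinScheduler this can be set through an execution environment in the UI or in the worker's `dolphinscheduler_env.sh`:

   ```shell
   # Hypothetical worker environment sketch (adjust names/paths for your cluster).
   # HADOOP_USER_NAME tells the Hadoop client which user HDFS/YARN should see,
   # which is what resolves the permission mismatch discussed above.
   export HADOOP_USER_NAME=spark_user
   export HADOOP_HOME=/opt/hadoop
   export SPARK_HOME=/opt/spark
   export PATH=$HADOOP_HOME/bin:$SPARK_HOME/bin:$PATH
   ```

   The same approach applies to Flink tasks, since the Flink client also honors `HADOOP_USER_NAME` when talking to HDFS/YARN.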


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
