This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new fdedec1de54 [SPARK-45252][CORE] Escape the greater/less than symbols in the comments to make `sbt doc` execute successfully
fdedec1de54 is described below
commit fdedec1de54fa7619e1d1b59eb6f15a49e727428
Author: yangjie01 <[email protected]>
AuthorDate: Thu Sep 21 16:06:37 2023 -0700
[SPARK-45252][CORE] Escape the greater/less than symbols in the comments to make `sbt doc` execute successfully
### What changes were proposed in this pull request?
This PR escapes the greater-than and less-than symbols in the comments so that `sbt doc` executes successfully.
### Why are the changes needed?
The `sbt doc` command should be able to execute successfully.
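As a hypothetical illustration (not code from this patch): in a Javadoc comment, a raw `<` outside a `{@code ...}` tag is parsed as the start of an HTML tag, which doclint rejects and which fails javadoc generation, whereas the `&lt;` entity renders as a literal less-than sign. A minimal sketch (class and method names are invented for this example):

```java
/**
 * Hypothetical example: the entity below renders as a literal "&lt;" in the
 * generated Javadoc. Writing a raw less-than sign here instead would be
 * parsed as a malformed HTML tag and break javadoc generation.
 *
 * Delegates only while remaining retries &lt; max retries.
 */
public class EscapeDemo {

  /** Builds the condition text; the source comment above uses &lt; safely. */
  static String retryCondition(int remaining, int max) {
    return "remaining retries (" + remaining + ") < max retries (" + max + ")";
  }

  public static void main(String[] args) {
    System.out.println(retryCondition(2, 3));
  }
}
```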
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
- Pass GitHub Actions
- Manual check:
run
```
build/sbt clean doc -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive -Pvolcano
```
or
```
dev/change-scala-version.sh 2.13
build/sbt clean doc -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive -Pvolcano -Pscala-2.13
```
Before
The `sbt doc` command execution failed like:
```
[info] Main Scala API documentation successful.
[error] sbt.inc.Doc$JavadocGenerationFailed
[error] at sbt.inc.Doc$.sbt$inc$Doc$$$anonfun$cachedJavadoc$1(Doc.scala:51)
[error] at sbt.inc.Doc$$anonfun$cachedJavadoc$2.run(Doc.scala:41)
[error] at sbt.inc.Doc$.sbt$inc$Doc$$$anonfun$prepare$1(Doc.scala:62)
[error] at sbt.inc.Doc$$anonfun$prepare$5.run(Doc.scala:57)
[error] at sbt.inc.Doc$.go$1(Doc.scala:73)
[error] at sbt.inc.Doc$.$anonfun$cached$5(Doc.scala:82)
[error] at sbt.inc.Doc$.$anonfun$cached$5$adapted(Doc.scala:81)
[error] at sbt.util.Tracked$.$anonfun$inputChangedW$1(Tracked.scala:220)
[error] at sbt.inc.Doc$.sbt$inc$Doc$$$anonfun$cached$1(Doc.scala:85)
[error] at sbt.inc.Doc$$anonfun$cached$7.run(Doc.scala:68)
[error] at sbt.Defaults$.$anonfun$docTaskSettings$4(Defaults.scala:2178)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:63)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:69)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:283)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:24)
[error] at sbt.Execute.work(Execute.scala:292)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:283)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:65)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:750)
[error] sbt.inc.Doc$JavadocGenerationFailed
[error] at sbt.inc.Doc$.sbt$inc$Doc$$$anonfun$cachedJavadoc$1(Doc.scala:51)
[error] at sbt.inc.Doc$$anonfun$cachedJavadoc$2.run(Doc.scala:41)
[error] at sbt.inc.Doc$.sbt$inc$Doc$$$anonfun$prepare$1(Doc.scala:62)
[error] at sbt.inc.Doc$$anonfun$prepare$5.run(Doc.scala:57)
[error] at sbt.inc.Doc$.go$1(Doc.scala:73)
[error] at sbt.inc.Doc$.$anonfun$cached$5(Doc.scala:82)
[error] at sbt.inc.Doc$.$anonfun$cached$5$adapted(Doc.scala:81)
[error] at sbt.util.Tracked$.$anonfun$inputChangedW$1(Tracked.scala:220)
[error] at sbt.inc.Doc$.sbt$inc$Doc$$$anonfun$cached$1(Doc.scala:85)
[error] at sbt.inc.Doc$$anonfun$cached$7.run(Doc.scala:68)
[error] at sbt.Defaults$.$anonfun$docTaskSettings$4(Defaults.scala:2178)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:63)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:69)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:283)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:24)
[error] at sbt.Execute.work(Execute.scala:292)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:283)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:65)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:750)
[error] (network-yarn / Compile / doc) sbt.inc.Doc$JavadocGenerationFailed
[error] (network-shuffle / Compile / doc) sbt.inc.Doc$JavadocGenerationFailed
[error] Total time: 126 s (02:06), completed 2023-9-21 20:51:43
```
After
The `sbt doc` command executed successfully.
```
[success] Total time: 127 s (02:07), completed 2023-9-21 21:13:17
```
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #43032 from LuciferYang/sbt-doc.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../src/main/java/org/apache/spark/network/shuffle/ErrorHandler.java | 2 +-
.../src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ErrorHandler.java b/common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ErrorHandler.java
index abd2348cdaf..f24de2dbfe2 100644
--- a/common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ErrorHandler.java
+++ b/common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ErrorHandler.java
@@ -29,7 +29,7 @@ import org.apache.spark.network.server.BlockPushNonFatalFailure;
  * Plugs into {@link RetryingBlockTransferor} to further control when an exception should be retried
  * and logged.
  * Note: {@link RetryingBlockTransferor} will delegate the exception to this handler only when
- * - remaining retries < max retries
+ * - remaining retries &lt; max retries
  * - exception is an IOException
  *
  * @since 3.1.0
diff --git a/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java b/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java
index b9b9568aa47..a3b823f2c71 100644
--- a/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java
+++ b/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java
@@ -97,7 +97,7 @@ import org.apache.spark.network.yarn.util.HadoopConfigProvider;
  * {@code yarn.nodemanager.aux-services.<service>.classpath}, this file must be on the classpath
  * of the NodeManager itself. When using the {@code classpath} configuration, it can be present
  * either on the NodeManager's classpath, or specified in the classpath configuration.
- * This {@code classpath} configuration is only supported on YARN versions >= 2.9.0.
+ * This {@code classpath} configuration is only supported on YARN versions &gt;= 2.9.0.
  */
 public class YarnShuffleService extends AuxiliaryService {
   private static final Logger defaultLogger = LoggerFactory.getLogger(YarnShuffleService.class);
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]