[ https://issues.apache.org/jira/browse/OAK-12134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18066852#comment-18066852 ]
Julian Reschke commented on OAK-12134:
--------------------------------------
[~jsedding] - unfortunately the new tests fail on Windows, apparently because they try to delete files that are still open (the German message below translates to "The process cannot access the file because it is being used by another process"):
{noformat}
[ERROR] org.apache.jackrabbit.oak.segment.CheckpointCompactorEfficiencyTest.checkpointDeletedByConcurrentWrite(Class, CompactionStrategy, FileStore, NodeStore)[2] -- Time elapsed: 0.052 s <<< ERROR!
org.junit.platform.commons.JUnitException: Failed to close extension context
    at java.base/java.util.Optional.ifPresent(Optional.java:178)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
    at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
    at java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
    at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
    at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
    at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
    at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
    at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
Caused by: java.nio.file.FileSystemException: C:\projects\apache\oak\trunk\oak-segment-tar\target\test-tmp\o.a.j.o.s.CheckpointCompactorEfficiencyTest\checkpointDeletedByConcurrentWrite\2-class-org-apache-jackrabbit-oak-segment-CheckpointCompactor-full-compaction\data00000a.tar: Der Prozess kann nicht auf die Datei zugreifen, da sie von einem anderen Prozess verwendet wird
    at java.base/sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:92)
    at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:103)
    at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:108)
    at java.base/sun.nio.fs.WindowsFileSystemProvider.implDelete(WindowsFileSystemProvider.java:275)
    at java.base/sun.nio.fs.AbstractFileSystemProvider.deleteIfExists(AbstractFileSystemProvider.java:110)
    at java.base/java.nio.file.Files.deleteIfExists(Files.java:1191)
    at org.apache.commons.io.file.DeletingPathVisitor.visitFile(DeletingPathVisitor.java:158)
    at org.apache.commons.io.file.DeletingPathVisitor.visitFile(DeletingPathVisitor.java:37)
    at java.base/java.nio.file.Files.walkFileTree(Files.java:2811)
    at java.base/java.nio.file.Files.walkFileTree(Files.java:2882)
    at org.apache.commons.io.file.PathUtils.visitFileTree(PathUtils.java:1845)
    at org.apache.commons.io.file.PathUtils.lambda$deleteDirectory$0(PathUtils.java:583)
    at org.apache.commons.io.file.PathUtils.withPosixFileAttributes(PathUtils.java:1967)
    at org.apache.commons.io.file.PathUtils.deleteDirectory(PathUtils.java:582)
    at org.apache.commons.io.file.PathUtils.delete(PathUtils.java:533)
    at org.apache.commons.io.file.PathUtils.delete(PathUtils.java:511)
    at org.apache.jackrabbit.oak.segment.test.FileStoreParameterResolver$CloseablePath.close(FileStoreParameterResolver.java:200)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
    at java.base/java.util.stream.SortedOps$RefSortingSink.end(SortedOps.java:395)
    at java.base/java.util.stream.Sink$ChainedReference.end(Sink.java:258)
    at java.base/java.util.stream.Sink$ChainedReference.end(Sink.java:258)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:510)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
    ... 56 more
{noformat}
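The trace shows the failure happens when CloseablePath.close deletes the test's temp directory while a tar file (data00000a.tar) is still held open; on Windows an open file cannot be deleted. A minimal sketch of the usual remedy, closing whatever holds the file handles before deleting the directory, might look like this (the class and method names are illustrative, not Oak's actual test code):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

public class CloseThenDelete {

    // Close the resource that keeps files open (e.g. a FileStore holding
    // tar segments), then delete the directory tree. On Windows the close
    // must strictly precede the delete, or deletion fails with
    // "The process cannot access the file because it is being used by
    // another process".
    public static void closeThenDelete(AutoCloseable store, Path dir) throws Exception {
        store.close(); // release open file handles first
        try (Stream<Path> paths = Files.walk(dir)) {
            // reverse order deletes children before their parent directories
            for (Path p : paths.sorted(Comparator.reverseOrder()).toList()) {
                Files.deleteIfExists(p);
            }
        }
    }
}
```

On POSIX file systems the original order happens to work because an open file can be unlinked, which is why such bugs typically only surface on Windows CI.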
> compaction with concurrent writes can increase segmentstore size
> ----------------------------------------------------------------
>
> Key: OAK-12134
> URL: https://issues.apache.org/jira/browse/OAK-12134
> Project: Jackrabbit Oak
> Issue Type: Bug
> Affects Versions: 1.88.0
> Reporter: Rishabh Daim
> Assignee: Julian Sedding
> Priority: Major
> Fix For: 2.0.0
>
> Attachments: 1.78.GC#2.log, 1.78.GC#3.log, 1.88.GC#3.log
>
>
> The cleanup behavior is identical on both branches, 1.78 & 1.88: both show
> 0 bytes reclaimed in post-compaction cleanup. That was never the real
> difference.
> The actual regression is in compaction itself:
> ||Metric||1.78 GC#3||1.88 GC#3||
> |Compaction time |2.4s (3 cycles)|35.4s (6 cycles + force)|
> |Data written by compaction|~65 MB (489→554 MB)|~800 MB (511→1300 MB)|
> |Initial checkpoints|~66|~46|
> |Force compact needed|No|Yes|
> 1.88 writes 12x more data during compaction despite having fewer checkpoints.
> That's the smoking gun.
> Root cause: OAK-11895
> The CheckpointCompactor change ("onto" vs. "after") changed which paths are
> compacted per checkpoint:
> - 1.78 (old): collectSuperRootPaths returns "root" and "checkpoints/X/root"
> — only compacts the repository root and each checkpoint's content root
> - 1.88 (new): returns "" and "checkpoints/X" — compacts the entire
> super-root and the full checkpoint subtree (including metadata, not just the
> root)
> More paths traversed per checkpoint → more nodes copied → more segments
> written → more data produced.
> This has a cascading effect: more data written means compaction takes longer,
> which means more concurrent commits happen during compaction, which means
> more retry cycles, which eventually forces a blocking
> compaction.
> Summary
> The problem is not "cleanup doesn't reclaim space" — cleanup works
> identically on both branches. The problem is that 1.88 TAIL compaction
> produces ~12x more output data than 1.78 due to OAK-11895, causing the
> store to grow significantly after each GC cycle instead of shrinking. This is
> worth raising as a regression against OAK-11895 in Apache JIRA.
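The two path sets described in the quoted analysis can be sketched as follows. This is an illustrative reconstruction of the behavior attributed to collectSuperRootPaths before and after OAK-11895, not Oak's actual implementation; the class and method names are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

public class SuperRootPaths {

    // Illustrates the per-checkpoint compaction paths described above:
    // pre-OAK-11895 ("after"): only the repository root and each
    //   checkpoint's content root are compacted;
    // post-OAK-11895 ("onto"): the entire super-root ("") and each full
    //   checkpoint subtree (including metadata) are compacted.
    public static List<String> paths(List<String> checkpoints, boolean pre11895) {
        List<String> result = new ArrayList<>();
        result.add(pre11895 ? "root" : "");
        for (String cp : checkpoints) {
            result.add(pre11895 ? "checkpoints/" + cp + "/root" : "checkpoints/" + cp);
        }
        return result;
    }
}
```

With dozens of checkpoints, the post-change set traverses far more nodes per checkpoint, which matches the ~12x increase in compaction output reported in the table.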
--
This message was sent by Atlassian Jira
(v8.20.10#820010)