mdomaradzki-cosmose commented on issue #15103:
URL: https://github.com/apache/iceberg/issues/15103#issuecomment-3876478209
Hi @RussellSpitzer, here are my plans:
Physical plan 1:
```
== Physical Plan ==
AdaptiveSparkPlan (10)
+- == Final Plan ==
   ResultQueryStage (7), Statistics(sizeInBytes=8.0 EiB)
   +- * HashAggregate (6)
      +- ShuffleQueryStage (5), Statistics(sizeInBytes=16.0 B, rowCount=1)
         +- Exchange (4)
            +- * HashAggregate (3)
               +- * Project (2)
                  +- * LocalTableScan (1)
+- == Initial Plan ==
   HashAggregate (9)
   +- Exchange (8)
      +- HashAggregate (3)
         +- Project (2)
            +- LocalTableScan (1)


(1) LocalTableScan [codegen id : 1]
Output [1]: [count(*)#50L]
Arguments: [count(*)#50L]

(2) Project [codegen id : 1]
Output [1]: [count(*)#50L AS agg_func_0#49L]
Input [1]: [count(*)#50L]

(3) HashAggregate [codegen id : 1]
Input [1]: [agg_func_0#49L]
Keys: []
Functions [1]: [partial_sum(agg_func_0#49L)]
Aggregate Attributes [1]: [sum#51L]
Results [1]: [sum#52L]

(4) Exchange
Input [1]: [sum#52L]
Arguments: SinglePartition, ENSURE_REQUIREMENTS, [plan_id=23]

(5) ShuffleQueryStage
Output [1]: [sum#52L]
Arguments: 0

(6) HashAggregate [codegen id : 2]
Input [1]: [sum#52L]
Keys: []
Functions [1]: [sum(agg_func_0#49L)]
Aggregate Attributes [1]: [sum(agg_func_0#49L)#48L]
Results [1]: [sum(agg_func_0#49L)#48L AS count#26L]

(7) ResultQueryStage
Output [1]: [count#26L]
Arguments: 1

(8) Exchange
Input [1]: [sum#52L]
Arguments: SinglePartition, ENSURE_REQUIREMENTS, [plan_id=13]

(9) HashAggregate
Input [1]: [sum#52L]
Keys: []
Functions [1]: [sum(agg_func_0#49L)]
Aggregate Attributes [1]: [sum(agg_func_0#49L)#48L]
Results [1]: [sum(agg_func_0#49L)#48L AS count#26L]

(10) AdaptiveSparkPlan
Output [1]: [count#26L]
Arguments: isFinalPlan=true
```
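This first plan is the one without the filter. For context, a minimal sketch of the kind of query that produces a plan like it (not the exact code I ran; the table name is borrowed from the second plan's scan, and the Iceberg catalog/session setup is assumed):

```scala
// Minimal sketch, assuming a SparkSession with the Iceberg catalog already configured.
// An unfiltered count over the table; "formatted" explain mode prints the numbered
// operator listing shown above.
val unfilteredCount = spark.sql("SELECT count(*) FROM my_db.my_table")
unfilteredCount.explain("formatted")
```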
Physical plan 2:
```
== Physical Plan ==
AdaptiveSparkPlan (14)
+- == Final Plan ==
   ResultQueryStage (8), Statistics(sizeInBytes=8.0 EiB)
   +- * HashAggregate (7)
      +- ShuffleQueryStage (6), Statistics(sizeInBytes=384.0 B, rowCount=24)
         +- Exchange (5)
            +- * HashAggregate (4)
               +- * Project (3)
                  +- * Filter (2)
                     +- BatchScan my_db.my_table.files (1)
+- == Initial Plan ==
   HashAggregate (13)
   +- Exchange (12)
      +- HashAggregate (11)
         +- Project (10)
            +- Filter (9)
               +- BatchScan my_db.my_table.files (1)


(1) BatchScan my_db.my_table
Output [1]: [serverTime#5]
my_db.my_table (branch=null) [filters=serverTime IS NOT NULL, serverTime > 1768435200000000, groupedBy=]

(2) ColumnarToRow [codegen id : 1]
Input [1]: [serverTime#5]

(3) Filter [codegen id : 1]
Input [1]: [serverTime#5]
Condition : (serverTime#5 > 2026-01-15 00:00:00)

(4) Project [codegen id : 1]
Output: []
Input [1]: [serverTime#5]

(5) HashAggregate [codegen id : 1]
Input: []
Keys: []
Functions [1]: [partial_count(1)]
Aggregate Attributes [1]: [count#50L]
Results [1]: [count#51L]

(6) Exchange
Input [1]: [count#51L]
Arguments: SinglePartition, ENSURE_REQUIREMENTS, [plan_id=34]

(7) ShuffleQueryStage
Output [1]: [count#51L]
Arguments: 0

(8) HashAggregate
Input [1]: [count#51L]
Keys: []
Functions [1]: [count(1)]
Aggregate Attributes [1]: [count(1)#48L]
Results [1]: [count(1)#48L AS count#26L]

(9) Filter
Input [1]: [serverTime#5]
Condition : (serverTime#5 > 2026-01-15 00:00:00)

(10) Project
Output: []
Input [1]: [serverTime#5]

(11) HashAggregate
Input: []
Keys: []
Functions [1]: [partial_count(1)]
Aggregate Attributes [1]: [count#50L]
Results [1]: [count#51L]

(12) Exchange
Input [1]: [count#51L]
Arguments: SinglePartition, ENSURE_REQUIREMENTS, [plan_id=15]

(13) HashAggregate
Input [1]: [count#51L]
Keys: []
Functions [1]: [count(1)]
Aggregate Attributes [1]: [count(1)#48L]
Results [1]: [count(1)#48L AS count#26L]

(14) AdaptiveSparkPlan
Output [1]: [count#26L]
Arguments: isFinalPlan=false
```
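This second plan is the filtered count. A matching sketch, with the predicate mirroring the `Condition` shown in the `Filter` node (again, table and column names come from the plan; everything else is assumed):

```scala
// Minimal sketch for the filtered count: the timestamp literal matches the Filter
// node's Condition (serverTime > 2026-01-15 00:00:00). Session setup is assumed.
val filteredCount = spark.sql(
  """SELECT count(*)
    |FROM my_db.my_table
    |WHERE serverTime > TIMESTAMP '2026-01-15 00:00:00'""".stripMargin)
filteredCount.explain("formatted")
```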
I'm using Spark 4.0.1 because I can't find an iceberg-spark-runtime artifact for Spark 4.1 on Maven Central (https://repo1.maven.org/maven2/org/apache/iceberg/).
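For reference, the dependency I'd expect to pair with Spark 4.0 looks like the sbt line below (the artifact name follows the `iceberg-spark-runtime-<spark-version>_<scala-version>` pattern in that Maven listing; the version shown is only illustrative):

```scala
// Assumed sbt coordinates for the Spark 4.0 runtime; swap the version for whichever
// Iceberg release is actually in use.
libraryDependencies += "org.apache.iceberg" % "iceberg-spark-runtime-4.0_2.13" % "1.10.0"
```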