This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git


The following commit(s) were added to refs/heads/main by this push:
     new 2e2ef6e  [SPARK-54981] Improve `DataFrame.collect` to support `Bool|Int|Float|Double` types via `[any Sendable]` cast
2e2ef6e is described below

commit 2e2ef6e8f7ec162eba3b59abb91b39e65633d347
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Jan 9 16:16:52 2026 +0900

    [SPARK-54981] Improve `DataFrame.collect` to support `Bool|Int|Float|Double` types via `[any Sendable]` cast
    
    ### What changes were proposed in this pull request?
    
    This PR aims to improve `DataFrame.collect` to support `Bool|Int|Float|Double` list element types properly via an `[any Sendable]` cast.
    
    ### Why are the changes needed?
    
    Previously, list columns were downcast to `[String]`, so arrays whose elements are `Bool`, `Int`, `Float`, or `Double` collected as `nil`. Casting to `[any Sendable]` preserves these element types.
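    
    A minimal standalone sketch (not the Spark Connect code itself) of why the cast change matters: a conditional downcast of an `Any` value holding `[Int]` to `[String]` fails and yields `nil`, while `[any Sendable]` accepts any standard element type.
    
    ```swift
    // Hypothetical illustration; `array.asAny(i)` in DataFrame.swift returns Any?.
    let value: Any = [1, 2, 3]          // e.g. a collected ARRAY<INT> column
    let asStrings = value as? [String]  // nil: element type mismatch
    let asSendable = value as? [any Sendable]  // succeeds: Int is Sendable
    ```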
    
    ### Does this PR introduce _any_ user-facing change?
    
    No behavior change because `list` type support is a new unreleased feature.
    
    ### How was this patch tested?
    
    Pass the CIs with the newly added test case.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #277 from dongjoon-hyun/SPARK-54981.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 Sources/SparkConnect/DataFrame.swift         |  2 +-
 Tests/SparkConnectTests/DataFrameTests.swift | 11 +++++++++++
 2 files changed, 12 insertions(+), 1 deletion(-)

diff --git a/Sources/SparkConnect/DataFrame.swift b/Sources/SparkConnect/DataFrame.swift
index 2483573..0ac84cc 100644
--- a/Sources/SparkConnect/DataFrame.swift
+++ b/Sources/SparkConnect/DataFrame.swift
@@ -449,7 +449,7 @@ public actor DataFrame: Sendable {
             case .complexInfo(.strct):
               values.append((array as! AsString).asString(i))
             case .complexInfo(.list):
-              values.append(array.asAny(i) as? [String])
+              values.append(array.asAny(i) as? [any Sendable])
             default:
               values.append(array.asAny(i) as? String)
             }
diff --git a/Tests/SparkConnectTests/DataFrameTests.swift b/Tests/SparkConnectTests/DataFrameTests.swift
index 256ce09..77568e4 100644
--- a/Tests/SparkConnectTests/DataFrameTests.swift
+++ b/Tests/SparkConnectTests/DataFrameTests.swift
@@ -130,6 +130,17 @@ struct DataFrameTests {
     await spark.stop()
   }
 
+  @Test
+  func array() async throws {
+    let spark = try await SparkSession.builder.getOrCreate()
+    #expect(try await spark.sql("SELECT array('a')").collect() == [Row(Array(["a"]))])
+    #expect(try await spark.sql("SELECT array(true)").collect() == [Row(Array([true]))])
+    #expect(try await spark.sql("SELECT array(1)").collect() == [Row(Array([1]))])
+    #expect(try await spark.sql("SELECT array(1.0F)").collect() == [Row(Array([Float(1.0)]))])
+    #expect(try await spark.sql("SELECT array(1.0D)").collect() == [Row(Array([1.0]))])
+    await spark.stop()
+  }
+
   @Test
   func sameSemantics() async throws {
     let spark = try await SparkSession.builder.getOrCreate()


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
