This is an automated email from the ASF dual-hosted git repository.
liyuanjian pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark-connect-rust.git
from 84db605 Update Project Description
add dcd08fb Initial commit
add 476dec9 initial commit
add 62ac79e github action
add d317981 fix: github actions
add a5b36b3 fix: github actions
add ba0fcc7 fix: github actions
add b342f15 fix: toml, readme, github action
add c98c2eb fix: toml & readme
add d36b3d1 fix: update readme
add d379b09 fix: update readme
add 85f4671 feat(dataframe): additional methods & updated docs
add f9b93d7 update: docker-compose & release
add e66de54 bump version
add 05f19ae feat: explain as println
add 5b1b7f3 chore
add faf2d8a feat: restructure existing modules, and add Column, Expression, and Functions modules (#1)
add ab932de feat: marco refactor and lots of functions (#3)
add 778f7c0 bump version
add d82454a Add Catalog & more DataFrame traits (#4)
add 635c321 feat(errors): some better error handling - create custom error enum - update session traits to leverage new error method - update .sql command to be a Result
add 8441e2f update readme
add 13e463b feat(session): make sensitive fields private - should not have access to token or user_id on the spark session
add 5c85ffb feat(dataframe): implement core traits (#6)
add bc0e0d0 bump(spark): move to tag v3.5.1
add fbd54c6 feat(tls): enable tls authentication - update client for better metadata headers - add feature flag for tls to enable connection to Databricks - update sparksession builder to accetp &str and not String for connection string
add 6f0fc4f bug(example): change connection string
add 2b8995f feat: better generic args (#8)
add bf6d9f3 bump: cargo version & deps
add 32c11be feat(client/handler): reimplement client & response handlers (#9)
add c039840 Merge branch 'main' of https://github.com/sjrusso8/spark-connect-rs
add e5f9411 bug(client): reset response state
add 6650707 feat(updates): streaming & aggregations (#10)
add 89c5d85 bug(column): incorrect logical not (#11)
add 67d7887 feat: implement pivot & unpivot (#12)
add 74f14f1 chore: update readme & examples (#15)
add ca5882c feat(window): create Window & WindowSpec (#19)
add 21c827a bump: dependencies & docker (#21)
add 418e926 feat(catalog): add additional methods (#22)
add 073e1fa feat: client improvements & flaky tests (#23)
add 587e09a chore: reorganize code into a workspace (#24)
add 7f110d1 refactor: wrap all spark sessions in box (#27)
add 3adabe6 fix: publish & bump versions (#28)
add 6233fe0 fix: add include to cargo.toml (#29)
add e38f4e2 update cargo.toml
add 707a176 feat(types) & bug(plan): add new datatypes & fix plan id bug (#31)
add 7fa53b0 feat: initial implementation of DataframeWriterV2 (#30)
add eb12b83 feat(col): dropFields & withField (#32)
add 7238447 feat(dataframe): implement transform (#34)
add 9924e61 feat(examples): update examples (#35)
add 93f8fc9 bug(tls): incorrect scheme for tls (#39)
add 32c8f3c Update README (#37)
add 3a28504 feat(data types): implement data types (#41)
add c9cb321 feat(dataframe): Spark DF to Polars, DataFusion, or JSON (#43)
add 87cb35d feat(examples): update with local datasets (#44)
add cf25ac9 feat(session): add RunTimeConfig & Session tags (#45)
add 1db643c update README
add 8988aba bug(deadlock): change to async rwlock (#48)
add 395e593 bug(conf): runtimeconfig on wasm feature (#49)
add ca9df65 bug(dep): fix tokio dependeny on wasm feature (#50)
add 8f204f4 refactor(session): use ref for session methods & error messages (#51)
add 5ea47e5 refactor: remove snake case (#52)
add 42c88e4 feat(sq): implement missing methods (#53)
add 1284830 bump version
add 676e893 feat(submodule): remove git submodule (#66)
add 386a430 feat(dataframe): implement missing methods (#67)
add b25c938 fix(cicd): issue with docker compose (#69)
add 7ea744b tests(column): enhance column function code coverage (#68)
add af53c29 fix(cicd): resolve issue with broken release (#70)
add ab85ae6 chore: update readme & bump dependencies (#73)
add 2b138c6 refactor: consolidate traits (#74)
add 84f170a feat(readwriter): Implement File Format Reader/Writer (#72)
add 1a58cd2 feat(functions): implement more functions #75
add 70c3255 revert(tonic): move tonic back to 0.11 (#76)
add 11e810a feat(reattach): Add reattach & release logic (#78)
add b2e4207 feat(errors): cleaner handling of tonic status errors (#79)
add 8e622e8 feat(sqm): add streaming query manager (#80)
add 5b3dce0 grunt: remove rust/ folder and change core/ to crates/ (#81)
add cfbfa56 feat: Implement createTable and createExternalTable in Catalog (#55) (#82)
add 4b7c697 docs: update rustdocs (#83)
add bb46632 chore: bump crate version (#84)
add 188fb52 feat!(wasm): remove feature flag (#85)
add 7e55ffb bump(docker): update docker and fix test (#87)
add 57dcd2d bug(reattach): connection is lost and session id is empty (#88)
add 6eb4de4 feat(funcs): add more functions (#89)
add 2a27f05 feat(config): add ability to control builder configurations (#91)
add 0945fb6 bump: release (#92)
add 251ebce code clean up & dep bump (#94)
add bb520be feat(license) add apache license to files (#96)
add 379e775 merge repo
new 257df1c Merge pull request #1 from sjrusso8/source
The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
Summary of changes:
.github/pull_request_template.md | 35 +
.github/workflows/build.yml | 115 +
.github/workflows/release.yml | 64 +
.gitignore | 24 +
.pre-commit-config.yaml | 35 +
Cargo.lock | 3843 ++++++++++++++++++++
Cargo.toml | 62 +
README.md | 967 ++++-
crates/connect/Cargo.toml | 98 +
crates/connect/build.rs | 38 +
crates/connect/protobuf/spark-3.5/buf.yaml | 25 +
.../protobuf/spark-3.5/spark/connect/base.proto | 816 +++++
.../protobuf/spark-3.5/spark/connect/catalog.proto | 243 ++
.../spark-3.5/spark/connect/commands.proto | 416 +++
.../protobuf/spark-3.5/spark/connect/common.proto | 48 +
.../spark-3.5/spark/connect/expressions.proto | 382 ++
.../spark-3.5/spark/connect/relations.proto | 1003 +++++
.../protobuf/spark-3.5/spark/connect/types.proto | 195 +
crates/connect/src/catalog.rs | 972 +++++
crates/connect/src/client/builder.rs | 273 ++
crates/connect/src/client/config.rs | 120 +
crates/connect/src/client/middleware.rs | 108 +
crates/connect/src/client/mod.rs | 639 ++++
crates/connect/src/column.rs | 513 +++
crates/connect/src/conf.rs | 149 +
crates/connect/src/dataframe.rs | 2784 ++++++++++++++
crates/connect/src/errors.rs | 194 +
crates/connect/src/expressions.rs | 268 ++
crates/connect/src/functions/mod.rs | 2549 +++++++++++++
crates/connect/src/group.rs | 238 ++
crates/connect/src/lib.rs | 153 +
crates/connect/src/plan.rs | 872 +++++
crates/connect/src/readwriter.rs | 1312 +++++++
crates/connect/src/session.rs | 477 +++
crates/connect/src/storage.rs | 135 +
crates/connect/src/streaming/mod.rs | 1028 ++++++
crates/connect/src/types.rs | 717 ++++
crates/connect/src/window.rs | 443 +++
datasets/dir1/dir2/file2.parquet | Bin 0 -> 520 bytes
datasets/dir1/file1.parquet | Bin 0 -> 520 bytes
datasets/dir1/file3.json | 1 +
datasets/employees.json | 4 +
datasets/full_user.avsc | 1 +
datasets/kv1.txt | 500 +++
datasets/people.csv | 3 +
datasets/people.json | 3 +
datasets/people.txt | 3 +
datasets/user.avsc | 8 +
datasets/users.avro | Bin 0 -> 334 bytes
datasets/users.orc | Bin 0 -> 547 bytes
datasets/users.parquet | Bin 0 -> 615 bytes
docker-compose.yml | 33 +
examples/Cargo.toml | 56 +
examples/README.md | 92 +
examples/src/databricks.rs | 80 +
examples/src/deltalake.rs | 121 +
examples/src/reader.rs | 65 +
examples/src/readstream.rs | 57 +
examples/src/sql.rs | 57 +
examples/src/writer.rs | 77 +
pre-commit.sh | 86 +
61 files changed, 23599 insertions(+), 1 deletion(-)
create mode 100644 .github/pull_request_template.md
create mode 100644 .github/workflows/build.yml
create mode 100644 .github/workflows/release.yml
create mode 100644 .gitignore
create mode 100644 .pre-commit-config.yaml
create mode 100644 Cargo.lock
create mode 100644 Cargo.toml
create mode 100644 crates/connect/Cargo.toml
create mode 100644 crates/connect/build.rs
create mode 100644 crates/connect/protobuf/spark-3.5/buf.yaml
create mode 100644 crates/connect/protobuf/spark-3.5/spark/connect/base.proto
create mode 100644 crates/connect/protobuf/spark-3.5/spark/connect/catalog.proto
create mode 100644 crates/connect/protobuf/spark-3.5/spark/connect/commands.proto
create mode 100644 crates/connect/protobuf/spark-3.5/spark/connect/common.proto
create mode 100644 crates/connect/protobuf/spark-3.5/spark/connect/expressions.proto
create mode 100644 crates/connect/protobuf/spark-3.5/spark/connect/relations.proto
create mode 100644 crates/connect/protobuf/spark-3.5/spark/connect/types.proto
create mode 100644 crates/connect/src/catalog.rs
create mode 100644 crates/connect/src/client/builder.rs
create mode 100644 crates/connect/src/client/config.rs
create mode 100644 crates/connect/src/client/middleware.rs
create mode 100644 crates/connect/src/client/mod.rs
create mode 100644 crates/connect/src/column.rs
create mode 100644 crates/connect/src/conf.rs
create mode 100644 crates/connect/src/dataframe.rs
create mode 100644 crates/connect/src/errors.rs
create mode 100644 crates/connect/src/expressions.rs
create mode 100644 crates/connect/src/functions/mod.rs
create mode 100644 crates/connect/src/group.rs
create mode 100644 crates/connect/src/lib.rs
create mode 100644 crates/connect/src/plan.rs
create mode 100644 crates/connect/src/readwriter.rs
create mode 100644 crates/connect/src/session.rs
create mode 100644 crates/connect/src/storage.rs
create mode 100644 crates/connect/src/streaming/mod.rs
create mode 100644 crates/connect/src/types.rs
create mode 100644 crates/connect/src/window.rs
create mode 100644 datasets/dir1/dir2/file2.parquet
create mode 100644 datasets/dir1/file1.parquet
create mode 100644 datasets/dir1/file3.json
create mode 100644 datasets/employees.json
create mode 100644 datasets/full_user.avsc
create mode 100644 datasets/kv1.txt
create mode 100644 datasets/people.csv
create mode 100644 datasets/people.json
create mode 100644 datasets/people.txt
create mode 100644 datasets/user.avsc
create mode 100644 datasets/users.avro
create mode 100644 datasets/users.orc
create mode 100644 datasets/users.parquet
create mode 100644 docker-compose.yml
create mode 100644 examples/Cargo.toml
create mode 100644 examples/README.md
create mode 100644 examples/src/databricks.rs
create mode 100644 examples/src/deltalake.rs
create mode 100644 examples/src/reader.rs
create mode 100644 examples/src/readstream.rs
create mode 100644 examples/src/sql.rs
create mode 100644 examples/src/writer.rs
create mode 100644 pre-commit.sh
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]