This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 066abfa665a3 [SPARK-50753][PYTHON][DOCS] Add pyspark plotting to API documentation
066abfa665a3 is described below

commit 066abfa665a3c57402f8ad8b93cb39767748230d
Author: Xinrong Meng <[email protected]>
AuthorDate: Mon Feb 10 11:47:01 2025 -0800

    [SPARK-50753][PYTHON][DOCS] Add pyspark plotting to API documentation
    
    ### What changes were proposed in this pull request?
    Add pyspark plotting to API documentation
    
    ### Why are the changes needed?
    Better documentation.
    
    ### Does this PR introduce _any_ user-facing change?
    No.
    
    ### How was this patch tested?
    Manual tests.
    
![image](https://github.com/user-attachments/assets/35aab68c-1624-493a-ab73-d11c39144665)
    
![image](https://github.com/user-attachments/assets/d58a4118-dda0-482a-880c-8ae8a1b70ed0)
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No.
    
    Closes #49841 from xinrong-meng/spark4_plot_doc.
    
    Authored-by: Xinrong Meng <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .../source/reference/pyspark.sql/dataframe.rst     | 22 ++++++++++++++++++++++
 python/pyspark/sql/dataframe.py                    |  4 ++--
 python/pyspark/sql/plot/core.py                    |  7 +++++++
 3 files changed, 31 insertions(+), 2 deletions(-)

diff --git a/python/docs/source/reference/pyspark.sql/dataframe.rst b/python/docs/source/reference/pyspark.sql/dataframe.rst
index 62dd24581b9a..9652eb7c4275 100644
--- a/python/docs/source/reference/pyspark.sql/dataframe.rst
+++ b/python/docs/source/reference/pyspark.sql/dataframe.rst
@@ -167,3 +167,25 @@ UDTF(User-Defined Table Function)s.
     TableArg.partitionBy
     TableArg.orderBy
     TableArg.withSinglePartition
+
+
+Plotting
+--------
+The ``DataFrame.plot`` attribute serves both as a callable method and a namespace, providing access to various
+plotting functions via the ``PySparkPlotAccessor``. Users can call specific plotting methods in the format
+``DataFrame.plot.<kind>``.
+
+.. currentmodule:: pyspark.sql.plot.core
+
+.. autosummary::
+    :toctree: api/
+
+    PySparkPlotAccessor.area
+    PySparkPlotAccessor.bar
+    PySparkPlotAccessor.barh
+    PySparkPlotAccessor.line
+    PySparkPlotAccessor.pie
+    PySparkPlotAccessor.scatter
+    PySparkPlotAccessor.box
+    PySparkPlotAccessor.kde
+    PySparkPlotAccessor.hist
\ No newline at end of file
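
The rst text added above describes ``DataFrame.plot`` as both a callable method and a namespace. As a minimal pure-Python sketch of that callable-plus-namespace accessor pattern (all names here are hypothetical illustrations, not PySpark's actual implementation in ``pyspark.sql.plot.core``):

```python
class PlotAccessor:
    """Sketch of an accessor usable both as a callable and as a namespace."""

    def __init__(self, data):
        self._data = data

    def __call__(self, kind="line", **kwargs):
        # Callable form: df.plot(kind="bar") dispatches to the matching method.
        return getattr(self, kind)(**kwargs)

    def line(self, **kwargs):
        return f"line plot of {self._data!r}"

    def bar(self, **kwargs):
        return f"bar plot of {self._data!r}"


class Frame:
    """Stand-in for a DataFrame exposing the accessor as a property."""

    def __init__(self, data):
        self._data = data

    @property
    def plot(self):
        # A fresh accessor per access keeps the property itself stateless.
        return PlotAccessor(self._data)


df = Frame([1, 2, 3])
print(df.plot(kind="bar"))  # callable form
print(df.plot.line())       # namespace form: DataFrame.plot.<kind>
```

Both forms route to the same method, which is why the documentation can list ``PySparkPlotAccessor.<kind>`` entries and cover the callable form at the same time.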
diff --git a/python/pyspark/sql/dataframe.py b/python/pyspark/sql/dataframe.py
index df2bf39a3a53..cd06b3fa3eeb 100644
--- a/python/pyspark/sql/dataframe.py
+++ b/python/pyspark/sql/dataframe.py
@@ -6828,13 +6828,13 @@ class DataFrame:
     @property
     def plot(self) -> "PySparkPlotAccessor":
         """
-        Returns a :class:`PySparkPlotAccessor` for plotting functions.
+        Returns a :class:`plot.core.PySparkPlotAccessor` for plotting functions.
 
         .. versionadded:: 4.0.0
 
         Returns
         -------
-        :class:`PySparkPlotAccessor`
+        :class:`plot.core.PySparkPlotAccessor`
 
         Notes
         -----
diff --git a/python/pyspark/sql/plot/core.py b/python/pyspark/sql/plot/core.py
index c648bc8f3840..5247044984a6 100644
--- a/python/pyspark/sql/plot/core.py
+++ b/python/pyspark/sql/plot/core.py
@@ -85,6 +85,13 @@ class PySparkSampledPlotBase:
 
 
 class PySparkPlotAccessor:
+    """
+    Accessor for DataFrame plotting functionality in PySpark.
+
+    Users can call the accessor as ``df.plot(kind="line")`` or use the dedicated
+    methods like ``df.plot.line(...)`` to generate plots.
+    """
+
     plot_data_map = {
         "area": PySparkSampledPlotBase().get_sampled,
         "bar": PySparkTopNPlotBase().get_top_n,

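
The ``plot_data_map`` shown in the hunk above maps each plot kind to a data-preparation strategy (sampling for area plots, top-N truncation for bar plots) before rendering. A small sketch of that dict-based strategy dispatch, where ``get_sampled`` and ``get_top_n`` are simplified local stand-ins for the real methods, which operate on Spark DataFrames:

```python
import random

def get_sampled(rows, k=100, seed=0):
    # Sample up to k rows so arbitrarily large inputs stay plottable.
    rng = random.Random(seed)
    return rows if len(rows) <= k else rng.sample(rows, k)

def get_top_n(rows, n=100):
    # Keep only the first n rows; a bar chart of everything is unreadable.
    return rows[:n]

# Each plot kind selects its preparation strategy by name,
# mirroring the structure of PySparkPlotAccessor.plot_data_map.
plot_data_map = {
    "area": get_sampled,
    "bar": get_top_n,
}

rows = list(range(500))
print(plot_data_map["bar"](rows, 5))  # [0, 1, 2, 3, 4]
```

Keeping the kind-to-strategy mapping in one dict means adding a new plot kind is a single entry, without touching the dispatch logic.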
