pyspark.sql.functions.count

pyspark.sql.functions.count(col)

Aggregate function: returns the number of items in a group.

New in version 1.3.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
col : Column or str

target column to compute on. Rows where this column is null are not counted; to count every row regardless of nulls, pass an expression such as expr("*") (see Example 1).

Returns
Column

column containing the number of items in the group.

Examples

Example 1: Count all rows in a DataFrame

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(None,), ("a",), ("b",), ("c",)], schema=["alphabets"])
>>> df.select(sf.count(sf.expr("*"))).show()
+--------+
|count(1)|
+--------+
|       4|
+--------+

Example 2: Count non-null values in a specific column

>>> from pyspark.sql import functions as sf
>>> df.select(sf.count(df.alphabets)).show()
+----------------+
|count(alphabets)|
+----------------+
|               3|
+----------------+

Example 3: Count all rows in a DataFrame with multiple columns

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame(
...     [(1, "apple"), (2, "banana"), (3, None)], schema=["id", "fruit"])
>>> df.select(sf.count(sf.expr("*"))).show()
+--------+
|count(1)|
+--------+
|       3|
+--------+

Example 4: Count non-null values in multiple columns

>>> from pyspark.sql import functions as sf
>>> df.select(sf.count(df.id), sf.count(df.fruit)).show()
+---------+------------+
|count(id)|count(fruit)|
+---------+------------+
|        3|           2|
+---------+------------+