pyspark.sql.functions.try_subtract

pyspark.sql.functions.try_subtract(left: ColumnOrName, right: ColumnOrName) → pyspark.sql.column.Column

Returns left - right; the result is null on overflow. The acceptable input types are the same as for the - operator.
New in version 3.5.0.
Examples
>>> from pyspark.sql.functions import try_subtract, to_date, make_interval, lit
>>> df = spark.createDataFrame([(6000, 15), (1990, 2)], ["a", "b"])
>>> df.select(try_subtract(df.a, df.b).alias('r')).collect()
[Row(r=5985), Row(r=1988)]
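The null-on-overflow behavior itself can be seen at the edge of the 64-bit range; a minimal sketch with illustrative values not drawn from the original examples, assuming the default LongType inference for Python integers:

>>> # assumed illustration: -9223372036854775808 is the LongType minimum,
>>> # so subtracting 1 overflows and try_subtract yields null instead of an error
>>> df = spark.createDataFrame([(-9223372036854775808, 1)], ["a", "b"])
>>> df.select(try_subtract(df.a, df.b).alias('r')).collect()
[Row(r=None)]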
>>> from pyspark.sql.types import StructType, StructField, IntegerType, StringType
>>> schema = StructType([
...     StructField("i", IntegerType(), True),
...     StructField("d", StringType(), True),
... ])
>>> df = spark.createDataFrame([(1, '2015-09-30')], schema)
>>> df = df.select(df.i, to_date(df.d).alias('d'))
>>> df.select(try_subtract(df.d, df.i).alias('r')).collect()
[Row(r=datetime.date(2015, 9, 29))]
>>> df.select(try_subtract(df.d, make_interval(df.i)).alias('r')).collect()
[Row(r=datetime.date(2014, 9, 30))]
>>> df.select(
...     try_subtract(df.d, make_interval(lit(0), lit(0), lit(0), df.i)).alias('r')
... ).collect()
[Row(r=datetime.date(2015, 9, 29))]
>>> df.select(
...     try_subtract(make_interval(df.i), make_interval(df.i)).alias('r')
... ).show(truncate=False)
+---------+
|r        |
+---------+
|0 seconds|
+---------+