Don't distinguish between types of ArithmeticException for Spark 3.2.x #5483
Conversation
Signed-off-by: remzi <13716567376yh@gmail.com>
build
When you say it throws both exceptions, is it for different operations? If so, it would be better to split the test apart by operation.
For Spark 3.2.x, this test will throw both java.lang.ArithmeticException: Decimal(expanded, -12, 2, 0) cannot be represented as Decimal(7, 7) at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotChangeDecimalPrecisionError(QueryExecutionErrors.scala:97) and SparkArithmeticException(errorClass = "DIVIDE_BY_ZERO", ...). Do you mean to move
Looks like it throws
Yes, what @nartal1 said: move the ones that throw java.lang.ArithmeticException to a different test than the ones that throw SparkArithmeticException, so that we can properly check the full exception type.
I could do that, but I am afraid the test code would be less readable, because I would have to skip some of the parametrized cases:

@pytest.mark.parametrize('data_gen', _arith_data_gens_no_neg_scale, ids=idfn)
@pytest.mark.parametrize('overflow_exp', [
    'pmod(a, cast(0 as {}))',
    'pmod(cast(-12 as {}), cast(0 as {}))',
    'a % (cast(0 as {}))',
    'cast(-12 as {}) % cast(0 as {})'], ids=idfn)
def test_mod_pmod_by_zero(data_gen, overflow_exp):
    # -12 cannot be cast to decimal(7, 7); we test this case in test_cannot_cast_to_decimal
    if (data_gen is _decimal_gen_7_7) and ("cast(-12 as {})" in overflow_exp):
        pass
    else:
        ...
        assert_gpu_and_cpu_error(
            ...
            # java.lang.ArithmeticException if Spark < 3.2.0 else
            # org.apache.spark.SparkArithmeticException
            ...
        )

@pytest.mark.parametrize('data_gen', _arith_data_gens_no_neg_scale, ids=idfn)
@pytest.mark.parametrize('overflow_exp', [
    'pmod(cast(-12 as {}), cast(0 as {}))',
    'cast(-12 as {}) % cast(0 as {})'], ids=idfn)
def test_cannot_cast_to_decimal(data_gen, overflow_exp):
    assert_gpu_and_cpu_error(
        ...
        # java.lang.ArithmeticException if Spark < 3.3.0 else
        # org.apache.spark.SparkArithmeticException
        ...
    )
cc @res-life. What's your opinion?
From the name
Is this OK, @HaoYang670?
I am not sure if this would decrease the test coverage as |
Tests should not normally special-case their parametric inputs. That implies there should be a separate test case, potentially with separate parametric inputs. If
Names aren't great, but you get the idea. Let's not complicate test code by checking for special inputs -- that's a different test case. If there's too much duplication between tests at that point, refactor the common parts into a utility method that multiple tests can leverage.
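A minimal sketch of the kind of shared utility suggested here (the helper name and tuple-based version arguments are hypothetical, not from the spark-rapids codebase): both tests could delegate the version-dependent expectation to one function instead of duplicating the if/else.

```python
def expected_arith_error(spark_version, changed_in):
    """Return the exception class name the error assertion should look for.

    spark_version and changed_in are (major, minor) tuples; changed_in is
    the release where Spark switched from java.lang.ArithmeticException
    to org.apache.spark.SparkArithmeticException for this operation.
    """
    if spark_version < changed_in:
        return "java.lang.ArithmeticException"
    return "org.apache.spark.SparkArithmeticException"

# Per the thread: divide-by-zero changed in 3.2.0, the decimal-cast error in 3.3.0
print(expected_arith_error((3, 1), (3, 2)))  # java.lang.ArithmeticException
print(expected_arith_error((3, 2), (3, 3)))  # java.lang.ArithmeticException
print(expected_arith_error((3, 3), (3, 3)))  # org.apache.spark.SparkArithmeticException
```

Each test would then pass its own changed_in release, keeping the version logic in one place.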
Signed-off-by: remzi <13716567376yh@gmail.com>
build
Looks like we did drop a couple of tests like pmod(cast(-12 as {}), but I'm not sure that's necessary, since it's the cast that is throwing.
Signed-off-by: remzi <13716567376yh@gmail.com>
Close #5474
In the test test_mod_pmod_by_zero, Spark 3.1.x always throws java.lang.ArithmeticException and Spark 3.3.x always throws SparkArithmeticException, but Spark 3.2.x may throw either of them. So we don't distinguish between the two for Spark 3.2.x.
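The behavior described above can be sketched as a small selector (a sketch with a hypothetical function name; the real PR wires the choice into the error string passed to assert_gpu_and_cpu_error): since SparkArithmeticException's message still contains "ArithmeticException", matching only the shared suffix on 3.2.x accepts either exception.

```python
def expected_error_substring(spark_version):
    """Pick the substring to search for in the raised error message.

    Spark 3.1.x always raises java.lang.ArithmeticException and Spark
    3.3.x+ always raises SparkArithmeticException, but 3.2.x can raise
    either, so for 3.2.x we match only the suffix common to both names.
    """
    if spark_version < (3, 2):
        return "java.lang.ArithmeticException"
    if spark_version < (3, 3):
        return "ArithmeticException"  # 3.2.x: don't distinguish the two
    return "SparkArithmeticException"
```

For example, expected_error_substring((3, 2)) yields "ArithmeticException", which is found in both exception class names.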