
[FEA] Support try_multiply function #11868

Open
LIN-Yu-Ting opened this issue Dec 12, 2024 · 2 comments
Labels
feature request New feature or request

Comments

@LIN-Yu-Ting

I wish spark-rapids could support the try_multiply SQL function:

try_multiply(CAST((GenotypeDT.alleleDepths[1]/GenotypeDT.depth) AS DECIMAL(5,4)), 100)

The fallback message is:

! <TryEval> tryeval(CheckOverflow((promote_precision(cast(cast((cast(alleleDepths#1912[1] as double) / cast(depth#1920 as double)) as decimal(5,4)) as decimal(7,4))) * 100.0000), DecimalType(9,4))) cannot run on GPU because GPU does not currently support the operator class org.apache.spark.sql.catalyst.expressions.TryEval
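
For reference, a minimal spark-shell sketch of how this fallback can be reproduced with the RAPIDS Accelerator enabled; the GenotypeDT table and columns are taken from the query above, and the rest is assumed.

```scala
// Minimal reproduction sketch (spark-shell with the RAPIDS Accelerator enabled).
// The GenotypeDT table and its columns come from the query above; other names
// are illustrative. Setting spark.rapids.sql.explain=NOT_ON_GPU makes the plugin
// log why an expression cannot run on the GPU, which is where the <TryEval>
// message above comes from.
spark.conf.set("spark.rapids.sql.explain", "NOT_ON_GPU")

val df = spark.sql("""
  SELECT try_multiply(
           CAST(GenotypeDT.alleleDepths[1] / GenotypeDT.depth AS DECIMAL(5,4)),
           100) AS scaled
  FROM GenotypeDT
""")
df.collect()  // the not-on-GPU reasons are logged when the plan is built and executed
```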
@LIN-Yu-Ting added the ? - Needs Triage and feature request labels on Dec 12, 2024
@revans2 (Collaborator) commented Dec 17, 2024

@LIN-Yu-Ting what version of Spark is this for? And do you need/want the exception handling to include the full expression tree?

try_multiply was added in Spark 3.3.0 (https://issues.apache.org/jira/browse/SPARK-38164), but was then changed in Spark 3.4.0 (https://issues.apache.org/jira/browse/SPARK-40222) so that the numeric try_* functions no longer catch exceptions thrown by their child expressions.
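
For illustration, a short spark-shell sketch (not from the issue) of the try_multiply semantics and of how to check which form the expression takes on a given Spark version; the literal values are just examples.

```scala
// try_multiply returns NULL instead of raising an error when the multiplication
// overflows (the second example matches the Spark SQL function documentation).
spark.sql("SELECT try_multiply(2, 3)").show()             // 6
spark.sql("SELECT try_multiply(-2147483648, 10)").show()  // NULL (integer overflow)

// explain(true) prints the analyzed plan: on Spark 3.3.x the multiplication is
// wrapped in tryeval(...), while on Spark 3.4.0+ (SPARK-40222) the Multiply
// expression itself is evaluated in TRY mode and no longer swallows exceptions
// thrown by its child expressions.
spark.sql("SELECT try_multiply(CAST(0.5 AS DECIMAL(5,4)), 100)").explain(true)
```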

It looks like you are using a Spark version that is 3.3.0 or later but before 3.4.0. If it is 3.4.0 or later, then we have a bug in our code for those versions; I filed #11884 for it. Thanks for prompting me to look more closely at the code.

To support TryEval generally, and especially to get try_multiply working on 3.3.x, we are going to have to modify the entire expression tree to somehow support this processing.
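
A toy illustration (not spark-rapids code) of why this is awkward to map onto columnar execution: on the CPU, TryEval evaluates the whole child subtree one row at a time and catches whatever it throws, so on the GPU every operator under the wrapper would effectively need its own "null on error" mode.

```scala
// Toy illustration only, not spark-rapids code: CPU-side TryEval semantics are
// "evaluate the whole child expression for one row and turn any exception into
// null". A GPU kernel works on a whole column batch at once, so there is no
// per-row exception to catch, and each operator in the subtree would have to
// support returning null on error instead of throwing.
def tryEval[T](child: => T): Option[T] =
  try Some(child) catch { case _: ArithmeticException => None }

// Row-at-a-time CPU behaviour for try_multiply(x, 10) on overflowing input:
val perRow = Seq(1, 2, Int.MaxValue).map(x => tryEval(Math.multiplyExact(x, 10)))
// perRow == Seq(Some(10), Some(20), None)
```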

@LIN-Yu-Ting (Author)

@revans2 you are correct that we are using Spark 3.3.0 in this case. Currently we only use try_multiply, so we do not need the exception handling to include the full expression tree. Thanks for your support.

@sameerz removed the ? - Needs Triage label on Dec 31, 2024