Is your feature request related to a problem? Please describe.
We have a table whose rows all need to be rewritten after a Spark batch job. The table has existing permissions that should be preserved, but those permissions are lost when the rows are written in overwrite mode.
Describe the solution you'd like
In overwrite mode, there should be an option to truncate the target table instead of dropping it before re-write.
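The core idea can be illustrated outside Spark. Below is a minimal sketch using SQLite; since SQLite has no permission system, an index stands in for table-attached metadata that a row-level truncate preserves but a drop-and-recreate destroys (the table and column names are placeholders, not from the actual job):

```python
import sqlite3

# Emulate the difference between overwrite-by-DROP and overwrite-by-TRUNCATE.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")
conn.execute("CREATE INDEX target_idx ON target (id)")  # stand-in for a GRANT
conn.execute("INSERT INTO target VALUES (1, 'old')")

def index_exists(c):
    """Check whether the table-attached metadata survived the overwrite."""
    row = c.execute(
        "SELECT COUNT(*) FROM sqlite_master WHERE type='index' AND name='target_idx'"
    ).fetchone()
    return row[0] == 1

# Overwrite via truncate: metadata attached to the table is preserved.
conn.execute("DELETE FROM target")  # SQLite's equivalent of TRUNCATE
conn.execute("INSERT INTO target VALUES (2, 'new')")
print(index_exists(conn))  # True

# Overwrite via drop-and-recreate: attached metadata is lost.
conn.execute("DROP TABLE target")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")
conn.execute("INSERT INTO target VALUES (3, 'new')")
print(index_exists(conn))  # False
```

The same reasoning applies to permissions: a `TRUNCATE` keeps the table object (and everything granted on it) intact, while a `DROP TABLE` discards the object along with its grants.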
The standard Spark JDBC connector allows one to set the `truncate` option to solve this (link).

Describe alternatives you've considered
Workarounds exist, but they are not very practical:
- Using the `target_table_sql` parameter

Additional context
What the solution could look like after implementation: