Adding a dependency exclusion on Hadoop's jersey-server to fix a Spark UI issue #524
Conversation
Codecov Report
```
@@           Coverage Diff           @@
##             main     #524   +/-   ##
=======================================
  Coverage   87.58%   87.58%
=======================================
  Files          44       44
  Lines        2005     2005
  Branches      122      122
=======================================
  Hits         1756     1756
  Misses        249      249
```
Along with the standard GitHub checks, I ran the weekly and nightly tests on this branch. I also ran the functional tests locally across several Spark versions to ensure the Spark Worker UI works for those as well.
Summary
An issue was raised in which the Spark UI renders a blank page and returns a stack trace when viewing the Executors tab of the Spark Worker UI while a job is running.
Description
It turns out that two of the connector's dependencies, Spark and Hadoop, each pull in their own version of jersey-server. Excluding Hadoop's copy fixes the issue with no apparent side effects; a sketch of what such an exclusion looks like in sbt is below.
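For illustration, here is a minimal sketch of this kind of exclusion in an sbt build. The artifact names and version numbers are assumptions for the example, not this repository's actual build definition; only the jersey-server coordinates (`com.sun.jersey`, the Jersey 1.x line bundled by Hadoop) reflect the conflict described above.

```scala
// build.sbt -- hypothetical sketch, not this repository's actual build file.
// Hadoop transitively bundles the older com.sun.jersey jersey-server, which
// clashes with the Jersey version Spark's UI expects, so it is excluded here.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.3.0" % Provided,   // version assumed
  ("org.apache.hadoop" % "hadoop-client" % "3.3.2")          // version assumed
    .exclude("com.sun.jersey", "jersey-server")
)
```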
I've also added the ability to run
```
sbt dependencyTree
```
so we can inspect dependencies more closely in the future; a sketch of how that task is typically enabled follows.
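For reference, a minimal sketch of wiring this up, assuming sbt 1.4+ (which bundles the dependency-tree plugin); this is an assumption about how it could be enabled, not necessarily the exact change in this PR:

```scala
// project/plugins.sbt -- hypothetical sketch; on sbt 1.4+ the bundled
// dependency-tree plugin only needs to be switched on here.
addDependencyTreePlugin
```

Running `sbt dependencyTree` then prints the resolved dependency graph for each configuration, which makes duplicate artifacts such as the two jersey-server copies easy to spot.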
Related Issue
Closes #522.
Additional Reviewers
@alexey-temnikov
@alexr-bq
@jonathanl-bq
@jeremyp-bq