From e83ddd8e64ab8c6682997dea0cb319f6d83af99d Mon Sep 17 00:00:00 2001
From: Hao Zhu <9665750+viadea@users.noreply.github.com>
Date: Thu, 4 Aug 2022 16:20:41 -0700
Subject: [PATCH] Add 2208 doc link

Signed-off-by: Hao Zhu <9665750+viadea@users.noreply.github.com>
---
 docs/download.md                 | 21 +++++++++++++++++++++
 docs/spark-profiling-tool.md     |  2 +-
 docs/spark-qualification-tool.md |  4 ++--
 3 files changed, 24 insertions(+), 3 deletions(-)

diff --git a/docs/download.md b/docs/download.md
index 5bcbbbc31af..f43970f29df 100644
--- a/docs/download.md
+++ b/docs/download.md
@@ -41,6 +41,27 @@ for your hardware's minimum driver version.
 *For Cloudera and EMR support, please refer to the
 [Distributions](./FAQ.md#which-distributions-are-supported) section of the FAQ.
 
+### Download v22.08.0
+* Download the [RAPIDS Accelerator for Apache Spark 22.08.0 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.08.0/rapids-4-spark_2.12-22.08.0.jar)
+
+This package is built against CUDA 11.5 and all CUDA 11.x versions are supported through
+[CUDA forward compatibility](https://docs.nvidia.com/deploy/cuda-compatibility/index.html).
+It is tested on V100, T4, A2, A10, A30 and A100 GPUs with CUDA 11.0-11.5.
+For those using other types of GPUs which do not have CUDA forward compatibility (for example, GeForce),
+CUDA 11.5 or later is required. Users will need to ensure the minimum driver (450.80.02) and
+CUDA toolkit are installed on each Spark node.
+
+### Verify signature
+* Download the [RAPIDS Accelerator for Apache Spark 22.08.0 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.08.0/rapids-4-spark_2.12-22.08.0.jar)
+  and [RAPIDS Accelerator for Apache Spark 22.08.0 jars.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.08.0/rapids-4-spark_2.12-22.08.0.jar.asc)
+* Download the [PUB_KEY](https://keys.openpgp.org/search?q=sw-spark@nvidia.com).
+* Import the public key: `gpg --import PUB_KEY`
+* Verify the signature: `gpg --verify rapids-4-spark_2.12-22.08.0.jar.asc rapids-4-spark_2.12-22.08.0.jar`
+
+The output of a successful signature verification:
+
+    gpg: Good signature from "NVIDIA Spark (For the signature of spark-rapids release jars) <sw-spark@nvidia.com>"
+
 ### Release Notes
 New functionality and performance improvements for this release include:
 * Rocky Linux 8 support
diff --git a/docs/spark-profiling-tool.md b/docs/spark-profiling-tool.md
index 7a87b36a6ee..224c79b4fa4 100644
--- a/docs/spark-profiling-tool.md
+++ b/docs/spark-profiling-tool.md
@@ -31,7 +31,7 @@ more information.
 The Profiling tool requires the Spark 3.x jars to be able to run but do not need an Apache Spark run time.
 If you do not already have Spark 3.x installed, you can download the Spark distribution to
 any machine and include the jars in the classpath.
-- Download the jar file from [Maven repository](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark-tools_2.12/22.06.0/)
+- Download the jar file from [Maven repository](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark-tools_2.12/22.08.0/)
 - [Download Apache Spark 3.x](http://spark.apache.org/downloads.html) - Spark 3.1.1 for Apache Hadoop is recommended
 
 If you want to compile the jars, please refer to the instructions [here](./spark-qualification-tool.md#How-to-compile-the-tools-jar).
diff --git a/docs/spark-qualification-tool.md b/docs/spark-qualification-tool.md
index ee748a13ec5..b9ba4ff0bf8 100644
--- a/docs/spark-qualification-tool.md
+++ b/docs/spark-qualification-tool.md
@@ -54,7 +54,7 @@ more information.
 The Qualification tool require the Spark 3.x jars to be able to run but do not need an Apache Spark run time.
 If you do not already have Spark 3.x installed, you can download the Spark distribution to
 any machine and include the jars in the classpath.
-- Download the jar file from [Maven repository](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark-tools_2.12/22.06.0/)
+- Download the jar file from [Maven repository](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark-tools_2.12/22.08.0/)
 - [Download Apache Spark 3.x](http://spark.apache.org/downloads.html) - Spark 3.1.1 for Apache Hadoop is recommended
 
 ### Step 2 Run the Qualification tool
@@ -307,7 +307,7 @@ to [Understanding the Qualification tool output](#understanding-the-qualificatio
 - Java 8 or above, Spark 3.0.1+
 
 ### Download the tools jar
-- Download the jar file from [Maven repository](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark-tools_2.12/22.06.0/)
+- Download the jar file from [Maven repository](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark-tools_2.12/22.08.0/)
 
 ### Modify your application code to call the api's
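
For reference, the verify-signature steps added to `docs/download.md` above can be scripted end to end roughly as in the minimal sketch below. The use of `wget` and the local `PUB_KEY` filename are illustrative assumptions, not part of the documented steps; any download method and filename work.

```bash
# Minimal sketch of the documented download-and-verify flow.
# Assumptions: wget is available, and the public key from
# https://keys.openpgp.org/search?q=sw-spark@nvidia.com has been saved locally as PUB_KEY.
wget https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.08.0/rapids-4-spark_2.12-22.08.0.jar
wget https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.08.0/rapids-4-spark_2.12-22.08.0.jar.asc

# Import the NVIDIA Spark public key.
gpg --import PUB_KEY

# Verify the jar against its detached signature; a valid download reports
# 'gpg: Good signature from "NVIDIA Spark ..."'.
gpg --verify rapids-4-spark_2.12-22.08.0.jar.asc rapids-4-spark_2.12-22.08.0.jar
```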