update doc for rapids v2202 release (#106)
* update databricks init 7.3&9.1 scripts

Signed-off-by: liyuan <[email protected]>

* update release version from 21.12 to 22.02

Signed-off-by: liyuan <[email protected]>

* revert databricks init scripts changes, don't need to remove the xgboost jars in the script, we already overwrite them

Signed-off-by: liyuan <[email protected]>
nvliyuan authored Feb 17, 2022
1 parent 195c92a commit a510e4d
Showing 14 changed files with 42 additions and 42 deletions.
@@ -48,7 +48,7 @@ CUDA 11.0 toolkit on the cluster. This can be done with the [generate-init-scri
Spark plugin and the CUDA 11 toolkit.
- [Databricks 9.1 LTS
ML](https://docs.databricks.com/release-notes/runtime/9.1ml.html#system-environment) has CUDA 11
installed. Users will need to use 21.12.0 or later on Databricks 9.1 LTS ML. In this case use
installed. Users will need to use 22.02.0 or later on Databricks 9.1 LTS ML. In this case use
[generate-init-script.ipynb](generate-init-script.ipynb) which will install
the RAPIDS Spark plugin.
2. Once you are in the notebook, click the “Run All” button.
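After the init script runs and the cluster restarts, a quick sanity check can confirm the setup. This is a hypothetical verification step rather than part of the referenced notebooks; the jar names simply assume the 22.02.0 release discussed above.

```bash
# Hypothetical verification commands, e.g. run from a %sh notebook cell on the cluster.
ls /databricks/jars | grep -E 'rapids-4-spark|cudf'   # plugin and cuDF jars staged by the init script
nvidia-smi                                            # confirms the node sees its GPU
```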
@@ -24,8 +24,8 @@
"source": [
"%sh\n",
"cd ../../dbfs/FileStore/jars/\n",
"sudo wget -O cudf-21.12.0-cuda11.jar https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.0/cudf-21.12.0-cuda11.jar\n",
"sudo wget -O rapids-4-spark_2.12-21.12.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar\n",
"sudo wget -O cudf-22.02.0-cuda11.jar https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/cudf-22.02.0-cuda11.jar\n",
"sudo wget -O rapids-4-spark_2.12-22.02.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar\n",
"sudo wget -O xgboost4j_3.0-1.4.2-0.2.0.jar https://repo1.maven.org/maven2/com/nvidia/xgboost4j_3.0/1.4.2-0.2.0/xgboost4j_3.0-1.4.2-0.2.0.jar\n",
"sudo wget -O xgboost4j-spark_3.0-1.4.2-0.2.0.jar https://repo1.maven.org/maven2/com/nvidia/xgboost4j-spark_3.0/1.4.2-0.2.0/xgboost4j-spark_3.0-1.4.2-0.2.0.jar\n",
"ls -ltr\n",
@@ -58,8 +58,8 @@
"dbutils.fs.put(\"/databricks/init_scripts/init.sh\",\"\"\"\n",
"#!/bin/bash\n",
"sudo cp /dbfs/FileStore/jars/xgboost4j_3.0-1.4.2-0.2.0.jar /databricks/jars/spark--maven-trees--ml--7.x--xgboost--ml.dmlc--xgboost4j_2.12--ml.dmlc__xgboost4j_2.12__1.0.0.jar\n",
"sudo cp /dbfs/FileStore/jars/cudf-21.12.0-cuda11.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/rapids-4-spark_2.12-21.12.0.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/cudf-22.02.0-cuda11.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/rapids-4-spark_2.12-22.02.0.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/xgboost4j-spark_3.0-1.4.2-0.2.0.jar /databricks/jars/spark--maven-trees--ml--7.x--xgboost--ml.dmlc--xgboost4j-spark_2.12--ml.dmlc__xgboost4j-spark_2.12__1.0.0.jar\n",
"sudo wget -O /etc/apt/preferences.d/cuda-repository-pin-600 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/cuda-ubuntu1804.pin\n",
"sudo wget -O ~/cuda-repo-ubuntu1804-11-0-local_11.0.3-450.51.06-1_amd64.deb https://developer.download.nvidia.com/compute/cuda/11.0.3/local_installers/cuda-repo-ubuntu1804-11-0-local_11.0.3-450.51.06-1_amd64.deb\n",
@@ -139,7 +139,7 @@
"1. Edit your cluster, adding an initialization script from `dbfs:/databricks/init_scripts/init.sh` in the \"Advanced Options\" under \"Init Scripts\" tab\n",
"2. Reboot the cluster\n",
"3. Go to \"Libraries\" tab under your cluster and install `dbfs:/FileStore/jars/xgboost4j-spark_3.0-1.4.2-0.2.0.jar` in your cluster by selecting the \"DBFS\" option for installing jars\n",
"4. Import the mortgage example notebook from `https://github.com/NVIDIA/spark-rapids-examples/blob/branch-21.12/examples/Spark-ETL+XGBoost/mortgage/notebooks/python/mortgage-gpu.ipynb`\n",
"4. Import the mortgage example notebook from `https://github.com/NVIDIA/spark-rapids-examples/blob/branch-22.02/examples/Spark-ETL+XGBoost/mortgage/notebooks/python/mortgage-gpu.ipynb`\n",
"5. Inside the mortgage example notebook, update the data paths\n",
" `train_data = reader.schema(schema).option('header', True).csv('/data/mortgage/csv/small-train.csv')`\n",
" `trans_data = reader.schema(schema).option('header', True).csv('/data/mortgage/csv/small-trans.csv')`"
@@ -24,8 +24,8 @@
"source": [
"%sh\n",
"cd ../../dbfs/FileStore/jars/\n",
"sudo wget -O cudf-21.12.0-cuda11.jar https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.0/cudf-21.12.0-cuda11.jar\n",
"sudo wget -O rapids-4-spark_2.12-21.12.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar\n",
"sudo wget -O cudf-22.02.0-cuda11.jar https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/cudf-22.02.0-cuda11.jar\n",
"sudo wget -O rapids-4-spark_2.12-22.02.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar\n",
"sudo wget -O xgboost4j_3.0-1.4.2-0.2.0.jar https://repo1.maven.org/maven2/com/nvidia/xgboost4j_3.0/1.4.2-0.2.0/xgboost4j_3.0-1.4.2-0.2.0.jar\n",
"sudo wget -O xgboost4j-spark_3.0-1.4.2-0.2.0.jar https://repo1.maven.org/maven2/com/nvidia/xgboost4j-spark_3.0/1.4.2-0.2.0/xgboost4j-spark_3.0-1.4.2-0.2.0.jar\n",
"ls -ltr\n",
@@ -58,8 +58,8 @@
"dbutils.fs.put(\"/databricks/init_scripts/init.sh\",\"\"\"\n",
"#!/bin/bash\n",
"sudo cp /dbfs/FileStore/jars/xgboost4j_3.0-1.4.2-0.2.0.jar /databricks/jars/spark--maven-trees--ml--9.x--xgboost-gpu--ml.dmlc--xgboost4j-gpu_2.12--ml.dmlc__xgboost4j-gpu_2.12__1.4.1.jar\n",
"sudo cp /dbfs/FileStore/jars/cudf-21.12.0-cuda11.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/rapids-4-spark_2.12-21.12.0.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/cudf-22.02.0-cuda11.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/rapids-4-spark_2.12-22.02.0.jar /databricks/jars/\n",
"sudo cp /dbfs/FileStore/jars/xgboost4j-spark_3.0-1.4.2-0.2.0.jar /databricks/jars/spark--maven-trees--ml--9.x--xgboost-gpu--ml.dmlc--xgboost4j-spark-gpu_2.12--ml.dmlc__xgboost4j-spark-gpu_2.12__1.4.1.jar\"\"\", True)"
]
},
@@ -132,7 +132,7 @@
"1. Edit your cluster, adding an initialization script from `dbfs:/databricks/init_scripts/init.sh` in the \"Advanced Options\" under \"Init Scripts\" tab\n",
"2. Reboot the cluster\n",
"3. Go to \"Libraries\" tab under your cluster and install `dbfs:/FileStore/jars/xgboost4j-spark_3.0-1.4.2-0.2.0.jar` in your cluster by selecting the \"DBFS\" option for installing jars\n",
"4. Import the mortgage example notebook from `https://github.com/NVIDIA/spark-rapids-examples/blob/branch-21.12/examples/Spark-ETL+XGBoost/mortgage/notebooks/python/mortgage-gpu.ipynb`\n",
"4. Import the mortgage example notebook from `https://github.com/NVIDIA/spark-rapids-examples/blob/branch-22.02/examples/Spark-ETL+XGBoost/mortgage/notebooks/python/mortgage-gpu.ipynb`\n",
"5. Inside the mortgage example notebook, update the data paths\n",
" `train_data = reader.schema(schema).option('header', True).csv('/data/mortgage/csv/small-train.csv')`\n",
" `trans_data = reader.schema(schema).option('header', True).csv('/data/mortgage/csv/small-trans.csv')`"
@@ -40,7 +40,7 @@ export SPARK_DOCKER_IMAGE=<gpu spark docker image repo and name>
export SPARK_DOCKER_TAG=<spark docker image tag>

pushd ${SPARK_HOME}
wget https://github.com/NVIDIA/spark-rapids-examples/raw/branch-21.12/dockerfile/Dockerfile
wget https://github.com/NVIDIA/spark-rapids-examples/raw/branch-22.02/dockerfile/Dockerfile

# Optionally install additional jars into ${SPARK_HOME}/jars/
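# Hypothetical build-and-push sketch (not part of the original doc): with the Dockerfile
# downloaded above into ${SPARK_HOME}, the image named by the variables exported earlier
# would typically be produced along these lines.
docker build -t ${SPARK_DOCKER_IMAGE}:${SPARK_DOCKER_TAG} -f Dockerfile .
docker push ${SPARK_DOCKER_IMAGE}:${SPARK_DOCKER_TAG}
popd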

@@ -9,11 +9,11 @@ For simplicity export the location to these jars. All examples assume the packag
* [XGBoost4j-Spark Package](https://repo1.maven.org/maven2/com/nvidia/xgboost4j-spark_3.0/1.4.2-0.2.0/)

2. Download the RAPIDS Accelerator for Apache Spark plugin jar
* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar)
* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar)

Then download the version of the cudf jar that your version of the accelerator depends on.

* [cuDF Package](https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.2/cudf-21.12.2-cuda11.jar)
* [cuDF Package](https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/cudf-22.02.0-cuda11.jar)

### Build XGBoost Python Examples

@@ -29,8 +29,8 @@ You need to download Mortgage dataset to `/opt/xgboost` from this [site](https:/

``` bash
export SPARK_XGBOOST_DIR=/opt/xgboost
export CUDF_JAR=${SPARK_XGBOOST_DIR}/cudf-21.12.2-cuda11.jar
export RAPIDS_JAR=${SPARK_XGBOOST_DIR}/rapids-4-spark_2.12-21.12.0.jar
export CUDF_JAR=${SPARK_XGBOOST_DIR}/cudf-22.02.0-cuda11.jar
export RAPIDS_JAR=${SPARK_XGBOOST_DIR}/rapids-4-spark_2.12-22.02.0.jar
export XGBOOST4J_JAR=${SPARK_XGBOOST_DIR}/xgboost4j_3.0-1.4.2-0.2.0.jar
export XGBOOST4J_SPARK_JAR=${SPARK_XGBOOST_DIR}/xgboost4j-spark_3.0-1.4.2-0.2.0.jar
export SAMPLE_ZIP=${SPARK_XGBOOST_DIR}/samples.zip
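# Hypothetical usage sketch (not from the original doc): the exported jars and the zipped
# Python samples are typically handed to spark-submit roughly as follows; the entry script
# name below is a placeholder.
#   ${SPARK_HOME}/bin/spark-submit \
#     --jars ${CUDF_JAR},${RAPIDS_JAR},${XGBOOST4J_JAR},${XGBOOST4J_SPARK_JAR} \
#     --py-files ${SAMPLE_ZIP} \
#     your_main_script.py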
@@ -5,11 +5,11 @@ For simplicity export the location to these jars. All examples assume the packag
### Download the jars

1. Download the RAPIDS Accelerator for Apache Spark plugin jar
* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar)
* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar)

Then download the version of the cudf jar that your version of the accelerator depends on.

* [cuDF Package](https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.2/cudf-21.12.2-cuda11.jar)
* [cuDF Package](https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/cudf-22.02.0-cuda11.jar)

### Build XGBoost Scala Examples

@@ -25,7 +25,7 @@ You need to download mortgage dataset to `/opt/xgboost` from this [site](https:/

``` bash
export SPARK_XGBOOST_DIR=/opt/xgboost
export CUDF_JAR=${SPARK_XGBOOST_DIR}/cudf-21.12.2-cuda11.jar
export RAPIDS_JAR=${SPARK_XGBOOST_DIR}/rapids-4-spark_2.12-21.12.0.jar
export CUDF_JAR=${SPARK_XGBOOST_DIR}/cudf-22.02.0-cuda11.jar
export RAPIDS_JAR=${SPARK_XGBOOST_DIR}/rapids-4-spark_2.12-22.02.0.jar
export SAMPLE_JAR=${SPARK_XGBOOST_DIR}/sample_xgboost_apps-0.2.2-jar-with-dependencies.jar
```
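A minimal submit sketch using these variables might look like the following; the master URL, main class, and application arguments are placeholders, not values taken from this repository.

``` bash
# Hypothetical spark-submit sketch for the Scala sample jar exported above.
# The --class value is a placeholder for the sample application's main class.
${SPARK_HOME}/bin/spark-submit \
  --master spark://your-ip:7077 \
  --jars ${CUDF_JAR},${RAPIDS_JAR} \
  --class com.example.xgboost.mortgage.Main \
  ${SAMPLE_JAR}
```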
@@ -9,16 +9,16 @@
"All data could be found at https://docs.rapids.ai/datasets/mortgage-data\n",
"\n",
"### 2. Download needed jars\n",
"* [cudf-21.12.2-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.0/)\n",
"* [rapids-4-spark_2.12-21.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar)\n",
"* [cudf-22.02.0-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/)\n",
"* [rapids-4-spark_2.12-22.02.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar)\n",
"\n",
"\n",
"### 3. Start Spark Standalone\n",
"Before running the script, please setup Spark standalone mode\n",
"\n",
"### 4. Add ENV\n",
"```\n",
"$ export SPARK_JARS=cudf-21.12.2-cuda11.jar,rapids-4-spark_2.12-21.12.0.jar\n",
"$ export SPARK_JARS=cudf-22.02.0-cuda11.jar,rapids-4-spark_2.12-22.02.0.jar\n",
"$ export PYSPARK_DRIVER_PYTHON=jupyter \n",
"$ export PYSPARK_DRIVER_PYTHON_OPTS=notebook\n",
"```\n",
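For context, once the SPARK_JARS and Jupyter driver variables above are exported, the notebook environment is typically launched through PySpark roughly as follows. This is a hedged sketch; the master URL is a placeholder and the plugin setting is the standard RAPIDS Accelerator config rather than something stated in the notebook itself.

```bash
# Hypothetical launch sketch: starts Jupyter via PySpark with the RAPIDS jars on the classpath.
${SPARK_HOME}/bin/pyspark --master spark://your-ip:7077 \
  --jars ${SPARK_JARS} \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin
```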
@@ -19,15 +19,15 @@
"All data could be found at https://docs.rapids.ai/datasets/mortgage-data\n",
"\n",
"### 2. Download needed jars\n",
"* [cudf-21.12.2-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.2/)\n",
"* [rapids-4-spark_2.12-21.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar)\n",
"* [cudf-22.02.0-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/)\n",
"* [rapids-4-spark_2.12-22.02.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar)\n",
"\n",
"### 3. Start Spark Standalone\n",
"Before Running the script, please setup Spark standalone mode\n",
"\n",
"### 4. Add ENV\n",
"```\n",
"$ export SPARK_JARS=cudf-21.12.2-cuda11.jar,rapids-4-spark_2.12-21.12.0.jar\n",
"$ export SPARK_JARS=cudf-22.02.0-cuda11.jar,rapids-4-spark_2.12-22.02.0.jar\n",
"\n",
"```\n",
"\n",
@@ -160,10 +160,10 @@
"```scala\n",
"import org.apache.spark.sql.SparkSession\n",
"val spark = SparkSession.builder().appName(\"Taxi-GPU\").getOrCreate\n",
"%AddJar file:/data/libs/cudf-21.12.2-cuda11.jar\n",
"%AddJar file:/data/libs/cudf-22.02.0-cuda11.jar\n",
"%AddJar file:/data/libs/xgboost4j_3.0-1.4.2-0.2.0.jar\n",
"%AddJar file:/data/libs/xgboost4j-spark_3.0-1.4.2-0.2.0.jar\n",
"%AddJar file:/data/libs/rapids-4-spark_2.12-21.12.0.jar\n",
"%AddJar file:/data/libs/rapids-4-spark_2.12-22.02.0.jar\n",
"// ...\n",
"```"
]
@@ -19,15 +19,15 @@
"All data could be found at https://www1.nyc.gov/site/tlc/about/tlc-trip-record-data.page\n",
"\n",
"### 2. Download needed jars\n",
"* [cudf-21.12.2-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.2/)\n",
"* [rapids-4-spark_2.12-21.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar)\n",
"* [cudf-22.02.0-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/)\n",
"* [rapids-4-spark_2.12-22.02.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar)\n",
"\n",
"### 3. Start Spark Standalone\n",
"Before running the script, please setup Spark standalone mode\n",
"\n",
"### 4. Add ENV\n",
"```\n",
"$ export SPARK_JARS=cudf-21.12.2-cuda11.jar,rapids-4-spark_2.12-21.12.0.jar\n",
"$ export SPARK_JARS=cudf-22.02.0-cuda11.jar,rapids-4-spark_2.12-22.02.0.jar\n",
"$ export PYSPARK_DRIVER_PYTHON=jupyter \n",
"$ export PYSPARK_DRIVER_PYTHON_OPTS=notebook\n",
"```\n",
@@ -19,15 +19,15 @@
"All data could be found at https://www1.nyc.gov/site/tlc/about/tlc-trip-record-data.page\n",
"\n",
"### 2. Download needed jars\n",
"* [cudf-21.12.2-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/21.12.2/)\n",
"* [rapids-4-spark_2.12-21.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar)\n",
"* [cudf-22.02.0-cuda11.jar](https://repo1.maven.org/maven2/ai/rapids/cudf/22.02.0/)\n",
"* [rapids-4-spark_2.12-22.02.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.02.0/rapids-4-spark_2.12-22.02.0.jar)\n",
"\n",
"### 3. Start Spark Standalone\n",
"Before running the script, please setup Spark standalone mode\n",
"\n",
"### 4. Add ENV\n",
"```\n",
"$ export SPARK_JARS=cudf-21.12.2-cuda11.jar,rapids-4-spark_2.12-21.12.0.jar\n",
"$ export SPARK_JARS=cudf-22.02.0-cuda11.jar,rapids-4-spark_2.12-22.02.0.jar\n",
"\n",
"```\n",
"\n",
@@ -150,10 +150,10 @@
"```scala\n",
"import org.apache.spark.sql.SparkSession\n",
"val spark = SparkSession.builder().appName(\"Taxi-GPU\").getOrCreate\n",
"%AddJar file:/data/libs/cudf-21.12.2-cuda11.jar\n",
"%AddJar file:/data/libs/cudf-22.02.0-cuda11.jar\n",
"%AddJar file:/data/libs/xgboost4j_3.0-1.4.2-0.2.0.jar\n",
"%AddJar file:/data/libs/xgboost4j-spark_3.0-1.4.2-0.2.0.jar\n",
"%AddJar file:/data/libs/rapids-4-spark-21.12.0.jar\n",
"%AddJar file:/data/libs/rapids-4-spark-22.02.0.jar\n",
"// ...\n",
"```"
]
8 changes: 4 additions & 4 deletions examples/Spark-cuML/pca/spark-submit.sh
@@ -15,9 +15,9 @@
# limitations under the License.
#

ML_JAR=/root/.m2/repository/com/nvidia/rapids-4-spark-ml_2.12/21.12.0-SNAPSHOT/rapids-4-spark-ml_2.12-21.12.0-SNAPSHOT.jar
CUDF_JAR=/root/.m2/repository/ai/rapids/cudf/21.12.0-SNAPSHOT/cudf-21.12.0-SNAPSHOT-cuda11.jar
PLUGIN_JAR=/root/.m2/repository/com/nvidia/rapids-4-spark_2.12/21.12.0-SNAPSHOT/rapids-4-spark_2.12-21.12.0-SNAPSHOT.jar
ML_JAR=/root/.m2/repository/com/nvidia/rapids-4-spark-ml_2.12/22.02.0-SNAPSHOT/rapids-4-spark-ml_2.12-22.02.0-SNAPSHOT.jar
CUDF_JAR=/root/.m2/repository/ai/rapids/cudf/22.02.0-SNAPSHOT/cudf-22.02.0-SNAPSHOT-cuda11.jar
PLUGIN_JAR=/root/.m2/repository/com/nvidia/rapids-4-spark_2.12/22.02.0-SNAPSHOT/rapids-4-spark_2.12-22.02.0-SNAPSHOT.jar

$SPARK_HOME/bin/spark-submit \
--master spark://127.0.0.1:7077 \
@@ -39,4 +39,4 @@ $SPARK_HOME/bin/spark-submit \
--conf spark.network.timeout=1000s \
--jars $ML_JAR,$CUDF_JAR,$PLUGIN_JAR \
--class com.nvidia.spark.examples.pca.Main \
/workspace/target/PCAExample-21.12.0-SNAPSHOT.jar
/workspace/target/PCAExample-22.02.0-SNAPSHOT.jar
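The application jar on the last line is presumably produced by a Maven build of the example project; a hedged sketch of that step, assuming the project lives in /workspace and uses a standard Maven layout:

```bash
# Hypothetical build step for the PCA example jar referenced by the script above.
cd /workspace
mvn clean package -DskipTests
ls target/PCAExample-22.02.0-SNAPSHOT.jar
```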
@@ -22,8 +22,8 @@
"import os\n",
"# Change to your cluster ip:port and directories\n",
"SPARK_MASTER_URL = os.getenv(\"SPARK_MASTER_URL\", \"spark:your-ip:port\")\n",
"CUDF_JAR = os.getenv(\"CUDF_JAR\", \"/your-path/cudf-21.12.2-cuda11.jar\")\n",
"RAPIDS_JAR = os.getenv(\"RAPIDS_JAR\", \"/your-path/rapids-4-spark_2.12-21.12.0.jar\")\n"
"CUDF_JAR = os.getenv(\"CUDF_JAR\", \"/your-path/cudf-22.02.0-cuda11.jar\")\n",
"RAPIDS_JAR = os.getenv(\"RAPIDS_JAR\", \"/your-path/rapids-4-spark_2.12-22.02.0.jar\")\n"
]
},
{
