Merge pull request #19 from mlverse/updates
Reverts env name change,
edgararuiz authored Aug 18, 2023
2 parents 415d092 + dfc9e88 commit 69d6ae5
Showing 3 changed files with 7 additions and 7 deletions.
6 changes: 3 additions & 3 deletions R/install.R
@@ -1,6 +1,6 @@
 #' Installs python dependencies
 #' @param envname The name of the Python Environment to use to install the
-#' Python libraries. Defaults to "recent".
+#' Python libraries. Defaults to "r-sparklyr".
 #' @param python_version The version of Python to use to create the Python
 #' environment.
 #' @param new_env If `TRUE`, any existing Python virtual environment and/or
@@ -11,10 +11,10 @@
 #' specified by `envname`.
 #' @param ... Passed on to [`reticulate::py_install()`]
 #' @export
-install_pyspark <- function(envname = "recent",
+install_pyspark <- function(envname = "r-sparklyr",
                             ...,
                             python_version = ">=3.9",
-                            new_env = identical(envname, "recent"),
+                            new_env = identical(envname, "r-sparklyr"),
                             method = c("auto", "virtualenv", "conda")) {
   packages <- c(
     "pyspark",
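With the reverted default, a plain call once again installs into the `r-sparklyr` virtual environment. A minimal usage sketch (assuming the package exporting `install_pyspark` is attached; the custom environment name below is illustrative, not part of this diff):

```r
# Sketch only: install PySpark into the default "r-sparklyr" environment.
# Because new_env defaults to identical(envname, "r-sparklyr"), this call
# recreates the default environment before installing.
install_pyspark()

# Target a custom environment instead (name is illustrative). Here
# new_env evaluates to FALSE, so an existing environment with this
# name is reused rather than recreated.
install_pyspark(envname = "my-spark-env", python_version = ">=3.9")
```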
2 changes: 1 addition & 1 deletion R/spark-connect.R
@@ -45,7 +45,7 @@ py_spark_connect <- function(master,
                              token = Sys.getenv("DATABRICKS_TOKEN"),
                              cluster_id = NULL,
                              method = "",
-                             virtualenv_name = "recent",
+                             virtualenv_name = "r-sparklyr",
                              spark_version = NULL,
                              databricks_connect_version = NULL,
                              config = list()) {
6 changes: 3 additions & 3 deletions man/install_pyspark.Rd

Some generated files (man/install_pyspark.Rd) are not rendered by default.
