Updates #30

Merged 3 commits on Sep 11, 2023
1 change: 1 addition & 0 deletions NAMESPACE
@@ -129,6 +129,7 @@ importFrom(sparklyr,spark_write_text)
importFrom(tidyr,pivot_longer)
importFrom(tidyselect,matches)
importFrom(tidyselect,tidyselect_data_has_predicates)
importFrom(utils,compareVersion)
importFrom(utils,head)
importFrom(utils,type.convert)
importFrom(vctrs,vec_as_names)
2 changes: 1 addition & 1 deletion R/install-pyspark.R
@@ -68,7 +68,7 @@ install_pyspark <- function(envname = "r-sparklyr",
method = method,
python_version = python_version,
pip = TRUE,
...
... = ...
)
}

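For context on the `... = ...` change above: a minimal sketch (the helper names here are hypothetical, not from the package) of forwarding dots through a wrapper, spelling the forwarded dots explicitly as `... = ...`, as the diff does:

```r
# Hypothetical wrapper mirroring the install_pyspark() pattern:
# extra named arguments flow through `...` to the inner installer.
installer <- function(envname, pip = FALSE, ...) {
  extras <- list(...)
  sprintf("env=%s pip=%s extras=%s",
          envname, pip, paste(names(extras), collapse = ","))
}

install_wrapper <- function(envname = "r-sparklyr", ...) {
  # `... = ...` names the dots argument explicitly when forwarding
  installer(envname, pip = TRUE, ... = ...)
}

install_wrapper(pip_ignore_installed = TRUE)
# `extras` inside installer() now carries pip_ignore_installed
```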
7 changes: 7 additions & 0 deletions R/local-connect.R
@@ -1,17 +1,24 @@
#' Starts Spark Connect locally
#' @param version Spark version to use (3.4 or above)
#' @param scala_version Acceptable Scala version of packages to be loaded
#' @param include_args Flag that indicates whether to add the additional arguments
#' to the command that starts the service. At this time, only the 'packages'
#' argument is submitted.
#' @param ... Optional arguments; currently unused
#' @export
spark_connect_service_start <- function(version = "3.4",
scala_version = "2.12",
include_args = TRUE,
...) {
get_version <- spark_install_find(version = version)
cmd <- path(get_version$sparkVersionDir, "sbin", "start-connect-server.sh")
args <- c(
"--packages",
glue("org.apache.spark:spark-connect_{scala_version}:{get_version$sparkVersion}")
)
if (!include_args) {
args <- ""
}
prs <- process$new(
command = cmd,
args = args,
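Following the diff above, the `--packages` coordinate passed to `start-connect-server.sh` resolves as below (the Spark version shown is an illustrative stand-in; the real value comes from `spark_install_find()`):

```r
library(glue)

scala_version <- "2.12"
spark_version <- "3.4.0"  # illustrative stand-in for get_version$sparkVersion

args <- c(
  "--packages",
  glue("org.apache.spark:spark-connect_{scala_version}:{spark_version}")
)
# args[2] resolves to "org.apache.spark:spark-connect_2.12:3.4.0"
```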
2 changes: 1 addition & 1 deletion R/package.R
@@ -17,7 +17,7 @@
#' @importFrom rlang is_string is_character as_utf8_character parse_exprs
#' @importFrom methods new is setOldClass
#' @importFrom tidyselect matches
#' @importFrom utils head type.convert
#' @importFrom utils head type.convert compareVersion
#' @importFrom tidyr pivot_longer
#' @importFrom vctrs vec_as_names
#' @importFrom processx process
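The newly imported `utils::compareVersion()` (added to both `NAMESPACE` and the roxygen block above) compares two period-separated version strings component by component. A quick illustration:

```r
# utils::compareVersion() returns -1, 0, or 1
utils::compareVersion("3.4", "3.5")  # -1: first version is older
utils::compareVersion("3.5", "3.5")  #  0: versions are equal
utils::compareVersion("3.5", "3.4")  #  1: first version is newer
```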
11 changes: 10 additions & 1 deletion man/spark_connect_service_start.Rd

Some generated files are not rendered by default.