
Commit

Add missing Node help. Fix some help typos.
cregouby committed Jul 20, 2024
1 parent 1943812 commit 2fa5868
Showing 9 changed files with 16 additions and 5 deletions.
2 changes: 1 addition & 1 deletion DESCRIPTION
@@ -66,5 +66,5 @@ Config/testthat/parallel: false
Config/testthat/start-first: interface, explain, params
Encoding: UTF-8
Roxygen: list(markdown = TRUE)
-RoxygenNote: 7.3.1
+RoxygenNote: 7.3.2
Language: en-US
4 changes: 4 additions & 0 deletions R/hardhat.R
@@ -8,6 +8,8 @@
#' * A __matrix__ of predictors.
#' * A __recipe__ specifying a set of preprocessing steps
#' created from [recipes::recipe()].
+#' * A __Node__ where the tree will be used as the hierarchical outcome,
+#'   and the node attributes will be used as predictors.
#'
#' The predictor data should be standardized (e.g. centered or scaled).
#' The model treats categorical predictors internally thus, you don't need to
@@ -244,6 +246,8 @@ new_tabnet_fit <- function(fit, blueprint) {
#' * A __matrix__ of predictors.
#' * A __recipe__ specifying a set of preprocessing steps
#' created from [recipes::recipe()].
+#' * A __Node__ where the tree will be used as the hierarchical outcome,
+#'   and the node attributes will be used as predictors.
#'
#' The predictor data should be standardized (e.g. centered or scaled).
#' The model treats categorical predictors internally thus, you don't need to
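For context (illustration only, not part of this commit), a minimal sketch of how the documented input types are passed to tabnet_fit(); the recipe and data-frame calls follow the interface described above, while the Node call and the name my_node are assumptions based on the new bullet:

library(tabnet)
library(recipes)

# Recipe interface: preprocessing steps built with recipes::recipe()
rec <- recipe(Species ~ ., data = iris) %>%
  step_normalize(all_numeric_predictors())
fit_rec <- tabnet_fit(rec, data = iris, epochs = 1)

# Data frame of predictors plus an outcome vector
fit_df <- tabnet_fit(iris[, -5], iris$Species, epochs = 1)

# Node interface (assumed shape): the data.tree hierarchy is the outcome,
# the node attributes are the predictors
# fit_node <- tabnet_fit(my_node, epochs = 1)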
2 changes: 1 addition & 1 deletion R/model.R
@@ -83,7 +83,7 @@ resolve_data <- function(x, y) {
#' @param learn_rate initial learning rate for the optimizer.
#' @param optimizer the optimization method. currently only `"adam"` is supported,
#' you can also pass any torch optimizer function.
-#' @param valid_split (`[0, 1)`) The fraction of the dataset used for validation.
+#' @param valid_split In \[0, 1). The fraction of the dataset used for validation.
#' (default = 0 means no split)
#' @param num_independent Number of independent Gated Linear Units layers at each step of the encoder.
#' Usual values range from 1 to 5.
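As an aside (not part of the diff), the training parameters documented above are typically supplied through tabnet_config(); a hedged sketch with arbitrary values:

library(tabnet)

cfg <- tabnet_config(
  epochs      = 1,      # short run, illustrative only
  valid_split = 0.2,    # in [0, 1); 0 (the default) means no validation split
  learn_rate  = 0.005,  # initial learning rate for the optimizer
  optimizer   = "adam"  # currently the only built-in choice; a torch
                        # optimizer function can also be passed
)
fit <- tabnet_fit(Species ~ ., data = iris, config = cfg)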
2 changes: 1 addition & 1 deletion R/tab-network.R
@@ -418,7 +418,7 @@ tabnet_no_embedding <- torch::nn_module(
#' @param n_shared Number of shared GLU layer in each GLU block of the encoder.
#' @param epsilon Avoid log(0), this should be kept very low.
#' @param virtual_batch_size Batch size for Ghost Batch Normalization.
-#' @param momentum Float value between 0 and 1 which will be used for momentum in all batch norm.
+#' @param momentum Numerical value between 0 and 1 which will be used for momentum in all batch norm.
#' @param mask_type Either "sparsemax" or "entmax" : this is the masking function to use.
#' @export
tabnet_nn <- torch::nn_module(
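For orientation (again, not part of the commit), a sketch of how the batch-norm momentum and masking function documented above are usually set from the high-level API via tabnet_config(), assuming those arguments are forwarded to tabnet_nn; the values are illustrative:

library(tabnet)

cfg <- tabnet_config(
  epochs             = 1,
  virtual_batch_size = 256,      # Ghost Batch Normalization batch size
  momentum           = 0.02,     # batch-norm momentum, between 0 and 1
  mask_type          = "entmax"  # or the default "sparsemax"
)
fit <- tabnet_fit(Species ~ ., data = iris, config = cfg)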
2 changes: 1 addition & 1 deletion man/tabnet_config.Rd


2 changes: 2 additions & 0 deletions man/tabnet_fit.Rd


2 changes: 1 addition & 1 deletion man/tabnet_nn.Rd


2 changes: 2 additions & 0 deletions man/tabnet_pretrain.Rd


3 changes: 3 additions & 0 deletions tests/spelling.R
@@ -0,0 +1,3 @@
+if(requireNamespace('spelling', quietly = TRUE))
+  spelling::spell_check_test(vignettes = TRUE, error = FALSE,
+                             skip_on_cran = TRUE)
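The same check can also be run interactively during development; a small sketch, assuming the spelling package is installed:

# List potential spelling issues in help files and vignettes
spelling::spell_check_package(vignettes = TRUE)
# Whitelist legitimate terms flagged above (writes them to inst/WORDLIST)
spelling::update_wordlist()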

0 comments on commit 2fa5868
