I want to check how much free memory each node has, simultaneously. The code I run is as follows:
get.remote.free.mem <- function(ssh.session) {
  out <- ssh::ssh_exec_internal(ssh.session, command = 'cat /proc/meminfo')
  # The third line of /proc/meminfo is MemAvailable (in kB); convert to GiB
  mem <- floor(as.numeric(stringr::str_extract(sys::as_text(out$stdout)[3], '\\d+')) / 1024 / 1024)
  return(mem)
}
# SSH.SESSION is a list, each item of which is an SSH session
free.mem.list <- parallel::mclapply(SSH.SESSION, get.remote.free.mem, mc.cores = length(SSH.SESSION))
I found that this mclapply call runs normally only once. After I get the free memory, re-running the last statement (parallel::mclapply) raises "Error: libssh failure at 'ssh_channel_open_session': Socket error: disconnected".
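A likely cause (not confirmed here) is that mclapply forks the R process, so each child inherits the parent's libssh socket; when the children exit, the shared connection state is torn down and the parent's sessions are left unusable. A common workaround for fork-unsafe handles is to open the connection inside each worker rather than sharing sessions across forks. The sketch below assumes a hypothetical `HOSTS` character vector (e.g. `c("user@node1", "user@node2")`) in place of the pre-built `SSH.SESSION` list:

```r
# Sketch: connect inside each forked worker so no libssh socket is
# shared between processes. HOSTS is a hypothetical vector of
# "user@host" strings; credentials/keys are assumed to be set up.
get.remote.free.mem <- function(host) {
  session <- ssh::ssh_connect(host)        # fresh connection in this child only
  on.exit(ssh::ssh_disconnect(session))    # tear down this child's own session
  out <- ssh::ssh_exec_internal(session, command = 'cat /proc/meminfo')
  # Third line of /proc/meminfo is MemAvailable (kB); convert to GiB
  floor(as.numeric(stringr::str_extract(sys::as_text(out$stdout)[3], '\\d+')) / 1024 / 1024)
}

free.mem.list <- parallel::mclapply(HOSTS, get.remote.free.mem, mc.cores = length(HOSTS))
```

This trades per-call connection overhead for correctness: each fork owns its session end to end, so re-running the mclapply line should not disturb any other process's socket.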
yaotianran changed the title from "SSH session automatically disconnect after parallel programming" to "SSH session automatically disconnects after parallel programming" on Jul 24, 2021.