 

Downloading large file on remote host. Then close ssh connection

Tags:

bash

ssh

wget

I am trying to download a Stack Overflow dump of all posts to a remote server (actually a container on a remote host). As you can imagine, the dump is large (11 GB). I want to start the download and then be able to close my SSH connection to the remote host.

I have looked at tmux, but I find it confusing. I know wget https://archive.org/download/stackexchange/stackoverflow.com-Posts.7z will work, but I would have to stay connected for the duration of the download.

Does anyone know how I can use tmux to solve this problem?

gotthecodes asked Aug 31 '25 22:08



1 Answer

If I've correctly understood your situation, launching the command with nohup will do the trick.

nohup wget https://archive.org/download/stackexchange/stackoverflow.com-Posts.7z &

This prevents the wget process from being killed when the shell terminates: nohup makes it ignore the hangup signal (SIGHUP), and the trailing & runs it in the background so you get your prompt back. Output is appended to nohup.out. You can connect via SSH, run the command, and exit; the download will keep running on its own.
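A slightly more robust variant of the same idea (the -c flag and the wget.log file name are my additions, not part of the original answer): resuming support and an explicit log file make it easier to check on the download after reconnecting.

```shell
# Start the download immune to hangups, in the background,
# resuming (-c) if a partial file already exists.
# wget.log is an arbitrary log file name.
nohup wget -c https://archive.org/download/stackexchange/stackoverflow.com-Posts.7z \
    > wget.log 2>&1 &
echo "wget running as PID $!"
```

After logging back in, tail -f wget.log shows progress, and pgrep -f Posts.7z tells you whether wget is still running.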

By the way: tmux stands for Terminal Multiplexer. It would also work here, since a process started inside a tmux session keeps running after you detach (Ctrl-b d) and disconnect, but nohup is the simpler tool for a one-off download.
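Since the question specifically asked about tmux, here is a minimal sketch of that route (the session name dl is arbitrary): the download runs inside a detached session that survives the SSH disconnect.

```shell
# Start a detached tmux session named "dl" running the download.
tmux new-session -d -s dl \
    'wget -c https://archive.org/download/stackexchange/stackoverflow.com-Posts.7z'
# Later, after reconnecting over SSH, reattach to watch progress:
#   tmux attach -t dl
```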

Ilario Pierbattista answered Sep 03 '25 10:09
