Hello, not so long ago I discovered Jellyfin, and by extension Radarr, Sonarr, Bazarr, Jackett, etc.

So I immediately installed Jellyfin on a Hetzner server I already had, which is powerful enough for several people to stream content from it at the same time.

As my hard drives at home were full and I didn’t want to sacrifice my bandwidth, I got a Hetzner Storage Box to store the video content on.

So I’ve got Radarr, Sonarr and qBitTorrent installed on my computer, where I manage the downloads and upload them to the Storage Box, which is then mounted on the server where Jellyfin is installed to serve the content (see the diagram above; the orange part is a possible way to integrate more storage in the future).

Problems:

  • Synchronisation between my local machine and the Hetzner Storage Box: as I said, I download everything locally and then upload it all to the storage with rclone sync. The exact command is:

    rclone sync /mnt/2TB/Jellyfin HetznerBox:/Jellyfin --exclude "*.!qB" --transfers 10 --fast-list --checkers 500 --progress --log-level DEBUG --log-file=/var/log/rclone/rclone-sync-jellyfin.log

    This has several problems. First, it’s slow, even though I have a 1 GB/s fibre connection, and I doubt the problem is on Hetzner’s side… Second, if I delete content on the storage server and run the command again, that content gets re-uploaded (see the dry-run sketch after this list). What’s more, it often happens that I run the command and nothing happens for a long time before the synchronisation starts. Third, rclone sync forces me to keep a local copy of everything.
  • rclone mount does not seem to respect --vfs-cache-max-size: here is the command I use to mount the storage server on the server where Jellyfin is installed:

    rclone mount HetznerBox:/Jellyfin /somepath/Jellyfin/ --vfs-cache-mode full --vfs-cache-max-size 100G --allow-other

    But df -h shows /dev/md2 436G 148G 266G 36% /, while sudo du -sh Jellyfin/ reports 378G Jellyfin/. The server only has Jellyfin installed on it, and I don’t know how to check precisely how much the VFS cache is using (see the cache-check sketch after this list).
  • Since Radarr, Sonarr and everything else are on my computer, if I install Jellyseerr on the same server as Jellyfin (which would be perfect for managing everything once we have several users), it won’t be able to communicate with Radarr and Sonarr.
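
A quick way to see why a re-run of the sync starts transferring again is to preview it with --dry-run, which lists what rclone sync would upload or delete without actually doing anything (same paths and exclude as in the command above):

    rclone sync /mnt/2TB/Jellyfin HetznerBox:/Jellyfin --exclude "*.!qB" --dry-run --progress

Since sync mirrors the source onto the destination, anything deleted only on the Storage Box but still present locally will always be uploaded again on the next run.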
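
As for the VFS cache, du -sh on the mount point measures the remote content, not local disk usage, so it says nothing about the cache. rclone keeps the cache under its cache directory, which defaults to ~/.cache/rclone (or /root/.cache/rclone if the mount runs as root); assuming that default and the remote name above, a sketch of how to measure it:

    # size of the VFS cache kept for the HetznerBox remote (default cache location)
    du -sh ~/.cache/rclone/vfs/HetznerBox/

Note that the cache is only trimmed back towards --vfs-cache-max-size periodically (every --vfs-cache-poll-interval, 1 minute by default), and files that are still open are not evicted, so it can temporarily overshoot the limit.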

Possible solutions:

  • Don’t use my local machine at all: I’d like to run Radarr, Sonarr, qBitTorrent, etc. on the Hetzner server, but I don’t want to store the data on that server, I want it on the Storage Box. How can I download directly from one server to the other? (see the first sketch after this list)
  • Could I use Mullvad VPN only for qBitTorrent downloads (and Jackett) on a Hetzner server, or is that too complicated or foolish? (a possible approach is sketched after this list)
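
On downloading from one server to the other: one pattern that keeps the data off the Hetzner server’s disk in the long run (a sketch only, the download path here is made up) is to let qBitTorrent on the server write to a local folder, and to move finished, imported files to the Storage Box with rclone, for example from a cron job:

    # move completed media from the server's local disk to the Storage Box,
    # skipping qBitTorrent's incomplete files and cleaning up empty folders afterwards
    rclone move /srv/downloads/complete HetznerBox:/Jellyfin --exclude "*.!qB" --transfers 4 --delete-empty-src-dirs --progress

Jellyfin then keeps reading from the rclone mount of HetznerBox:/Jellyfin exactly as it does now.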
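
On Mullvad: routing only qBitTorrent (and Jackett) through the VPN is a reasonable idea rather than a foolish one. One common pattern, sketched here with Docker and the gluetun VPN container (the WireGuard key and address are placeholders you would take from your Mullvad account):

    # gluetun provides the Mullvad WireGuard tunnel inside its own network namespace
    docker run -d --name gluetun --cap-add=NET_ADMIN \
      -e VPN_SERVICE_PROVIDER=mullvad -e VPN_TYPE=wireguard \
      -e WIREGUARD_PRIVATE_KEY=<your key> -e WIREGUARD_ADDRESSES=<your address>/32 \
      -p 8080:8080 qmcgaw/gluetun

    # qBitTorrent shares gluetun's network stack, so only its traffic goes through Mullvad
    docker run -d --name qbittorrent --network=container:gluetun \
      -e WEBUI_PORT=8080 lscr.io/linuxserver/qbittorrent

Jackett can be attached the same way with --network=container:gluetun, while Jellyfin and everything else stay on the server’s normal connection.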

Edit:

Diagram updated to better match my configuration:

[diagram]

  • the Hetzner servers are in Finland
  • I tried again overnight: rclone mount works fine, it’s just extremely slow
  • downloads of “finished” torrents and those “in progress” are already separated

Reply from @killabeezio@lemm.ee:

    Not sure where you got the idea that it’s not advisable to mount the box via NFS. You can totally do this. I would make some adjustments though.

    I would use mergerfs to union multiple mounts into one. You would download to the local mount, which is the drive attached directly to your seedbox, and keep a second, remote mount pointing at the NFS share. Merge these into one so that when you hook up Jellyfin it won’t know the difference and you can just stream like normal.
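
    For illustration, a mergerfs union along these lines (the mount points here are made up, not taken from the actual setup) presents the local disk and the remote mount as a single tree that Jellyfin and the *arrs can be pointed at:

        # /mnt/local = the seedbox's own disk, /mnt/remote = the mounted Storage Box
        mergerfs -o allow_other,category.create=ff /mnt/local:/mnt/remote /mnt/media

    With category.create=ff new files land on the first branch (the local disk), and files that are later moved to the remote stay visible at the same path under /mnt/media.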

    You then need to copy files from the local drive to the remote. You can roll your own solution with rclone, or use something like Cloudplow, which solves this as well. Cloudplow uses rclone under the hood, but it monitors for changes automatically.

    As for copying files, why are you using sync anyway? It’s pretty dangerous. Just use move or copy instead. That way you don’t need to keep copies on both your computer and the server.
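
    For example, a move with the same exclude as the original sync command uploads everything and then deletes the local copies, so nothing has to be kept on both sides (paths taken from the post above, adjust as needed):

        rclone move /mnt/2TB/Jellyfin HetznerBox:/Jellyfin --exclude "*.!qB" --transfers 10 --delete-empty-src-dirs --progress

    rclone copy does the same upload but leaves the local files in place; unlike sync, neither of them deletes anything on the destination.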

    As for streaming from the NFS mount, you may need to adjust the cache settings and make sure they are set correctly.

    With a setup like that, you should have no problems though.