Some kind of weird loop behaviour

Hi all,

I recently switched my setup from Windows to Linux and Docker. I'm experiencing a weird issue which I hope you are able to help me with.

Somehow when Sonarr (or Radarr) downloads something, after downloading it copies a ".partial~" file to the TV Shows folder. The file keeps going from 0 KB to 1 GB to 0 KB to 1 GB, and so on. I decided to turn on trace logging and I can see it's stuck in some kind of loop, processing the same episode over and over again.

Here is the link to the trace logging:
https://pastebin.com/raw/4vgAN64R

Thank you in advance.

Kind regards,
Hofboss

18-4-2 23:16:17.8|Trace|DiskTransferService|Removing old partial.
18-4-2 23:16:17.8|Trace|DiskProviderBase|Deleting file: /tv/Into the Badlands/S01E01 - The Fort.mkv.partial~
18-4-2 23:16:17.8|Trace|DiskTransferService|Attempting to move hardlinked backup.
18-4-2 23:16:25.1|Debug|ProcessProvider|Found 0 processes with the name: NzbDrone.Console
18-4-2 23:16:25.1|Debug|ProcessProvider|Found 1 processes with the name: NzbDrone

It looks a lot like Sonarr is crashing while trying to delete the file. Capturing the console output to a file may give more insight into the issue.
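For a Docker-based install, one way to capture that console output is to save the container's log stream to a file. This is only a sketch: the container name `sonarr` and the output filename are assumptions and may differ in your setup.

```shell
# Follow the container's console output and save it to a file.
# stderr is redirected as well, since crash messages often go there.
# Assumes the container is named "sonarr".
docker logs -f sonarr > sonarr-console.log 2>&1
```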

Hi @markus101,

Thanks for your reply. I've copied the logs from the Docker container:
https://pastebin.com/raw/dcUjS1hZ

Let me know what you think; I can't find much in it.

EDIT: I forgot to mention that I'm running Ubuntu on VMware. Docker containers run inside Ubuntu; one of these containers is Sonarr and another is SABnzbd.
Also, the data (media) folders are mounted via NFS. I have full read/write permissions on the NFS shares.
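To double-check those permissions from the host side, a quick write test as the same UID the container runs under can help. This is a sketch under assumptions: the `/tv` mount path and UID 1000 (matching `PUID=1000` from the container setup) may differ on your system.

```shell
# Try creating and deleting a file on the NFS mount as UID 1000,
# the same user the container is configured to run as.
# The /tv path and UID are assumptions from this setup.
sudo -u '#1000' sh -c 'touch /tv/.writetest && rm /tv/.writetest' \
  && echo "read/write OK"
```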

EDIT2: I tried 2 more things:

  1. Using the Drone Factory --> same bad behaviour
  2. Copying the files by hand --> works fine

EDIT3: I tried setting up another NFS server (this time on Ubuntu instead of Windows Server). Same behaviour. I think it's safe to say that it's not an issue with the filesystem/shares.

Kind regards,
Hofboss

What about a local filesystem instead of one backed by NFS, just as a test?

Exactly the same behaviour. All data is now on the docker host itself (so no more network connections).

EDIT: I also tried removing the configuration (starting with a fresh one). No luck here either (same behaviour).

PS. I’m using this image: https://hub.docker.com/r/linuxserver/sonarr/

Alright, I finally found the issue. This is the command I used to create the container:
docker create --name=sonarr --cpus=4 --memory=256MB --restart unless-stopped --net=eth1macvlan --ip=192.168.x.x -e PUID=1000 -e PGID=1000 -v /etc/localtime:/etc/localtime:ro -v "/path/to/config":/config -v "/path/to/tvshows":/tv -v "/path/to/downloads":/downloads/tv linuxserver/sonarr

But apparently 256 MB is not enough memory (although it never goes above 180 MB of usage from what I can see). I changed it to 2 GB (for testing) and the issues went away.
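For reference, the memory limit can also be raised on an existing container without recreating it. A sketch, assuming the container is named `sonarr` as in the create command above:

```shell
# Raise the memory limit to 2 GB in place. When a swap limit is set,
# --memory-swap must be at least as large as --memory.
docker update --memory 2g --memory-swap 2g sonarr

# Show current usage against the new limit to confirm it took effect.
docker stats --no-stream sonarr
```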

Maybe it's an idea to add some out-of-memory exception handling, and write to the log files when Sonarr runs OOM?

Thanks for your support guys!
