Massive Slowdown Fix (Partial Files)

Just wanted to put something out there in case it’s affecting anyone else. For the past couple of months the Sonarr web UI has been extremely laggy and unresponsive, often hanging outright.

I tried switching back and forth between the master and development channels, reinstalling, restarting services, upgrading Mono, and so on.

Finally, last night I was running it in the console and noticed a strange I/O error go by in purple text. I located some files with the extension .partial~ buried in my media library folders.

I think they were remnants from files that were interrupted during transfer for one reason or another.

I couldn’t actually delete them from my main box; they were shared from my NAS and locked somehow.

I SSH’d into my SMB share box and rm’d the files. Sonarr took off like a racehorse and has been nice and snappy ever since.

As part of the housekeeping routines, you might want to consider a simple check that locates those files and puts up a warning message so they can be dealt with.
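A check like that could be as simple as the following sketch (the media path here is just an illustrative placeholder, not anything Sonarr actually configures):

```shell
#!/bin/sh
# Sketch of a housekeeping check: list any stray .partial~ files
# left behind by interrupted transfers. MEDIA_ROOT is an assumed
# example path; point it at your own library.
MEDIA_ROOT="${MEDIA_ROOT:-/path/to/medialibrary}"
if [ -d "$MEDIA_ROOT" ]; then
  find "$MEDIA_ROOT" -type f -name '*.partial~' -print
fi
```

Anything it prints is a leftover partial that may be holding a lock.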

My setup: Sonarr development channel, July 1st build, running on Linux Mint 17.1. The media library is stored on a WD My Cloud 4TB.

Much happier camper here - hope this helps someone.

Definitely can’t be the cause of slowdowns over ‘the past couple of months’, coz that piece of code wasn’t in Sonarr back then.

But if you still run into .partial~ files, lemme know asap.

Interesting - I would say it’s more likely I had the time period wrong or I may have had something else going on back then.

For sure though - my install is running really smooth now and the only thing I did was delete those files. It was so sluggish I couldn’t even get the UI to come up a lot of times.

I will keep an eye out for more files occurring.

Just a quick followup: since I reported this, I have twice noticed UI slowdown/sluggishness that was alleviated by deleting .partial~ files. These occurred on new downloads. I couldn’t delete the files from my Windows machine over SMB either; I had to SSH in and delete them from my NAS. As soon as I rm’d the file, the Sonarr UI sped up again.

I’ve added this as a cron job on the NAS to automate the removal for now and keep things speedy (note the pattern has to be quoted and the semicolon escaped, or the shell mangles them):
find /DataVolume/shares/medialibrary -type f -name '*.partial~' -exec rm -f {} \;
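If you go this route, a slightly safer variant (my own tweak, not anything Sonarr ships) only removes partials that haven’t been touched for an hour, so it can’t delete a file that’s still mid-transfer:

```shell
#!/bin/sh
# Safer sketch of the cleanup cron job: only remove .partial~ files
# whose mtime is over 60 minutes old, so an in-progress transfer
# survives. The path is from my NAS; adjust to taste.
MEDIA_ROOT="${MEDIA_ROOT:-/DataVolume/shares/medialibrary}"
if [ -d "$MEDIA_ROOT" ]; then
  find "$MEDIA_ROOT" -type f -name '*.partial~' -mmin +60 -exec rm -f {} +
fi
```

Saved as a script, it can go in the NAS crontab with a standard hourly entry like 0 * * * * /path/to/cleanup.sh.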

Version 2.0.0.3316, Mono version 4.0.2 (Stable 4.0.2.5/c99aa0c Wed Jun 24 10:12:58 UTC 2015), running on Linux Mint 17.2. The media library is stored on a WD My Cloud 4TB.

This was in the log 133 times since this morning at 3:00 am, when this file completed downloading.

15-7-12 03:20:51.8|Warn|ImportApprovedEpisodes|Couldn’t import episode /var/downloads/tvshows/Orange.Is.The.New.Black.Season.1.Complete.BDRip.x264-DEMAND/OITNB.S01E09.BDRip.x264-DEMAND.mkv

System.IO.IOException: Lock violation on path /ext/medialibrary/2 Main Vids/1 Television/Orange Is the New Black/Season 01/Orange Is the New Black - S01E09 - Fucksgiving.mkv.partial~
at System.IO.File.Delete (System.String path) [0x00000] in :0
at NzbDrone.Common.Disk.DiskProviderBase.DeleteFile (System.String path) [0x0005e] in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskProviderBase.cs:178
at NzbDrone.Common.Disk.DiskTransferService.TryMoveFile (System.String sourcePath, System.String targetPath) [0x0006d] in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskTransferService.cs:230
at NzbDrone.Common.Disk.DiskTransferService.TransferFile (System.String sourcePath, System.String targetPath, TransferMode mode, Boolean overwrite, Boolean verified) [0x002a9] in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskTransferService.cs:150
at NzbDrone.Core.MediaFiles.EpisodeFileMovingService.TransferFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Tv.Series series, System.Collections.Generic.List`1 episodes, System.String destinationFilePath, TransferMode mode) [0x0010e] in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\EpisodeFileMovingService.cs:118
at NzbDrone.Core.MediaFiles.EpisodeFileMovingService.MoveEpisodeFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Parser.Model.LocalEpisode localEpisode) [0x0005e] in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\EpisodeFileMovingService.cs:80
at NzbDrone.Core.MediaFiles.UpgradeMediaFileService.UpgradeEpisodeFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Parser.Model.LocalEpisode localEpisode, Boolean copyOnly) [0x00119] in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\UpgradeMediaFileService.cs:64
at NzbDrone.Core.MediaFiles.EpisodeImport.ImportApprovedEpisodes.Import (System.Collections.Generic.List`1 decisions, Boolean newDownload, NzbDrone.Core.Download.DownloadClientItem downloadClientItem) [0x0022c] in m:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\EpisodeImport\ImportApprovedEpisodes.cs:89

Trace log please. Of a new download till import.

Are we talking about torrent downloads? I think it doesn’t matter, but still wanna be sure.

Sounds to me like a messy NAS implementation again. Tracelogs will tell more in which stage of the verified-file-transfer it gets messed up.

These were NZB downloads. They don’t happen on every download, just occasionally.

How do I get the tracelog specifically?

I think this is what you want -> logs

The point is, at debug/trace level the whole copy action is logged. The logs you included don’t contain the required detail.

Ok, I understand. I have enabled trace and will update when I get another partial file.

Just an FYI, at Trace level the log files won’t go that far back in time.

@thermodyn Any progress here? I’m anxious to know whether there’s really a problem with verified transfer coz we’re planning on a master release soon.

Hi Taloth - Thanks for keeping an eye on this. Your team’s work is much appreciated.

Unfortunately I haven’t seen a recurrence so far. I’ve been checking every day as well.

Hi Taloth - This still has not recurred, and I don’t see any reports from other users. I will keep tracing and maybe something will happen, but I would go ahead with the master release.

I would suggest adding something to the daily housekeeping tasks to detect these files and inform the user somehow if they are found. If it’s like mine, they may not be able to delete them locally.

Thanks

Oh, forgot to mention: a couple of days prior to the master merge I disabled it, coz I felt it was too risky to merge to master.
I’ll put it back on :smile:

In the same change earlier I also changed the logic somewhat to handle certain errors better. But we’ll see.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.