
Disk Activity/Latency when running Emby in a VM on Hyper-V



#1 RedBaron164 (Advanced Member, 222 posts)

Posted 07 June 2017 - 10:21 PM

I've been running Emby in a VM on Hyper-V for a couple of years now. Recently I've been trying to resolve some disk activity/latency issues that have been causing delays while loading and moving around in Emby from various clients. I think I've finally found my solution and wanted to share what I learned on the off chance it might help someone else.

 

Long story short: move the cache, metadata, and transcoding-temp directories to a separate virtual hard disk (VHD).

 

I noticed that when an Emby client would take a long time to load or start to time out, disk activity on the server would jump noticeably. I also found that running the disk cleanup task would cause the same kind of usage. While monitoring disk performance in Task Manager and Resource Monitor, I would see disk activity max out, the disk queue length sit around 0.90-1.5, and the average response time for the disk hit 500 ms, all while disk throughput was only being reported at about 500 kbps.
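Those counter readings can be sanity-checked with Little's law, which is exactly how PerfMon derives "Avg. Disk Queue Length" (queue length = transfers/sec × sec/transfer). A back-of-the-envelope sketch, taking the midpoint of the observed figures (reading the reported "500kbps" literally as 500 kbit/s is an assumption):

```python
# Rough sanity check on the observed counters using Little's law:
#   avg_queue_length = iops * avg_response_time

queue_length = 1.2        # midpoint of the observed 0.90 - 1.5
response_time_s = 0.5     # observed ~500 ms average response time

iops = queue_length / response_time_s
print(f"implied I/O rate: {iops:.1f} IOPS")

throughput_bps = 500_000                 # assuming "500kbps" means 500 kbit/s
bytes_per_io = throughput_bps / 8 / iops
print(f"implied transfer size: {bytes_per_io / 1024:.0f} KiB per I/O")
```

The takeaway: the disk was saturated by latency on small transfers (seeks and database/metadata churn), not by bandwidth, which is why moving the hot directories helped even though the throughput numbers looked trivial.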

 

I had two VHDXs on the VM: a fixed-size disk for C: and another fixed-size disk for D:, which is where shows are recorded. The first thing I did to try to improve the situation was reconfigure my Hyper-V host's storage as a RAID 10 array instead of RAID 5. This helped a little, but not significantly, and the issues were still present. I kept monitoring disk usage and noticed that the cache directory was being accessed rather frequently, so I decided to move the Emby cache to a different disk. I added a third 10 GB fixed-size disk and changed the cache path. This helped a little, but I would still see performance issues when Library.db was being accessed heavily. After looking into whether I could move Library.db to a different directory (and finding out it would be rather difficult), I noticed that the metadata directory would show up frequently, so I decided to move that instead. I moved the metadata location to the same drive as the cache, and after refreshing the metadata in my libraries I finally noticed a significant improvement in performance. I also moved the transcoding-temp directory to the third drive, just to cover all my bases.
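The way I spotted the hot directories was just watching Resource Monitor, but the same idea can be scripted: walk the server's data directory and list the files modified in the last few minutes while a slow page load is happening. A minimal sketch (the example path is hypothetical; Emby's actual program-data location varies by install):

```python
import os
import time
from pathlib import Path

def recently_touched(root, within_seconds=300):
    """Return paths under `root` whose mtime falls within the last `within_seconds`."""
    cutoff = time.time() - within_seconds
    hot = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = Path(dirpath) / name
            try:
                if path.stat().st_mtime >= cutoff:
                    hot.append(str(path))
            except OSError:
                pass  # file removed mid-scan; ignore it
    return sorted(hot)

# Hypothetical usage - point it at the server's program-data directory:
# for p in recently_touched(r"C:\Users\Emby\AppData\Roaming\Emby-Server"):
#     print(p)
```

Run it a few times while reproducing the slowdown; directories that keep showing up (cache, metadata, the database files) are the ones worth moving to faster storage.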

 

I originally did not think that moving the cache/metadata directories to a different virtual disk would make a difference, since all the VHDs were on the same physical volume. But doing just that appears to have resolved my issue, and now I regret not doing it sooner. I've been running performance tests since making the changes, and so far I have not been able to reproduce my original performance issues with a variety of clients, including FireTV, Web, and Theater. My Emby clients are much more responsive, and not having to sit around and wait while browsing my music library is refreshing. And if I never see that VolleyError timeout message again, I'll be very happy.

 

I'm going to keep a close eye on my disk utilization and performance for the next few days but wanted to share what I was experiencing and what I did in case anyone else runs into the same issue.

 

Also, on a side note, I did try enabling Quality of Service Management on the virtual disks, but it ended up making the situation worse. I'm also not sure whether this solution applies only to Hyper-V or to VMware as well. I'm not running VMware ESXi at home, so I can't say for certain, but if you are running Emby in VMware and are having a similar issue, maybe this will help you too.



#2 Luke (System Architect, Administrator, 136138 posts)

Posted 07 June 2017 - 10:34 PM

Hi, thanks for the info! If you think there's an Emby issue somewhere, the best thing to do is discuss an example and attach the information requested in How to Report a Problem. Thanks!



#3 happpyg (Advanced Member, 127 posts)

Posted 08 June 2017 - 02:11 AM

Thanks for the detailed post!  

 

I too am running Emby in a Hyper-V VM and have been plagued by random slow page load times (about 3-4 per week when accessing Emby via web 10-15 times a week). I have also noted high disk usage almost identical to your description, particularly on the library.db file and metadata files, as you suggest. Even though the VHD is on a RAID 10 volume with fast SAS disks, it still randomly takes 30-60 seconds to load the home page and occasionally times out.

I have been asked to remove plugins, submit logs, etc., which I did, but I never got anywhere with it, as I could not find a pattern, and I had almost given up.

 

I'll be trying this out and hopefully it resolves the issue!

 

Cheers!



#4 RedBaron164 (Advanced Member, 222 posts)

Posted 08 June 2017 - 09:24 AM

happpyg, on 08 June 2017 - 02:11 AM, said: I too am running Emby in a Hyper-V VM and have been plagued by random slow page load times... I'll be trying this out and hopefully it resolves the issue!

 

I hope my experience helps you. One other thing I noticed last night during testing is that I see a difference in disk utilization depending on where I store the cache VHD. On my Hyper-V host I have Hyper-V Standalone installed to an SSD, while all the VMs are stored either on the local RAID 10 storage array or on my Synology via iSCSI. I tried putting the cache VHD on the SSD and saw additional performance gains and reduced disk activity. I'm still experimenting with the cache VHD on the host's SSD. If that continues to show improvements over having it on the same physical volume as the other Emby VM disks, I may get a small 32/64 GB SSD and use it just to store the cache VHD.



#5 happpyg (Advanced Member, 127 posts)

Posted 08 June 2017 - 07:46 PM

RedBaron164, on 08 June 2017 - 09:24 AM, said: I hope my experience helps you. One other thing I noticed last night during testing is that I do see a difference in disk utilization depending on where I store the cache vhd...

 

Thanks RedBaron, sounds like your setup is similar to mine!  Will see how it goes over the next few weeks.



#6 Swynol (Advanced Member, 1058 posts, Wales, UK)

Posted 09 June 2017 - 07:08 AM

I'm not using Hyper-V, but I'm tempted to move my metadata and cache folders to an SSD to see if it makes a difference.



#7 TimFromFL (Advanced Member, 64 posts)

Posted 02 October 2017 - 05:15 AM

RedBaron164, on 07 June 2017 - 10:21 PM, said: I've been running Emby in a VM on Hyper-V for a couple of years now... Long story short, move the cache/metadata/transcode-temp directory to a separate virtual hard disk (vhd)...

 

@RedBaron164, I have a similar setup to yours and I had a couple of questions. I saw on another thread that you were experimenting with moving the Emby-Server\data folder by using symbolic links and reported some success with it. Is it something you would recommend for others, or was it not worth the trouble? Also, just curious about your Hyper-V host's config: are you still using RAID 10 for your VHD storage, and if so, how many disks? Do you also have an SSD RAID set, or did you just move everything to the SSD that the host is installed on?

 

Thanks In Advance,

 

TimFromFL



#8 RedBaron164 (Advanced Member, 222 posts)

Posted 02 October 2017 - 10:57 AM

I have been using a symbolic link for the data directory since I made this post and have not had any issues with it. It was definitely worth the trouble for me. Since moving the data directory to an SSD, Emby has performed considerably better; night and day, really.

 

As far as a recommendation goes, I think having the entire Emby OS VHD on an SSD is probably the best scenario. But if you're like me and that isn't an option, just moving the data to an SSD worked wonderfully.

 

My Emby VM currently is still sitting on a local Raid 10 volume with 6x 3TB Western Digital Red drives.

 

I do not have an SSD RAID set. I just created the small 10 GB VHD on the host SSD. I do plan on adding another, larger SSD at some point, and when I do, I will move the entire Emby OS VHD to that SSD.
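For anyone curious how the symbolic-link trick works: with the server stopped, the data folder is moved to the fast disk and a link is left at the old path (mklink /D on Windows). The same two steps, sketched in Python with example paths; on Windows, creating the symlink needs admin rights or developer mode:

```python
import os
import shutil
from pathlib import Path

def relocate_with_symlink(src, dst):
    """Move directory `src` to `dst`, then leave a symlink at the old
    location so the application keeps using its original path."""
    src, dst = Path(src), Path(dst)
    shutil.move(str(src), str(dst))                  # move the data to the fast disk
    os.symlink(dst, src, target_is_directory=True)   # old path now points at the new home

# Example (hypothetical paths; stop the Emby service first):
# relocate_with_symlink(r"C:\ProgramData\Emby-Server\data", r"E:\emby-data")
```

The application never notices the change, which avoids reinstalling to a custom location or editing any configuration.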



#9 TimFromFL (Advanced Member, 64 posts)

Posted 06 October 2017 - 06:49 AM

@RedBaron164 As an update, I moved my VHD to a spare SSD that I had and, as you pointed out, that has made a dramatic difference. Ideally I would have liked all of my VHDs to be on my RAID set, but until I get the performance where it needs to be, I have a workable solution. Thanks again for your help.



#10 Lotus503 (Newbie, 2 posts)

Posted 11 October 2017 - 03:24 PM

I run a Server 2008 R2 Hyper-V VM.

 

I have the VM on an SSD, and I moved everything configurable to another drive via disk pass-through. I think I accomplished the same thing, because I don't have issues.



#11 TimFromFL (Advanced Member, 64 posts)

Posted 11 October 2017 - 04:19 PM

Lotus503, on 11 October 2017 - 03:24 PM, said: I have the VM on an SSD, I moved everything configurable to another drive via disk pass-through...

I was seeing all of my disk activity on the database files, and I didn't want to have to uninstall and reinstall to a custom location or use a portable installation. I wanted to keep things in a fault-tolerant setup, so writing directly to a drive was not what I wanted. I now know what to look for, and I plan on ordering some more drives to get the performance that I need, or maybe setting up an SSD RAID set. The big reason I moved everything to a VM was so that every time I had a hardware issue I wasn't starting from scratch rebuilding things. With my current setup, I can survive most single-component failures without an outage.

I do have a question for you, @Lotus503 and @RedBaron164: do either of you use RemoteFX graphics cards in your VMs for hardware-accelerated transcoding?

Edited by TimFromFL, 11 October 2017 - 04:21 PM.


#12 RedBaron164 (Advanced Member, 222 posts)

Posted 13 October 2017 - 11:13 AM

As far as hardware acceleration goes, I do not have a RemoteFX graphics card. I am running on AMD hardware, so I could not use the Intel/NVIDIA tech, but I did enable VAAPI acceleration and saw a noticeable drop in CPU usage when transcoding.



#13 TimFromFL (Advanced Member, 64 posts)

Posted 13 October 2017 - 06:27 PM

RedBaron164, on 13 October 2017 - 11:13 AM, said: As far as Hardware Acceleration I do not have a RemoteFX Graphics card...


I have a mid-range NVIDIA card that should help, so I will try that out. I'll report back my findings so others can benefit.

#14 snake98 (Advanced Member, 113 posts)

Posted 06 November 2017 - 02:28 PM

TimFromFL, on 13 October 2017 - 06:27 PM, said: I have a mid-range NVIDIA card that should help so i will try that out...

Did it help? Do you have an Intel CPU you could try Quick Sync with? I'm interested, as I've had a live TV buffering issue in VMware Workstation on a Skylake 6600K.






