
earthtorob
Posted

Emby uses its cache for different things. I was thinking about dusting off my two iRam drives for Emby's cache, but I don't know if it's worth the trouble. Will I get much of a speed advantage?

 

The drives have 150 MB/s of throughput with practically zero latency. That's great when a system needs to swap around a lot of small temp files.

 

Would pointing Emby's cache at them speed Emby up for me? 

Tur0k
Posted (edited)

I run my server OS and applications on a 120 GB SSD. The CPU and mobo are an old AMD 6-core at 3.2 GHz (OC'd to 3.6), teamed with 8 GB of RAM. The video card is the only thing I'm starting to think it's time to replace; it's an older ATI 6850 that I can't quite work out transcoding on. My data drive is a 4 TB NAS-rated 7200 RPM drive, and I'm planning on picking up 2 more NAS drives for my rig.

My CPU rarely goes above 30% utilization, and RAM usage never tops 3 GB. I run the client on a half-dozen devices at home.

 

 

 

Sent from my iPhone using Tapatalk

Edited by Tur0k
earthtorob
Posted

Impressive rig.

 

Transcoding is not an issue for me. I'm running an old system, and my end players are all Raspberry Pis running Kodi. Kodi plays pretty much everything, and the Pis have the processing power to handle 1080p. The only time I need to transcode something is when I stream to my phone, which isn't often. So my ten-year-old dual core with 2 GB of RAM does fine for serving files.

 

But my page loads are often slow. Reading all that data from an old spindle can take a while, and I'm worried about all the reads and writes on an SSD causing fatigue. Two iRam drives would give me 8 GB of very low-latency storage for the database. That is, if Emby has the option to keep those files wherever I want.

 

Does it?  Should I try it?

earthtorob
Posted

That's what I'm thinking this is all about.

Specify a custom location for server cache files, such as images. Leave blank to use the server default.

pir8radio
Posted (edited)

That's what I'm thinking this is all about.

Specify a custom location for server cache files, such as images. Leave blank to use the server default.

 

That is correct. It's also what I do... run a ramdisk. It works well as long as your ramdisk supports saving data at shutdown; otherwise you have to rebuild all of that info after a reboot. I have a 10 GB ramdisk and have only ever seen 2 GB of "stuff" on it. I have 500 or so movies, 13,000 music videos, and 140,000 MP3s, so you can probably get away with a smaller ramdisk.
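If your ramdisk software doesn't handle the save/restore for you, the idea is simple enough to script. Here's a rough Python sketch of what I mean (the paths are placeholders, not anything Emby-specific):

```python
import os
import shutil

# Placeholder paths: point these at your own ramdisk and a real disk.
RAMDISK_CACHE = "/mnt/ramdisk/emby-cache"
DISK_BACKUP = "/mnt/ssd/emby-cache-backup"

def save_at_shutdown():
    """Copy the in-RAM cache to persistent storage before its contents are lost."""
    if os.path.isdir(DISK_BACKUP):
        shutil.rmtree(DISK_BACKUP)
    shutil.copytree(RAMDISK_CACHE, DISK_BACKUP)

def restore_at_boot():
    """Put the saved copy back into the freshly created (empty) ramdisk."""
    if os.path.isdir(DISK_BACKUP):
        shutil.copytree(DISK_BACKUP, RAMDISK_CACHE, dirs_exist_ok=True)
```

Hook those into your shutdown and startup scripts and Emby won't have to rebuild the cache after every reboot.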

 

EDIT: I just googled what you were talking about; it looks like a piece of hardware (PCI?). My ramdisks use actual system RAM: http://memory.dataram.com/products-and-services/software/ramdisk

 

Using ramdisk software you can get some pretty good ramdisk speeds, even when put up against decent SSDs (benchmark below).

 

[Attachment: ramdisk vs. SSD benchmark screenshot]

Edited by pir8radio
earthtorob
Posted

That is correct. It's also what I do... run a ramdisk. It works well as long as your ramdisk supports saving data at shutdown; otherwise you have to rebuild all of that info after a reboot. I have a 10 GB ramdisk and have only ever seen 2 GB of "stuff" on it. I have 500 or so movies, 13,000 music videos, and 140,000 MP3s, so you can probably get away with a smaller ramdisk.

 

EDIT: I just googled what you were talking about; it looks like a piece of hardware (PCI?). My ramdisks use actual system RAM: http://memory.dataram.com/products-and-services/software/ramdisk

 

Using ramdisk software you can get some pretty good ramdisk speeds, even when put up against decent SSDs (benchmark below).

 

[Attachment: ramdisk vs. SSD benchmark screenshot]

 

 

Oh yeah, a ramdisk would work great... but... I have 2 GB of RAM in my system, and I'm not interested in building a new one. You are correct that the iRam plugs into the PCI bus, but it doesn't transmit data through that bus; it uses SATA, so I get 150 MB/s through each. And it uses DDR RAM, so unlike an SSD I don't have to worry about all those writes or TRIM support.

 

Or maybe I'll just buy a cheap SSD, use it for cache until it starts to slow down, and then replace it.

 

Either way, if that will make my pages load faster, then I'll go that route. Thanks for the info.

Tur0k
Posted (edited)

A good Samsung 750 EVO has a 3-year warranty and an endurance rating of 35 TBW or 70 TBW, depending on size. The 850/950 EVOs go up from there. You could configure two or more SSDs to increase redundancy and support HA. Depending on how extreme you want to go, you could use one for the OS and software and a second for cache. Your other option would be to put them in a RAID 1 or better configuration for redundancy.

At any rate, I would agree that you should keep your OS, software, and cache off of your data drive.

What are the specs and configuration on your current HDD?

 

 

Sent from my iPhone using Tapatalk

Edited by Tur0k
Animosity022
Posted

I would look for two things when building out a rig.

 

I would 100% put my OS and library/cache stuff on an SSD. I just use a 250 GB SSD, as they are pretty cheap these days.

 

Also, make sure you buy something with good single-threaded performance, and think about how many streams you may run.

 

I've found this to be very helpful when trying to figure out a price point and what rig to build:

 

https://www.cpubenchmark.net/singleThread.html

pir8radio
Posted

I would look for two things when building out a rig.

 

I would 100% put my OS and library/cache stuff on an SSD. I just use a 250 GB SSD, as they are pretty cheap these days.

 

Also, make sure you buy something with good single-threaded performance, and think about how many streams you may run.

 

I've found this to be very helpful when trying to figure out a price point and what rig to build:

 

https://www.cpubenchmark.net/singleThread.html

 

Just curious, why the focus on single-threaded performance?

Animosity022
Posted

Just curious, why the focus on single-threaded performance?

 

For transcoding in general, it seems that the codecs etc. limit how many cores get used, so making sure a single core can meet your needs seemed like the more foolproof setup. Scale out the number of cores based on how many friends/family leech off of you.

 

Plex tends to stick to a single core when transcoding as well.

PrincessClevage
Posted

For transcoding in general, it seems that the codecs etc. limit how many cores get used, so making sure a single core can meet your needs seemed like the more foolproof setup. Scale out the number of cores based on how many friends/family leech off of you.

 

Plex tends to stick to a single core when transcoding as well.

In the Emby server settings, isn't the max thread count for transcoding adjustable? (It would be strange to have many threads assigned to just one core.)
Guest asrequested
Posted (edited)

Just spend as much money as you can :D

 

Other than that, if you have an M.2 port on your motherboard, you could hook up an NVMe SSD. The Samsung 960 NVMe M.2 SSDs have a huge transfer rate. But I don't think you'll see much difference in speed between that and a standard SSD, if you're only using it for metadata cache.

Edited by Doofus
Animosity022
Posted

In the Emby server settings, isn't the max thread count for transcoding adjustable? (It would be strange to have many threads assigned to just one core.)

 

ffmpeg, which Emby uses, can use more threads and more cores. Having a higher-clocked CPU would still help.
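Just as a rough illustration of what the encoder-side knob looks like (this is plain ffmpeg invoked by hand, not the exact command line Emby builds):

```python
import subprocess

# Hypothetical example: transcode a file to H.264, capping the encoder at 4 threads.
# "input.mkv" and "output.mp4" are placeholder file names.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264",
    "-threads", "4",    # limit how many threads the video encoder may use
    "-c:a", "copy",     # pass the audio through untouched
    "output.mp4",
], check=True)
```

Even with plenty of threads allowed, a faster core still helps, since not every stage of the pipeline spreads across cores.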

earthtorob
Posted

A good Samsung 750 EVO has a 3-year warranty and an endurance rating of 35 TBW or 70 TBW, depending on size. The 850/950 EVOs go up from there. You could configure two or more SSDs to increase redundancy and support HA. Depending on how extreme you want to go, you could use one for the OS and software and a second for cache. Your other option would be to put them in a RAID 1 or better configuration for redundancy.

At any rate, I would agree that you should keep your OS, software, and cache off of your data drive.

What are the specs and configuration on your current HDD?

 

 

Sent from my iPhone using Tapatalk

 

Those super-high throughput rates aren't as important as I/O operations per second, in my opinion. We are talking about a lot of small files. I'm betting that just moving the server cache to an SSD or the iRam would be enough to give me the speed boost I'm looking for.
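This is the kind of quick-and-dirty Python test I have in mind for comparing drives on small random reads (the path is a placeholder, and the test file needs to be much bigger than your RAM or the OS cache will hide the drive):

```python
import os
import random
import time

# Placeholder path: put a large test file on the drive you want to measure.
PATH = "/mnt/testdrive/testfile.bin"
READ_SIZE = 4096   # 4 KB, roughly the size of the small cache/metadata files
READS = 2000

size = os.path.getsize(PATH)
with open(PATH, "rb") as f:
    start = time.perf_counter()
    for _ in range(READS):
        f.seek(random.randrange(0, size - READ_SIZE))  # jump to a random offset
        f.read(READ_SIZE)                              # read one small chunk
    elapsed = time.perf_counter() - start

print(f"{READS / elapsed:.0f} small reads/sec, "
      f"{elapsed / READS * 1000:.2f} ms average latency")
```

A spindle only manages on the order of a hundred of those per second, while anything RAM-backed should be orders of magnitude faster.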

 

Yeah, I don't think anybody keeps their data on the same drive as their OS. My data is spread across a few drives.

earthtorob
Posted

I would look for two things when building out a rig.

 

I would 100% put my OS and library/cache stuff on an SSD. I just use a 250 GB SSD, as they are pretty cheap these days.

 

Also, make sure you buy something with good single-threaded performance, and think about how many streams you may run.

 

I've found this to be very helpful when trying to figure out a price point and what rig to build:

 

https://www.cpubenchmark.net/singleThread.html

 

I'm using an old dual-core P4. I think it's an old NetBurst core. Really old. And 2 GB of RAM. I serve 1080p video to four players at the same time. My old system was a 533 MHz C3; the whole system ran at 40 watts, and it could serve all my boxes too. I really don't think I need to upgrade my CPU, because I almost never need to transcode.

 

I'll slip in the iRam before I get an SSD. I'll let you all know how it turns out, if anybody is interested. :)

earthtorob
Posted

For transcoding in general, it seems that the codecs etc. limit how many cores get used, so making sure a single core can meet your needs seemed like the more foolproof setup. Scale out the number of cores based on how many friends/family leech off of you.

 

Plex tends to stick to a single core when transcoding as well.

 

Now that is something I'm curious about. Are you accessing Emby remotely? Not just from your phone, but also from DLNA devices at friends' and family's homes? How does that work for you?

earthtorob
Posted

[Attachments: three photos of the corroded iRam cards]
For those who might have had a passing interest, my iRam is dead.
 
At least I assume so. I dare not plug the thing in. Corrosion has had its way with my beloved old hardware. I guess I'll pick up a cheap SSD and use that. Although I still think the low latency of my iRam would have worked great.

Animosity022
Posted

Now that is something I'm curious about. Are you accessing Emby remotely? Not just from your phone, but also from DLNA devices at friends' and family's homes? How does that work for you?

 

I have a friend or two and some family members who leverage my media server. On a normal night I get 1-2 local streams and 2-5 remote. Locally, I have ATV 4th gens in the house, and remote tends to be Rokus with some Xboxes tossed in the mix.

 

I use an Intel® Core i7-4790 CPU @ 3.60 GHz, which seems to do quite well. The majority of content is direct streamed; it's the older devices that tend to transcode.

 

I turn DLNA off as I don't have a need for that.

Chillout
Posted (edited)

I put all my Emby cache, transcoding, and metadata on a pair of RAID 0 SSDs and everything is very quick.

Edited by Chillout
Posted

All of my front-end Emby servers are virtual machines, and each one (3 of them) runs everything from memory (tmpfs/ramdrive) except the media content itself; the only content kept in memory is the "most current" stuff.

 

Metadata = ~5.3 GB (1,050 movies & ~9,478 episodes)

Cache = ~250 MB

Logs = ~1 GB

sqlitedbs (data) = ~400 MB

Transcoded latest (movies and episodes) = ~30 GB

 

In total each server uses about ~37 GB, plus Emby's own usage (which for me these days is about 3 GB after 12 hours of uptime). And it's all fully reserved.

 

Now, truthfully, I haven't been able to get a noticeable (user-visible) difference in load times, buffering, etc. from any of the following setups (meaning they all gave the same end-user experience):

  • Everything running directly from the host: Emby, media, etc. all disk-based and local (ZFS raidz3, 12 drives + 2x SSD for the OS).
  • Everything being served over NFS mounts (everything long-term/stable).
  • Everything being stored/run from tmpfs (except content).
  • Everything being stored/run from SSDs (except content), single, mirrored, and RAID 10.
  • The above setups both in VMs and on real hardware.

That all being said, from the END user's point of view it was all the same. From the server's point of view, and in terms of "stress" on the server, running everything out of RAM is (for me) the best option. It also gives me the fastest load-up times after a crash.

 

Side notes.

  • All Emby data is synced from the backend Emby server to the front ends at boot (metadata, userdata, etc.), or resynced after a crash (see the rough sketch after this list).
  • The metadata is refreshed only when the file server "flags" that the front-end servers need to refresh (because something has been changed/updated).
  • User data is exported from the front ends and imported into the backend server after user logout/timeout.
  • Only the backend server updates the metadata, chapters, auto-arrange, etc.
  • All front-end and backend servers are "named" the same in Emby to allow the folder sync plugin to look in the right directories/locations.
  • All movies and episodes are visible via Samba shares (for local playback) EXCEPT the latest transcoded content, which is only available from the tmpfs on each front-end server.
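For anyone curious what that boot-time sync amounts to, here's a stripped-down sketch (the hostname and paths are made up, and my real setup leans on the folder sync plugin and flag files rather than this exact script):

```python
import subprocess

# Made-up hostname and paths, for illustration only.
BACKEND = "backend-emby:/var/lib/emby/metadata/"
LOCAL = "/var/lib/emby/metadata/"

def pull_metadata_from_backend():
    """Mirror the backend's metadata onto this front end, removing anything stale."""
    subprocess.run(["rsync", "-a", "--delete", BACKEND, LOCAL], check=True)

if __name__ == "__main__":
    pull_metadata_from_backend()
```

The point is just that the front ends never generate metadata themselves; they only ever receive a copy of what the backend produced.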

 

PS: I never report issues, bugs, etc. that I hit with or from these servers, as the setup is 100% nowhere near the defaults and way, way outside the norm. But this setup works extremely well for me. With HAProxy in front of these, I've not only gained a simplified SSL + Emby setup, but also nearly 100% uptime (even when crashes happen) thanks to failover.

 

PPS: My life would be way simpler if Emby supported other DB backends (MariaDB, MSSQL, etc.). Aside from the "possible" additional setup work during install, there would be no downside, provided that persistent connections were used. But the upsides would be wonderful: Emby could become truly/fully multi-threaded, hot backups would be possible, DB clustering becomes an option, and remote DB storage becomes an option.

earthtorob
Posted

I have a friend or two and some family members who leverage my media server. On a normal night I get 1-2 local streams and 2-5 remote. Locally, I have ATV 4th gens in the house, and remote tends to be Rokus with some Xboxes tossed in the mix.

 

I use an Intel® Core i7-4790 CPU @ 3.60 GHz, which seems to do quite well. The majority of content is direct streamed; it's the older devices that tend to transcode.

 

I turn DLNA off as I don't have a need for that.

 

Thanks. But what I mean is, how do your remote users connect? I assume they use the apps when on an Android or iOS device, but how do they typically connect with other devices like Roku and Kodi?

earthtorob
Posted

It sounds like all of you have very impressive setups. I'm running a 10-year-old dual core with 2 GB of RAM.

 

The good news is that my hunch was correct: latency is much more important than throughput. Although my iRam is busted, I decided to try a simple thumb drive. I plugged in an 8 GB thumb drive and pointed the server cache at it. Now my pages load 2 or 3 times faster.

 

I'm working with about 3000 movies and 20,000 episodes.  

 

The cache is populating right now, so I don't know how much data will be transferred yet. Of course I don't expect the thumb drive to last long, but they're cheap, so if I end up having to replace it every 6 months or so, I wouldn't mind.
