
Calculating Transcode file sizes



JuJuJurassic
Posted

Hi All,

Is anyone able to give an indication of the file sizes of the transcode temporary files created or rather the criteria that would allow me to calculate them? I have limited some of my internet users bandwidth, so pretty well every film they watch is transcoded.

I'm looking at dedicating an M.2 SSD to it; I'm just trying to work out its size.

If it's small, I'd then wonder whether I could share the M.2 with a virtual machine running Sonarr etc. I'm not too sure about that; I might just put in a separate SATA SSD for the VM, as I'll have used up all the M.2 slots once the transcode M.2 goes in.

Any thoughts?

Happy2Play
Posted

It's sort of a hard question, but I would say a good estimate is about the same size as the media being converted, multiplied by the number of users who might be transcoding at the same time.
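That rule of thumb is easy to turn into a quick calculation. A minimal sketch, where the function name and the sample figures (10 GB files, 3 users) are my own assumptions, not anything Emby reports:

```python
def temp_space_estimate_gb(avg_media_gb, concurrent_users):
    # Worst case: every user transcodes a full-size copy of a file at once.
    return avg_media_gb * concurrent_users

print(temp_space_estimate_gb(10, 3))  # 10 GB files, 3 users -> 30
```

It is deliberately pessimistic: it assumes a whole transcoded copy sits on disk per user, which is an upper bound rather than a typical value.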

visproduction
Posted (edited)

It depends on your transcoding settings and how much bandwidth your server can handle. Setting the quality very high makes a larger file. The transcoder, I believe, picks an encoding size that allows for real-time encoding, so that it doesn't take longer than about 0.7 or 0.8 seconds to encode 1 second of media. It runs that test, I think, at the beginning of the transcode, and whatever else is going on with your server determines the encoding speed and, as a result, the file size. I expect the code that handles transcoding steps up and down based on the settings you pick and the server load when transcoding begins.

So, if you are playing back 4K at a very high quality setting, your server tries to do that, and if the speed is fine it will hit that high quality preference and make quite a large file.

If you don't use 4K and everything is 1080p or smaller, then of course the server demand is less, but it will still try to keep the high quality preference you set and make a large 1080p copy that is somewhat smaller than if you did 4K.

If you drop your transcode settings to accept a lower quality, then all conversions can have smaller files.

Part of the problem is transcoding in real time. With a very fast server, the quality can remain high: the file size simply follows your preference settings, and the copy will probably be large. If the server is not so fast and the transcoding can't keep up in real time (for example, if it looks like it will take 2 seconds to convert 1 second of media), then the quality is probably adjusted downward, and playback gets more artifacts, to make sure the server can stay real time.
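The step-down behaviour described above can be sketched in a few lines. To be clear, this is purely illustrative, not Emby's actual code: the preset names, the measured speeds, and the ~0.8 s-per-second budget are assumptions taken from the guess earlier in this post.

```python
def pick_quality(speeds, qualities):
    # speeds: preset -> measured seconds of encode time per second of media.
    # Walk from highest quality down until encoding beats the real-time budget.
    for q in qualities:
        if speeds[q] <= 0.8:   # the ~0.7-0.8 s/s budget guessed above
            return q
    return qualities[-1]       # nothing is fast enough: fall back to lowest

speeds = {"high": 1.6, "medium": 0.9, "low": 0.4}   # assumed measurements
print(pick_quality(speeds, ["high", "medium", "low"]))  # -> low
```

The point of the sketch is just that file size falls out of whichever preset survives the speed test, not from the preset you asked for.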

Again, I have not looked at the code, so I am assuming this based on my experience working on similar code for social media sites.

I think the best answer is for you to try a few examples and look at the results. It is not really possible to predict what your server will do given your network demand and your transcoding preferences.

I can tell you that if you pre-encode using ffmpeg or other third-party software such as Avidemux at a very high quality level (a setting so high it may not be achievable in real time), you can get very nice-looking 1080p .mp4 playback at around 1.2 to 1.5 GB/hour. 4K .mp4 only looks good at around 4 to 6 GB/hour, and at that size the bandwidth to multiple users can start getting too heavy and may cause stuttering. To get this quality in real time you probably need hardware transcoding. Without hardware transcoding, CPU conversion will probably make this media at least twice as large.
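Those GB/hour figures are just bitrate arithmetic, so you can convert between the two. A small sketch (the function name is mine; the 3 Mbps and 12 Mbps inputs are example bitrates I picked to land inside the ranges quoted above):

```python
def mbps_to_gb_per_hour(mbps):
    # 1 Mbit/s = 0.125 MB/s; 3600 seconds per hour; 1000 MB per (decimal) GB.
    return mbps * 0.125 * 3600 / 1000

print(mbps_to_gb_per_hour(3))   # 1.35 GB/hour, inside the 1080p range quoted
print(mbps_to_gb_per_hour(12))  # 5.4 GB/hour, inside the 4K range quoted
```

Running it in reverse, 1.2 to 1.5 GB/hour corresponds to roughly 2.7 to 3.3 Mbps, which is a useful sanity check against any per-user bandwidth cap you have set.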

I would also point out that social media sites encode their videos in advance to reach an acceptable quality before adding them to an online collection. The site owners set the quality preferences, and the user plays back something already pre-encoded.

Other people, who actually use the Emby transcoder, will probably have more accurate info.

Hope that helps.

Edited by visproduction
Posted

I support 3 remote users, all limited to 4 Mbps of bandwidth. I have a 32 GB RAM drive (since the server has 64 GB), which seems to be adequate. I have observed all 3 playing at the same time with no reported issues. They all transcode on the GPU, but the temp files are created in the RAM drive itself.
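A quick sanity check on why a 32 GB RAM drive holds up in that setup. The 2-hour movie length is my assumption; the 4 Mbps cap and 3 users come from the post above:

```python
def stream_size_gb(mbps, hours):
    # Bitrate cap -> on-disk size: 1 Mbit/s = 0.125 MB/s, 3600 s/hour, 1000 MB/GB.
    return mbps * 0.125 * 3600 * hours / 1000

per_movie = stream_size_gb(4, 2)   # ~3.6 GB for one 2-hour film at 4 Mbps
total = 3 * per_movie              # ~10.8 GB if all three play at once
print(per_movie, total)
```

Even if every chunk of all three streams stayed on disk for the whole film, that is about 10.8 GB, comfortably inside 32 GB.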

Posted (edited)

Real-time compression has to contend with being real time, so it generally sacrifices bitrate efficiency to hold quality. Temp files will therefore be larger than the source material, especially if the source was HEVC/H.265, since Emby will convert/stream it as AVC/H.264, which needs a higher bitrate anyway to achieve comparable quality.

I'm pretty sure Emby will now purge files if the temp space is getting low, i.e. it will remove the oldest chunks as required, and it will remove them all once the stream has finished.

Edited by rbjtech
JuJuJurassic
Posted

Thanks very much for your advice, it's much appreciated.

I think, to be safe, I should give 50 GB per user, way more than a 4K transcode would need, but I want to play it safe.

Thank you


Posted (edited)

You should factor in your server or user bitrate limits for the remote playback. The transcoded streams will not exceed the limits for those users and hopefully local playback is rarely transcoded except for live-TV.

Edited by Q-Droid
JuJuJurassic
Posted

So, given that the average media file is 10 GB (measured with ls *.mkv -lR | gawk '{sum += $5; n++;} END {print sum/n;}' on Ubuntu):

If I allocate a bit more than double that to each user I should be safe, so 30 GB per user should be more than enough.
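One caveat on the averaging command: ls *.mkv -lR only expands the glob in the top-level directory, so files in subfolders are missed. A Python sketch that walks the whole tree instead (the library path shown in the comment is a placeholder, not a real path):

```python
from pathlib import Path

def average_size_gb(root, pattern="*.mkv"):
    # rglob visits subdirectories recursively, unlike a plain shell glob.
    sizes = [f.stat().st_size for f in Path(root).rglob(pattern)]
    return sum(sizes) / len(sizes) / 1e9 if sizes else 0.0

# average_size_gb("/path/to/movies")  # placeholder path
```

Feeding the result into the per-user multiple above gives the total M.2 capacity to aim for.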

I'll put in an M.2 to give that capacity.

Thanks for the advice/help.

Much appreciated 🙂 
