Using ASS Subtitles Causes Every Video Trouble To Load



NeverExists
Posted
On 22/12/2021 at 11:19, cayars said:

Been on the lookout but haven't received a PM from you yet.
Did you forget to send me the link or just waiting for the upload to finish?

I sent it right after my last post. It says you haven't read it yet.

Posted

Not sure how I missed it; just looking now I didn't see it either. Then I searched by name and found it. Strange, I probably looked right past it.

It's downloading now.

NeverExists
Posted
1 minute ago, cayars said:

Not sure how I missed it […]

Sweet! Thanks for taking a look. Keep me updated on your findings, OK?

NeverExists
Posted
10 minutes ago, cayars said:

Not sure how I missed it […]

Also, I am using the Nvidia Shield to stream my content from my laptop.

Posted

It's likely the fact that it's a 10-bit AVC/H.264 file.

Testing on my i7 notebook running both the server and the web client.

This is with the option Transcoding -> Allow subtitle extraction on the fly not enabled, played back in the browser.
The server was transcoding it, but I found playback high quality and smooth with 2 dropped frames (which always happens).

After enabling Allow subtitle extraction on the fly, I can direct play this in either Chrome or Edge.

I can direct play this in Chrome with 0 dropped frames if no ASS subs are showing.
If I turn on ASS subs it still direct plays, but I get a lot of dropped frames.
Playing back from the start with ASS showing, at the one-minute mark: 312 dropped frames. That of course gives a jerky look.

[screenshot: Chrome stats for nerds at the one-minute mark]

In Microsoft Edge it plays smoother for me, but still with dropped frames.

[screenshot: Edge stats for nerds at the one-minute mark]

What client are you using to play back with?
What do you get in stats for nerds, like I showed above, at the 1-minute mark?

NeverExists
Posted
2 minutes ago, cayars said:

It's likely the fact that it's a 10-bit AVC/H.264 file. […]

My previous post stated what I am using: the Emby Nvidia Shield app, streaming from my gaming laptop. It's the exact same setup I had before, when it was working fine. I switched to Plex for 2 or 3 months, and when I came back to Emby the playback was a lot worse with ASS subtitles.

Posted

How are they playing on the Shield, according to stats for nerds?
How is this option set on your computer: Transcoding -> Allow subtitle extraction on the fly?
What do your H.264 encode and decode settings look like?

NeverExists
Posted (edited)

I don't know what the stats are and don't know how to find them. Like I said before, I have my settings the exact same way I had them when everything was playing smoothly. The only thing different from when it was playing fine is that I updated the server and the Emby Nvidia Shield app. I have subtitle extraction on the fly checked. I was not getting a bad framerate before, because everything was playing smoothly. Now it's a choppy mess.

Edited by NeverExists
NeverExists
Posted (edited)
5 hours ago, cayars said:

How are they playing on the Shield according to stats for nerds?
How is this option set on your computer, Transcoding->Allow subtitle extraction on the fly ?
What do your H.264 encode and decode settings look like?

Here you go. I don't know how to get stats for nerds with nvidia shield

Transcoding 1.PNG

Transcoding 2.PNG

Edited by NeverExists
Posted
4 hours ago, NeverExists said:

Here you go. I don't know how to get stats for nerds with nvidia shield

Click the cog icon during playback and select stats for nerds.

NeverExists
Posted
7 hours ago, cayars said:

Click the cog icon during playback and select stats for nerds.

Here you go 

20211225_100107.jpg

Posted

Hi, please attach the emby server log from when you did that. Thanks.

Posted

I re-read the thread. To sum up: the media is AVC 10-bit encoded, but not just that; it uses a different color space as well. This isn't something many devices, if any, are going to support in hardware. Outside the world of anime you would rarely see this format; instead you'd see proper-spec H.265 with that color space and 10-bit encoding, which is supported by most hardware.
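For anyone wanting to confirm whether a file is this 10-bit H.264 ("Hi10P") variant, here's a rough sketch using ffprobe. The filename is a placeholder, and the guard just makes the command a no-op where ffprobe or the file isn't present:

```shell
# Sketch: identify 10-bit H.264 ("Hi10P"); video.mkv is a hypothetical filename.
# Guarded so the probe is skipped where ffprobe or the file is missing.
if command -v ffprobe >/dev/null 2>&1 && [ -f video.mkv ]; then
  ffprobe -v error -select_streams v:0 \
    -show_entries stream=codec_name,profile,pix_fmt \
    -of default=noprint_wrappers=1 video.mkv
  # Hi10P typically reports: codec_name=h264, profile=High 10, pix_fmt=yuv420p10le
else
  echo "skipped: ffprobe or video.mkv not available"
fi
```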

In re-reading this, it sounds like you found a combination of server settings that does work and plays back smoothly, except you had a color output issue. Softworkz has said he already found the issue and fixed it for the next beta. It looks like you tried the other combinations and did get it to also work with Nvidia, but didn't like the quality compared to QuickSync, which is no surprise as that's an older-generation GPU. IMHO, Nvidia GPUs starting with the Turing architecture are when hardware transcoding got really good.

Short of purchasing a new-generation Nvidia GPU or converting the files to H.265 10-bit, there doesn't appear to be anything to do except wait for the next beta that will have the fix in it. Then we can re-evaluate and see how the fix works for you.
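As a rough sketch of that conversion route (the filenames and the CRF value are assumptions; ffmpeg decodes the Hi10P in software and writes hardware-friendly HEVC Main10):

```shell
# Sketch: re-encode 10-bit H.264 to HEVC Main10, which most modern GPUs can decode.
# input-hi10p.mkv / output-hevc.mkv are placeholder names; CRF 20 is an assumed
# quality target, so tune to taste. Guarded to no-op without ffmpeg or the input.
if command -v ffmpeg >/dev/null 2>&1 && [ -f input-hi10p.mkv ]; then
  ffmpeg -i input-hi10p.mkv -map 0 \
    -c:v libx265 -profile:v main10 -pix_fmt yuv420p10le -crf 20 \
    -c:a copy -c:s copy output-hevc.mkv
else
  echo "skipped: ffmpeg or input-hi10p.mkv not available"
fi
```

`-map 0` with `-c:a copy -c:s copy` keeps all audio tracks and the ASS subtitle streams untouched; only the video is re-encoded.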

 

NeverExists
Posted
4 hours ago, cayars said:

I re-read the thread. […]

Old GPU? My laptop has a 1050 Ti; I bought it brand new 2 years ago.

pwhodges
Posted
2 hours ago, NeverExists said:

Old gpu? My laptop had a 1050ti I bought it brand new 2 years ago.

Old = "not current generation" in this case. In any case, it doesn't handle 10-bit H.264, as stated. Although they say this is mainly found in anime files acquired from "various" sources, I haven't got a single such file in my couple of hundred anime series, plus more films, so it should be easy to avoid, I think.

Paul

Posted
53 minutes ago, pwhodges said:

Old = "not current generation" in this case. […]

Note: Sorry for the slight derailment while we wait for the next beta release but I had this mostly handy and thought I'd post it as it might help someone at some point.

Yes, I was actually referring to the architecture of the GPU, and more specifically to the generation of NVENC used. Technically there have been 7 generations of NVENC, but only three that we streamers care about.

Kepler - 1st gen
Maxwell - 2nd & 3rd gen
Pascal - 4th gen
Volta - 5th gen
Turing - 6th gen
Ampere - 7th gen

Kepler was a glimpse of things to come but not usable. Maxwell spanned a couple of generations and, for "normal" 8-bit H.264, became usable for streaming, as it could encode at better than real-time speed, but it didn't come close to the optimizations you get via CPU, so it wasn't a good choice for converting files to keep.

The Pascal architecture was the game changer that brought HEVC Main10 10-bit encoding as well as many optimizations to H.264 encoding. It typically produced conversions that were 15-20% larger than a well-optimized CPU encode, but could encode in better than real time versus 10x+ real time on a CPU. This is the generation that a lot of people decided was "good enough" for mass-converting files, especially to HEVC.

Volta arguably shouldn't have been its own generation; it was more of a stopgap for gamers who could get the new-generation cards, but with the older NVENC. It did add HEVC B-frame support to NVENC, which helps a lot.

Turing (or Ampere) really is the generation to get if possible. It has lots of little improvements/refinements that help a lot with compression and quality, such as HEVC B-frames, support for B-frames as reference frames, alpha HEVC, etc. These changes provide up to 25% bitrate savings for HEVC and up to 15% for H.264. The quality of conversions using NVENC vs CPU is great, and short of frame-by-frame analysis they appear near equal. This is the architecture that makes converting media on the GPU a no-brainer.

Ampere is new and can be considered a refinement of Turing, with the addition of AV1 decoding with film grain support.
https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

NVENC-wise, the main architectures we care about are Maxwell, Pascal, and Turing.

The generation/architecture isn't all we care about, though, as that mostly gives us an idea of features and quality. Determining the number of streams you can handle is more complex and involves the GPU and memory clock speeds as well as the number of shaders, TMUs and ROPs. Not understanding this could easily lead a person to upgrade GPUs and not gain any improvement in the number of streams they can handle.

Here's an example of simultaneous stream counts when transcoding from 1080p 10 Mbps to 720p 4 Mbps.

[table: example simultaneous transcode counts by GPU model]

So if you have a 1050 Ti that does 14 transcodes, the best bang-for-the-buck upgrade would be an RTX 2070 with 8 GB, giving you 27 streams. An upgrade to a GTX 1650 might give more functionality but not more streams. An upgrade clear into the RTX 3000 series may not give you an improvement over the 2070/8GB, as it will depend on the configuration of the card.

Understanding how this relationship works for streaming can let you purchase the best GPU for the money based on streaming rather than gameplay.
You can generate your own graphs and look up this type of info here: Transcode Expectations

Something to think about: talk to any gamer friends and see what cards they have. If it gets you a decent upgrade, maybe work out a deal to give them a bit of cash for the hand-me-down, which might offset their cost to upgrade. You have your old card to sell or barter with as well.
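If you want to see how hard NVENC/NVDEC are actually working on your current card while streams play, one rough way (assuming an NVIDIA driver with nvidia-smi installed) is to query the utilization counters during playback:

```shell
# Sketch: show GPU/memory/encoder/decoder utilization while transcodes are running.
# Requires the NVIDIA driver's nvidia-smi tool; guarded so it no-ops elsewhere.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi -q -d UTILIZATION
else
  echo "skipped: nvidia-smi not available"
fi
```

If the encoder utilization sits near 100% while streams stutter, the NVENC block itself is the bottleneck rather than server settings.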

NeverExists
Posted (edited)
9 minutes ago, cayars said:

Yes, I was actually referring to the architecture of the GPU […]

Can't really upgrade a laptop's GPU. And also, like I said, it was all working fine before the updates I installed.

Edited by NeverExists
Posted

Likely it was a regression issue that appeared due to another improvement made in the code. This kind of thing can be very tricky.

Softworkz found and fixed the issue so as soon as the next beta is released you should hopefully be back to smooth playback for this specific type of file.

Posted

The Nvidia encoding quality is not generally bad; what is being compared here are just the default settings.

With the diagnostics plugin, you can change Nvidia H.264 encoding presets to adjust the output quality.

Posted
5 hours ago, cayars said:

Likely it was a regression issue that appeared due to another improvement made in the code. This kind of thing can be very tricky.

Correct. It came with the improvements in subtitle burn-in, which requires a certain color format for overlay.

