
Posted

Why not add GPU support to the BIF generator? It could save a lot of CPU resources, and it would be faster too.

Neminem
Posted (edited)

This has been discussed here and in other places, but with the same conclusion.

 

Edited by Neminem
rbjtech
Posted
1 hour ago, Luke said:

@seanbuffor @rbjtech didn't you ask this before? Do you recall where?

@Neminem Thanks.

In summary, because the images being processed are very small, even with hardware tone mapping applied it takes longer to move the data to the GPU, process it, and move it back than it does to simply process it directly on the CPU. If you want things done more quickly, the way to go is to run multiple instances of the BIF generator with Emby, each on a different part of the library.
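The "multiple instances over different parts of the library" idea above can be sketched roughly as follows. This is only an illustration, not Emby's actual implementation: `generate_bif` is a hypothetical placeholder for whatever command or routine actually builds a .bif file, and in practice each worker would launch a separate generator process over its own subset of files.

```python
# Sketch: run several BIF-generation workers in parallel over disjoint
# parts of a media library. generate_bif() is a hypothetical stand-in
# for the real generator (which would decode the video, extract small
# preview frames, and pack them into a .bif file).
from concurrent.futures import ThreadPoolExecutor


def generate_bif(video_path: str) -> str:
    # Placeholder for the real work; here we just derive the output name.
    # In a real setup this would invoke one generator instance per file.
    return video_path.rsplit(".", 1)[0] + ".bif"


def process_library(videos, workers=4):
    # Each worker handles different files, so several generator
    # instances run concurrently instead of one sequential pass.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(generate_bif, videos))


if __name__ == "__main__":
    library = ["a.mp4", "b.mkv", "c.mp4"]
    print(process_library(library))
```

Threads are used here only to supervise; since each worker would really be spawning an external generator process, the Python GIL is not the bottleneck.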

