
Server becoming unresponsive. RAM is very bloated.


Recommended Posts

andrewmcd7
Posted

Please bear with me as I'm not very tech savvy. I have an Emby server that becomes unresponsive about once a day. The home screen loads the libraries and that's it; it won't show the home screen content. Sometimes it takes 20-40 minutes to become responsive again. I noticed the RAM is VERY bloated at that time. I will upload the logs. Hoping ANYONE can assist.

Emby Log 2.pdf Emby Log 1.pdf

Luke
Posted

Hi, can you please attach the server log files in their original plain text format? Thanks.

andrewmcd7
Posted

Sorry, I was doing this from my iPad so I had a hard time opening/downloading the logs. I will need to wait for the issue to happen again. For now, can you use what's attached?

7 hours ago, Luke said:

Hi, can you please attach the server log files in their original plain text format? Thanks.

 

seanbuff
Posted
37 minutes ago, andrewmcd7 said:

Sorry, I was doing this from my iPad so I had a hard time opening/downloading the logs. I will need to wait for the issue to happen again. For now, can you use what's attached?

The logs should still be available on the server if you know the date/time the issue occurred.

andrewmcd7
Posted

In addition, it looks like memory is just bloating.


andrewmcd7
Posted

Emby is using 55 GB of its 64 GB of RAM.


Q-Droid
Posted

If this is correct then it may also be your problem. You appear to be using RAM for your transcoding temp path.

Quote

2023-06-22 00:00:35.843 Info App: Transcoding temporary files path: /dev/shm/transcoding-temp
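If you want to confirm whether the RAM-backed transcoding path is involved, a quick check while something is transcoding (a minimal sketch, using the path from the log line above) is:

# /dev/shm is a tmpfs, so anything written there lives in RAM
df -h /dev/shm

# see how much of that is transcoding temp data
du -sh /dev/shm/transcoding-temp

If Emby runs in Docker, run these inside the container (docker exec <container> df -h /dev/shm), since the container's /dev/shm is separate from the host's.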


 

andrewmcd7
Posted
19 minutes ago, Q-Droid said:

If this is correct then it may also be your problem. You appear to be using RAM for your transcoding temp path.


 

I will change this and see if that resolves the issue... thanks for your thoughts

andrewmcd7
Posted

Sadly that did not resolve the issue. It's currently unresponsive now. New logs uploaded.

embyserver.txt

Q-Droid
Posted

You have many errors in your server log; most look like file-access errors. It would be best to resolve those to clear the clutter from the log. There are also a good number of clients connecting and failing to play/stream/whatever. It's a bit of a mess that makes it harder to spot a problem and could very well be contributing to it.
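If you want to see those errors yourself instead of reading the whole file, a rough way (using the attached embyserver.txt as an example) is to filter the log for error lines:

# count the error lines
grep -c 'Error' embyserver.txt

# show the first 50 with their timestamps to see what is failing
grep 'Error' embyserver.txt | head -n 50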

 

 

andrewmcd7
Posted

I just realized the logs cycle after a restart. Here is a different set of logs from before the restart. Hoping someone on the Emby team can help me figure out why my server is bloating randomly.

embyserver-63823053301.txt

isamudaison
Posted

What distro/version are you running this on? How are you executing the emby process (systemd? custom script?)? I'd recommend you enable debug logging and restart the server, this should give some 'more' insight into what all is happening.

(Probably not what you wanted to hear, but I notice you have docker running - have you considered trying out the dockerized version?)

andrewmcd7
Posted
20 hours ago, isamudaison said:

What distro/version are you running this on? How are you executing the emby process (systemd? custom script?)? I'd recommend you enable debug logging and restart the server, this should give some 'more' insight into what all is happening.

(Probably not what you wanted to hear, but I notice you have docker running - have you considered trying out the dockerized version?)

 

Debug logging is on now. The host is Ubuntu 18.04; Emby is the latest official Docker container.

 
andrewmcd7
Posted

Nothing from the Emby devs?

andrewmcd7
Posted

Hey Emby devs, could you possibly weigh in?

Carlo
Posted

How's your memory right now?

Be careful with debug logging on, as it has always made the problem worse for me.

I've narrowed the problem down for me and can reproduce it at will. What I've found is that when library scans are taking place, any error events trap memory that isn't freed until the scan is complete. If you have the ability, get a graph going of memory used for the day, or starting at the beginning of the test.

Restart your Emby server so it's using the least amount of RAM to start with. After it restarts, give it a minute or two to calm down, then go into Scheduled Tasks and run the Scan Media Library job by clicking on the arrow to the right. You can run this with the debug log on.

If you see memory starting to climb, go into the logs menu and click on the second icon from the right on the top server log to open the log in a new window.
Are you seeing errors in the log that match the time the memory starts to climb?
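If you need a simple way to graph it, a rough sketch (assuming the container is named emby; adjust the name and interval to suit) is to log the container's memory usage once a minute and plot the file afterwards:

# append a timestamped memory reading every 60 seconds
while true; do
  echo "$(date '+%F %T') $(docker stats --no-stream --format '{{.MemUsage}}' emby)"
  sleep 60
done >> emby-mem.log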

Carlo

andrewmcd7
Posted
2 hours ago, Luke said:

I'm not sure what the ? is for. I've uploaded logs. It happens occasionally, maybe once a day, and requires a server reboot. I have a friend who has his own Emby server and he's having the same issue.

Luke
Posted
On 6/25/2023 at 3:49 PM, andrewmcd7 said:

I'm not sure what the ? is for. I've uploaded logs. It happens occasionally, maybe once a day, and requires a server reboot. I have a friend who has his own Emby server and he's having the same issue.

It was to answer the questions that Carlo had asked. Thanks!

andrewmcd7
Posted

This is happening when a scan is NOT happening. I specifically set the library scan to start at 2am to avoid this issue. But it still happens upon occasion. So it has nothing to do with server scans. 

Don’t the logs show anything? It’s all gibberish to me. 

Neminem
Posted

I'm seeing this also, and this started after updating Unraid to 6.12.0.

From the changelog for Unraid 6.12.0 I see they updated from Docker version 20.10.18 to Docker version 23.0.6.

There are several open issues at Unraid about these Docker problems.

So that tells me that something is now different, not sure what.

In the pic below you can see the Emby Docker memory usage and when I updated Unraid.
The update was done on 06-17-2023.

Yesterday I decided to limit the Emby Docker memory by using this extra Docker parameter:

--memory="[memory_limit]" [docker_image]

Now Emby has a memory max of 4 GB.

I have not seen the memory bloat since, and Emby is running smoothly again.
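For reference, a rough equivalent on a plain Docker host (the container and image names here are only examples, not necessarily what Unraid uses) would look something like:

docker run -d --name emby --memory="4g" emby/embyserver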

[Screenshot: Emby container memory usage, showing the climb after the Unraid update]

Carlo
Posted
On 6/26/2023 at 4:17 PM, andrewmcd7 said:

This is happening when a scan is NOT happening. I specifically set the library scan to start at 2am to avoid this issue. But it still happens upon occasion. So it has nothing to do with server scans. 

Don’t the logs show anything? It’s all gibberish to me. 

 

On 6/27/2023 at 12:56 AM, jaycedk said:

I'm seeing this also, and this started after updating Unraid to 6.12.0.

From the changelog for Unraid 6.12.0 I see they updated from Docker version 20.10.18 to Docker version 23.0.6.

There are several open issues at Unraid about these Docker problems.


What I was hoping for is to be able to compare a log file covering a short duration of minutes. That would really be useful. On my system it's a slow climb over about 7 hours, which is how long my library scan takes.

Neminem
Posted (edited)

@Carlo I do not think this is an Emby issue, in my case.

I have other Docker containers showing the same behavior.

So in my case it's about Unraid and the Docker engine version they updated to.

It all started after the Unraid update and the Docker engine they added.

I'm still testing the memory limit I set on Emby; if that continues to work, great.

Then I will test with other Docker containers.


I only posted so that the OP could see if he/she is on the same Docker version, and whether he/she had tried a memory limit.

Edited by jaycedk
