appas 3 Posted May 10, 2024 Greetings, I am experiencing an issue similar to this thread, where Emby starts up fine, but after some time (10+ minutes) it gets killed. It seems the cause is a memory leak: the most recent time this happened, the status systemctl gave for `emby-server.service` was `Active: failed (Result: oom-kill)`. I believe the "oom" in "oom-kill" stands for "out of memory". I would be willing to profile the application to find the leak, but first wanted to know whether there are any easy fixes. I was using Emby 4.8.5.0 (which I am now updating to 4.8.6.0, but this problem has persisted across a couple of updates already, so I am making this thread) on Debian 11 x64, kernel 5.10.0-26.
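(A note for cross-checking: when the kernel OOM killer fires, it writes the victim's name and resident set size to the kernel log, which confirms both that it really was an OOM kill and how big the process had grown. A minimal sketch; the exact message wording varies between kernel versions:)

# Kernel log entries from around the kill, via journald:
journalctl -k | grep -i -E "out of memory|oom|killed process"

# Or straight from the kernel ring buffer:
dmesg | grep -i -E "out of memory|killed process"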
sa2000 674 Posted May 10, 2024 There are other forum threads about memory usage and crashes, one recent one being here. I suggest enabling debug logging and providing server logs covering the time of the oom crash, to see what activity was running, and also covering the subsequent launch period. If you do upload logs in the forum, make sure you use the download options to anonymize the log contents.
serverebps01 12 Posted May 10, 2024 4 hours ago, appas said: Greetings, I am experiencing an issue similar to this thread, where Emby starts up fine, but after some time (10+ minutes) it gets killed. It seems the cause is a memory leak... @appas It seems that @Luke is already looking into a problem with high memory consumption on the Emby server when playing m3u files. I have already submitted my Emby server logs for it.
appas 3 Posted August 26, 2024 Author Hey, just to update: this "out-of-memory kill" is still a problem with Emby 4.8.8.0. Here is a log of a session which ends in the process getting killed due to OOM. Nothing is actually done on the server during the session, so there's not much there, but it shows that the memory leak has to be in some "core process", because, as I said, no playback is initiated or anything like that.
Luke 42077 Posted August 26, 2024 17 minutes ago, appas said: Hey, just to update: this "out-of-memory kill" is still a problem with Emby 4.8.8.0. Here is a log of a session which ends in the process getting killed due to OOM. Nothing is actually done on the server during the session, so there's not much there, but it shows that the memory leak has to be in some "core process", because, as I said, no playback is initiated or anything like that. Hi there, can you please attach the log file here? Thanks.
appas 3 Posted August 26, 2024 Author On 26/08/2024 at 19:33, Luke said: Hi there, can you please attach the log file here? Thanks. Here you are. Edited by sa2000 to remove raw log file
appas 3 Posted August 29, 2024 Author Yes, whenever I start Emby, it gets oom-killed, even if nothing is done during the session.
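(For anyone needing a stopgap while this is investigated: on systemd machines the service's memory can be capped, so a runaway Emby is killed at a predictable threshold instead of the kernel's global OOM killer picking victims under full memory pressure. A rough sketch; the 512M figure is only illustrative, and MemoryMax requires the unified cgroup v2 hierarchy (on cgroup v1 hosts the older MemoryLimit= directive applies):)

# Create a drop-in override for the unit (opens an editor):
sudo systemctl edit emby-server.service

# Add these two lines in the override file:
#   [Service]
#   MemoryMax=512M

# Apply and restart:
sudo systemctl daemon-reload
sudo systemctl restart emby-server.service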
normic 0 Posted August 31, 2024 Sorry for jumping in, but I stumbled upon the same issue yesterday and investigated it a bit further. I'm running on a dedicated Debian 11 machine, and after the update to 4.8.8 it got stuck at 53% while scanning the media libraries. Stopping the scan keeps Emby running; not stopping it in time will cause Emby to consume all of the system's memory.
Luke 42077 Posted September 10, 2024 On 8/31/2024 at 4:29 PM, normic said: Sorry for jumping in, but I stumbled upon the same issue yesterday and investigated it a bit further. I'm running on a dedicated Debian 11 machine, and after the update to 4.8.8 it got stuck at 53% while scanning the media libraries. Stopping the scan keeps Emby running; not stopping it in time will cause Emby to consume all of the system's memory. Hi there, please attach the Emby server log from when this happened: How to Report a Problem. Thanks!
appas 3 Posted February 24, 2025 Author Just reporting that this is still a problem in 4.8.10. Edit: I thought I might take a whack at fixing this myself, but the code at https://github.com/MediaBrowser/Emby is almost 10 years old, and the project website does not link to the source. Is Emby no longer open source?
Happy2Play 9780 Posted February 24, 2025 15 minutes ago, appas said: Just reporting that this is still a problem in 4.8.10. Edit: I thought I might take a whack at fixing this myself, but the code at https://github.com/MediaBrowser/Emby is almost 10 years old, and the project website does not link to the source. Is Emby no longer open source? Correct: https://emby.media/community/index.php?/topic/133968-the-history-and-evolution-of-emby
Luke 42077 Posted February 25, 2025 22 hours ago, appas said: Just reporting that this is still a problem in 4.8.10. Edit: I thought I might take a whack at fixing this myself, but the code at https://github.com/MediaBrowser/Emby is almost 10 years old, and the project website does not link to the source. Is Emby no longer open source? Hi, can you please attach a new Emby server log? Thanks.
appas 3 Posted February 25, 2025 Author Here. Edited by sa2000 to remove raw log file
Luke 42077 Posted February 26, 2025 OK, do you have the real-time monitor enabled on any of your libraries? That would be one thing to try turning off.
appas 3 Posted February 26, 2025 Author 7 hours ago, Luke said: OK, do you have the real-time monitor enabled on any of your libraries? That would be one thing to try turning off. How do I check that? But no: at least, I haven't manually set anything like that, so unless it's the default, I'd say the answer is no.
sa2000 674 Posted February 26, 2025 51 minutes ago, appas said: At least, I haven't manually set anything like that, so unless it's the default, I'd say the answer is no. Real Time Monitoring for libraries is enabled by default when creating a library. You can see the option when you open the server settings, go to Library settings, and view/edit each library's settings. Are you sure the Emby Server process is being killed? And have you found out what the memory usage is whilst running? The log shows loads of very slow database transactions and requests that are taking a long time to complete, which suggests some system/disc issue, perhaps. This shows the requests that took more than 10 seconds in the last log file, listing timestamp, elapsed seconds to complete, and the request URL:

Time                     Elapsed  URL
2025-02-25 22:21:49.869  23.72    GET /emby/Items/70396/Images/Primary?maxHeight=80&maxWidth=80&tag=5c0386d0108c7dd6eae9bc6dc7eed0fa&quality=90
2025-02-25 22:21:49.874  23.714   GET /emby/Items/64927/Images/Primary?maxHeight=80&maxWidth=80&tag=9974f9b89aa9a15bc3df834c384606aa&quality=90
2025-02-25 22:21:50.367  23.37    GET /emby/Items/67577/Images/Primary?maxHeight=80&maxWidth=80&tag=6e03e6d482a72e3543a79b24b1ac9105&quality=90
2025-02-25 22:21:50.370  23.219   GET /emby/Items/67589/Images/Primary?maxHeight=80&maxWidth=80&tag=54e6a7b598079ba6d266352b3deaf7ce&quality=90
2025-02-25 22:21:50.517  23.073   GET /emby/Items/67602/Images/Primary?maxHeight=80&maxWidth=80&tag=c6ffa11a66cf6f75ec12ef0cfb0b88f8&quality=90
2025-02-25 22:21:50.527  23.204   GET /emby/Items/67618/Images/Primary?maxHeight=80&maxWidth=80&tag=02251347d44af3e80e888029765ec071&quality=90
2025-02-25 22:22:13.729  108.39   GET /emby/Users/f1bf8161106d4fe68fb3469fdf9757d5/Items?Fields=PrimaryImageAspectRatio&Recursive=true&IncludeItemTypes=Playlist&X-Emby-Client=Emby Web&X-Emby-Device-Name=Chrome&X-Emby-Device-Id=TW96aWxsYS81LjAgKFgxMTsgTGludXggeDg2XzY0KSBBcHBsZVdlYktpdC81MzcuMzYgKEtIVE1MLCBsaWtlIEdlY2tvKSBDaHJvbWUvMTMyLjAuMC4wIFNhZmFyaS81MzcuMzZ8MTc0MDMzNDk3MjkzMw11&X-Emby-Client-Version=4.8.10.0&X-Emby-Language=en-us
2025-02-25 22:22:13.890  65.876   GET /emby/Users/f1bf8161106d4fe68fb3469fdf9757d5/Items/Latest?Limit=16&Fields=BasicSyncInfo,CanDelete,PrimaryImageAspectRatio,ProductionYear,Status,EndDate&ImageTypeLimit=1&EnableImageTypes=Primary,Backdrop,Thumb&ParentId=6&X-Emby-Client=Emby Web&X-Emby-Device-Name=Chrome&X-Emby-Device-Id=TW96aWxsYS81LjAgKFgxMTsgTGludXggeDg2XzY0KSBBcHBsZVdlYktpdC81MzcuMzYgKEtIVE1MLCBsaWtlIEdlY2tvKSBDaHJvbWUvMTMyLjAuMC4wIFNhZmFyaS81MzcuMzZ8MTc0MDMzNDk3MjkzMw11&X-Emby-Client-Version=4.8.10.0&X-Emby-Language=en-us
2025-02-25 22:22:22.241  57.529   GET /emby/Items/67630/Images/Primary?maxHeight=80&maxWidth=80&tag=d0ba6f5a37581b488c372649d14b5810&quality=90
2025-02-25 22:22:22.490  58.496   GET /emby/Items/67641/Images/Primary?maxHeight=80&maxWidth=80&tag=c671f8d64d6e6d2ce925a89ef2982de3&quality=90
2025-02-25 22:22:25.403  17.748   GET /emby/ScheduledTasks?isHidden=false&X-Emby-Client=Emby Web&X-Emby-Device-Name=Chrome&X-Emby-Device-Id=TW96aWxsYS81LjAgKFgxMTsgTGludXggeDg2XzY0KSBBcHBsZVdlYktpdC81MzcuMzYgKEtIVE1MLCBsaWtlIEdlY2tvKSBDaHJvbWUvMTMyLjAuMC4wIFNhZmFyaS81MzcuMzZ8MTc0MDMzNDk3MjkzMw11&X-Emby-Client-Version=4.8.10.0&X-Emby-Language=en-us
2025-02-25 22:22:43.266  36.509   GET /emby/System/Logs/Query?IncludeItemTypes=Log&Fields=BasicSyncInfo,CanDelete,PrimaryImageAspectRatio,ProductionYear,Status,EndDate,CommunityRating,OfficialRating,CriticRating&StartIndex=0&EnableImageTypes=Primary,Backdrop,Thumb&ImageTypeLimit=1&Recursive=true&Limit=30&X-Emby-Client=Emby Web&X-Emby-Device-Name=Chrome&X-Emby-Device-Id=TW96aWxsYS81LjAgKFgxMTsgTGludXggeDg2XzY0KSBBcHBsZVdlYktpdC81MzcuMzYgKEtIVE1MLCBsaWtlIEdlY2tvKSBDaHJvbWUvMTMyLjAuMC4wIFNhZmFyaS81MzcuMzZ8MTc0MDMzNDk3MjkzMw11&X-Emby-Client-Version=4.8.10.0&X-Emby-Language=en-us

Looking at the slow db queries, this is one that took over 6 seconds:

select A.type,A.Id,A.StartDate,A.EndDate,A.IsMovie,A.IsSeries,A.IsKids,A.IsSports,A.IsNews,A.IsRepeat,A.IsNew,A.IsPremiere,A.IsLive,A.CommunityRating,A.CustomRating,A.IndexNumber,A.IsLocked,A.PreferredMetadataLanguage,A.PreferredMetadataCountryCode,A.Width,A.Height,A.DateLastRefreshed,A.Name,A.Path,A.PremiereDate,A.Overview,A.ParentIndexNumber,A.ProductionYear,A.OfficialRating,A.SortName,A.RunTimeTicks,A.Size,A.Container,A.DateCreated,A.DateModified,A.guid,A.ParentId,A.IsInMixedFolder,A.DateLastSaved,A.LockedFields,A.OriginalTitle,A.CriticRating,A.SeriesName,A.Album,A.AlbumId,A.SeriesId,A.PresentationUniqueKey,A.Tagline,A.ProviderIds,A.Images,A.ProductionLocations,A.TotalBitrate,A.ExtraType,A.ExternalId,A.SeriesPresentationUniqueKey,A.Status,A.DisplayOrder,A.ThreeDFormat,A.RemoteTrailers,A.SortIndexNumber,A.SortParentIndexNumber,A.IndexNumberEnd,A.IsPublic
from MediaItems A
where A.ParentId=30809

You could find out what this relates to by running queries in SQLite Browser or SQLite Studio with the Emby server shut down, opening the db file /var/lib/emby/data/library.db. You can do the following to see what it relates to:

select id, name from MediaItems where ParentId=30809;
select id, name from MediaItems where Id=30809;
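(If you don't have a GUI database browser to hand, the stock sqlite3 command-line tool works just as well. A minimal sketch, assuming the sqlite3 package is installed; it queries a copy of the database so the live file is never touched:)

# Take a copy of the database while the server is stopped:
sudo systemctl stop emby-server.service
cp /var/lib/emby/data/library.db /tmp/library-copy.db
sudo systemctl start emby-server.service

# What are the children of item 30809, and what is 30809 itself?
sqlite3 /tmp/library-copy.db "select Id, Name from MediaItems where ParentId=30809;"
sqlite3 /tmp/library-copy.db "select Id, Name from MediaItems where Id=30809;"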
appas 3 Posted February 26, 2025 Author 15 minutes ago, sa2000 said: Are you sure the Emby Server process is being killed? And have you found out what the memory usage is whilst running? Yes, in systemctl status emby-server it says "oom-killed". 15 minutes ago, sa2000 said: You could find out what this relates to by running queries in SQLite Browser or SQLite Studio with the Emby server shut down, opening the db file /var/lib/emby/data/library.db. You can do the following to see what it relates to: select id, name from MediaItems where ParentId=30809; select id, name from MediaItems where Id=30809; I get: Error: no such table: MediaItems The file /var/lib/emby/data/library.d has a size of 0 bytes.
sa2000 674 Posted February 26, 2025 15 minutes ago, appas said: The file /var/lib/emby/data/library.d has a size of 0 bytes. According to the log file, the database has this path: /var/lib/emby/data/library.db. Whilst the server is running there will be three files:

/var/lib/emby/data/library.db
/var/lib/emby/data/library.db-shm
/var/lib/emby/data/library.db-wal

When the server is shut down, this should be reduced to just the .db file. Maybe you have a permissions issue and that is why it is showing as 0 bytes? Copy the file out whilst the server is shut down. Correcting the case sensitivity of the queries I suggested:

select Id, Name, Path from MediaItems where ParentId=30809;
select Id, Name, Path from MediaItems where Id=30809;

With regards to the high memory usage, I would like to capture memory use from launch up to the oom-kill time. Please first ensure debug logging is enabled on the server, then run this script in a linux window on the machine:

(
  # CSV header: one column per Vm*/Rss* field from /proc/<pid>/status
  echo "PID,time,VmPeak,VmSize,VmLck,VmPin,VmHWM,VmRSS,RssAnon,RssFile,RssShmem,VmData,VmStk,VmExe,VmLib,VmPTE,VmSwap"
  while true; do
    PID=$(pgrep EmbyServer)    # empty while the server is down
    echo -n "$PID,"
    date "+%F %T" | tr '\n' ','
    # Strip units and whitespace, then flatten the values onto one CSV row
    grep -E "^Vm|^Rss" /proc/${PID}/status | cut -d':' -f2 | sed -e 's/ kB$//g' \
      | tr -d '\t' | tr -d ' ' | tr '\n' ',' | sed -e 's/,$//'
    echo ""
    sleep 1
  done
) | tee embyserver-memory-20250226a.csv

This will log memory use every second. Then, leaving this running, and with debug logging already enabled, restart the Emby server, wait until it gets oom-killed, then break in on this script and let me have the log file and the output file embyserver-memory-20250226a.csv. (It might also be an idea to copy out the library.db file, plus the library.db-shm and library.db-wal files if present; I will send you an upload link.)
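(Once the capture is stopped, a quick way to pull the peak resident size out of the csv; this assumes the column layout of the header above, where VmRSS is the 8th field and values are in kB:)

awk -F, 'NR>1 && $8+0 > max { max = $8; at = $2 } END { print "peak VmRSS: " max " kB at " at }' embyserver-memory-20250226a.csv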
appas 3 Posted February 26, 2025 Author Ah yeah, it was because I mistakenly left out the b in db.

SELECT id, name FROM MediaItems WHERE ParentId = 30809;
31029|01 [Orb] - Once More
31030|02 [Orb] - Promis
31031|03 [Orb] - Ghost Dancing
31032|04 [Orb] - Turn it Down
31033|05 [Orb] - Egnable
31034|06 [Orb] - Firestar
31035|07 [Orb] - A Mile Lump of Lard
31036|08 [Orb] - Centuries
31037|09 [Orb] - Plum Island
31038|10 [Orb] - Hamlet of Kings
31039|11 [Orb] - 1,1,1
31040|12 [Orb] - EDM
31041|13 [Orb] - Thursday's Keeper
31042|14 [Orb] - Terminus

Attached is the memory log until getting oom-killed. Edit: actually, I noticed that this time the status isn't oom-killed, but simply killed:

systemctl status emby-server
● emby-server.service - Emby Server is a personal media server with apps on just about every device
     Loaded: loaded (/usr/lib/systemd/system/emby-server.service; enabled; vendor preset: enabled)
     Active: failed (Result: signal) since Wed 2025-02-26 17:12:32 EET; 4h 51min ago
    Process: 862123 ExecStart=/opt/emby-server/bin/emby-server (code=killed, signal=KILL)
   Main PID: 862123 (code=killed, signal=KILL)
        CPU: 1min 28.573s

embyserver-memory-20250226a.csv
sa2000 674 Posted February 26, 2025 36 minutes ago, appas said: Attached is the memory log until getting oom-killed. This does not look like what I asked for. I wanted you to start the script for memory capture, then stop and restart the Emby server and wait for the kill. The csv file does not show that: it shows only one PID. I expect to see a PID for Emby Server, then no PID until it restarts, then another PID, and then no PID number from the time the KILL happens. The whole period captured in the memory usage shows memory constant at a high value of over 3 GB. And I will need a zip of a copy of the logs folder, which you can upload together with the database zip.
appas 3 Posted February 26, 2025 Author Ok, this time I started the script with the Emby server running, stopped and restarted Emby, and waited for the kill, which this time was an "oom-kill". embyserver-memory-20250226a.csv
sa2000 674 Posted March 4, 2025 Thanks for the diagnostics. I am still waiting for a log with debug enabled that corresponds to the memory usage csvs you provided, but it does not appear that there is an Emby issue here. I can give you feedback on what can be seen from the memory usage capture. For the launch of Emby Server on the 28th of February at 00:54, I can see that the process stopped after 13:33:35, which presumably was the OOM kill. The VmRSS value for the process, which gives the actual RAM being used, only reached 501,440 kB (489 MB) at 13:33:35, which is not too high. How much memory do you have on the machine? And are the OOM parameters configurable? Maybe there was a shortage of memory for other processes and the system was just killing any process with a highish footprint at the time. So as it stands, I do not see an Emby Server issue here, as 489 MB is OK.
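(On the question of whether the OOM parameters are configurable: the kernel's victim selection can be inspected per process and biased per service. A minimal sketch; the -500 value is purely illustrative, and a negative adjustment only shifts the kill towards other processes, it does not create more memory:)

# Current badness score for the process (higher = more likely to be killed):
cat /proc/$(pgrep EmbyServer)/oom_score

# Bias victim selection for the unit via a systemd drop-in (range -1000..1000):
sudo systemctl edit emby-server.service
#   [Service]
#   OOMScoreAdjust=-500

sudo systemctl daemon-reload
sudo systemctl restart emby-server.service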
appas 3 Posted March 4, 2025 Author The machine has 1 GB RAM. Are you saying that's not enough for Emby?
sa2000 674 Posted March 4, 2025 3 hours ago, appas said: The machine has 1 GB RAM. Are you saying that's not enough for Emby? I will follow this up after I look at the new debug log and memory use capture, and establish whether the System Requirements as specified here are still correct:
At least 1GB RAM for Windows/Mac OS X
At least 512MB RAM for Linux
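(For anyone else on a 1 GB box in the meantime: adding swap is a common stopgap that gives the kernel headroom before the OOM killer fires, at the cost of slower paging. A rough sketch; the size and path are only illustrative:)

sudo fallocate -l 1G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# Make it permanent by adding this line to /etc/fstab:
#   /swapfile none swap sw 0 0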