"One of Us Will Die" Kickstarter Surpasses Funding Goal in Under Two Hours
"One of Us Will Die" Kickstarter Surpasses Funding Goal in Under Two Hours
U.S. Supreme Court Records and Briefs: The Arguments That Shaped America, Now Freely Available
Lemmy Release v0.19.18
Three greats who we’ve lost
DLARC Adds New Radio Resources to Celebrate World Amateur Radio Day
Hexbear has been temporarily defederated - UPDATED now federated
**Update:** Following the release of Lemmy 0.19.18 we’ve refederated, as this release should fix the bug that caused the issue. Original text below.

## TL;DR

We’ve temporarily defederated from Hexbear due to a Lemmy bug with very deeply nested comment threads. A thread there triggered repeated crashes on our server, causing errors like 502 pages and “Lemmy is starting” messages. Defederating stops the issue for now.

## Announcement

Due to technical issues, we’ve temporarily defederated from Hexbear until a Lemmy update is available that fixes issues with deeply nested comment chains.

There is a known bug in Lemmy (see: https://github.com/LemmyNet/lemmy/issues/6435) where very deeply nested comments can trigger excessive recursion during federation. When Lemmy processes these comments, it recursively fetches and verifies parent comments, which can eventually lead to stack overflows. Under normal circumstances this happens rarely (we’ve been seeing it maybe once per day), but it becomes much more problematic when multiple new comments are added to an already deeply nested thread. Each new activity can trigger processing of the same deep chain again.

In this case, a thread on Hexbear received a large number of additional replies in a very deep comment chain. This caused Lemmy to repeatedly process that chain, leading to stack overflows, federation worker exhaustion, and timeouts. Simply put, parts of the server were crashing, too many tasks piled up at once, and requests started timing out and failing to load. You may have seen this on the website as 502 errors or the Lemmy error screen; on apps it may have presented as API timeout errors or “Lemmy is starting” errors.

For a visual representation, this graph shows the memory drop each time the server restarts: https://lemmy.zip/pictrs/image/d48f98e2-0cc8-406e-bbe2-f5c910016493.avif

The flat bit to the left is good: everything is fine. The choppy bit to the right, not so good: everything is not fine.

Usually it’s a one-off comment causing this crash, but in this case the user spent a good portion of time bumping the thread, and we had to process each of those activities, each causing a crash, restarting the server, and then crashing on the next one in the queue, and so on. I did try removing the offending community from Lemmy.zip to prevent this from happening (bumping threads is quite common behavior in that community, I think), but we still process all the activities from that community. The only certain fix for now is to defederate until a version of Lemmy is released that fixes this.

The graph is back to improving now: https://lemmy.zip/pictrs/image/8a2ad4ce-41e9-4ba6-8e38-751d2babcf46.avif

Hope that all makes sense!

Demigodrick
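To make the failure mode above concrete, here is a minimal Rust sketch of the difference between a recursive and an iterative walk up a comment chain. The types and function names are hypothetical, not Lemmy’s actual code: the point is only that recursion consumes one stack frame per level of nesting, while an iterative walk uses constant stack at any depth.

```rust
// Hypothetical model of a comment: it either has a parent or is a
// top-level comment. (Illustrative only, not Lemmy's real data types.)
struct Comment {
    parent_id: Option<usize>,
}

// Recursive walk: stack depth grows with the length of the reply
// chain, so a deep enough thread can overflow the stack.
fn chain_depth_recursive(comments: &[Comment], id: usize) -> usize {
    match comments[id].parent_id {
        Some(p) => 1 + chain_depth_recursive(comments, p),
        None => 1,
    }
}

// Iterative walk: constant stack usage regardless of nesting depth.
fn chain_depth_iterative(comments: &[Comment], mut id: usize) -> usize {
    let mut depth = 1;
    while let Some(p) = comments[id].parent_id {
        depth += 1;
        id = p;
    }
    depth
}

fn main() {
    // A chain of 10,000 comments, each a reply to the previous one.
    let comments: Vec<Comment> = (0..10_000)
        .map(|i| Comment {
            parent_id: if i == 0 { None } else { Some(i - 1) },
        })
        .collect();

    // The iterative walk handles any depth.
    println!("{}", chain_depth_iterative(&comments, 9_999)); // 10000

    // The recursive walk works at modest depths, but every extra level
    // of nesting costs a stack frame; at federation scale this is what
    // blows up.
    println!("{}", chain_depth_recursive(&comments, 999)); // 1000
}
```

In the federation case, each incoming activity on the thread re-triggered a walk like this over the same deep chain, which is why a burst of replies turned a rare crash into a repeating one.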
A Wedding to Die For: a fully improvised murder mystery dinner game for up to 25 players
EUphoriaCon Takes Off 2nd and 3rd May 2026
https://fed.brid.gy/r/https://www.rascal.news/euphoriacon-takes-off-2nd-and-3rd-may-2026/
Discuss Online downtime and responsiveness
#Development #Announcements
Codex for (almost) everything · OpenAI turns its agent into a development partner https://ilo.im/16c89i
_____
#OpenAI #Codex #AI #Agents #Desktop #Workflows #DevOps #WebDev #Frontend #Backend