RAM overflow even after closing all tasks #ram #thinkpad #memoryusage #2510 #ubuntubudgie

https://askubuntu.com/q/1566427/612

RAM overflow even after closing all tasks

The issue happens after Ubuntu (Budgie) has been running for a while, possibly after a suspend/wake cycle. Today it happened about 5 minutes into using Firefox after waking up from suspend. Using the...

Ask Ubuntu

How to significantly reduce memory usage of GTK4 applications in computers with low RAM running Ubuntu? #ram #memoryusage #gtk4

https://askubuntu.com/q/1566199/612

How to significantly reduce memory usage of GTK4 applications in computers with low RAM running Ubuntu?

I use Ubuntu on a Chromebook with only 4GB of memory, and GTK4 apps consume a lot of memory. Examples: even a "hello world" window may consume more than 100 MB of memory; one app was reported to hav...

Ask Ubuntu
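One knob worth trying for questions like this (my assumption, not something the post confirms helps in this case) is GTK4's `GSK_RENDERER` environment variable, which selects the scene-graph renderer; the CPU-based Cairo renderer skips the GL renderer's GPU buffer allocations, which can matter on low-RAM machines. A sketch, with `gnome-text-editor` standing in as a placeholder for any GTK4 app:

```shell
# Launch a GTK4 app with the Cairo (software) renderer instead of the
# default GL renderer; gnome-text-editor is just a placeholder app name.
GSK_RENDERER=cairo gnome-text-editor
```

Whether this saves memory depends on the app and driver; it is a per-launch experiment, not a system-wide fix.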

Why is my Ubuntu VM running out of memory and closing unrar? #memoryusage

https://askubuntu.com/q/1565692/612

Why is my Ubuntu VM running out of memory and closing unrar?

My Ubuntu VM, running on TrueNAS, has 32GB RAM assigned, but cannot even unrar a file with nothing else running. System monitor says 97% of RAM is in use before I even run the command. After a min...

Ask Ubuntu
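A common gotcha behind reports like "97% of RAM in use with nothing running" is that Linux counts reclaimable page cache as used memory. A quick first check (standard procps tools on a typical Ubuntu install; the interpretation here is a general heuristic, not a diagnosis of this specific VM):

```shell
# "used" includes reclaimable page cache; the "available" column is the
# realistic figure for how much memory new processes can actually get.
free -h

# List the top 10 resident-memory consumers to find the real culprit.
ps -eo pid,rss,comm --sort=-rss | head -n 11
```

If "available" is healthy, the 97% figure is mostly cache and the OOM kill has another cause, such as a cgroup or VM balloon limit.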

Ars Technica: Google’s TurboQuant AI-compression algorithm can reduce LLM memory usage by 6x. “Google Research recently revealed TurboQuant, a compression algorithm that reduces the memory footprint of large language models (LLMs) while also boosting speed and maintaining accuracy.”

https://rbfirehose.com/2026/03/26/ars-technica-googles-turboquant-ai-compression-algorithm-can-reduce-llm-memory-usage-by-6x/

Ars Technica: Google’s TurboQuant AI-compression algorithm can reduce LLM memory usage by 6x

ResearchBuzz: Firehose

3 Ways to Speed Up Model Training Without More GPUs - MachineLearningMastery.com

Most training jobs run slower than they need to, not because the GPU is weak, but because it isn’t being used efficiently. This article shows three proven ways to speed up model training without adding more GPUs.

MachineLearningMastery.com

Google Chrome was using 30GB RAM until I made this tweak

We all know that Chrome uses memory like no other browser, but this was excessive.

MakeUseOf

#Huawei’s Computing Systems Lab introduced #SINQ, an #opensource #quantisationmethod for large language models (#LLMs). SINQ reduces #memoryusage by 60-70% without sacrificing output quality, enabling models to run on less powerful #hardware. The technique, available on GitHub and Hugging Face, uses #dualaxisscaling and #SinkhornKnopp-style #normalisation for improved performance. https://venturebeat.com/ai/huaweis-new-open-source-technique-shrinks-llms-to-make-them-run-on-less?eicker.news #tech #media #news
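The two named ingredients, dual-axis scaling and Sinkhorn-Knopp-style normalisation, can be sketched in a few lines of NumPy. This is an illustration of the general idea, not SINQ's actual algorithm: the function names, the RMS-based normalisation, and the plain symmetric 4-bit scheme are assumptions of mine.

```python
import numpy as np

def sinkhorn_scale(W, iters=10):
    """Sinkhorn-Knopp-style dual-axis scaling (illustrative only).

    Alternately divides out per-row and per-column RMS so that outlier
    rows/columns stop dominating the quantization range."""
    W = W.astype(np.float64).copy()
    row_scale = np.ones(W.shape[0])
    col_scale = np.ones(W.shape[1])
    for _ in range(iters):
        r = np.sqrt((W ** 2).mean(axis=1, keepdims=True))  # per-row RMS
        W /= r
        row_scale *= r.ravel()
        c = np.sqrt((W ** 2).mean(axis=0, keepdims=True))  # per-column RMS
        W /= c
        col_scale *= c.ravel()
    return W, row_scale, col_scale

def quantize_int4(W):
    """Plain symmetric 4-bit quantization: 16 levels in [-8, 7]."""
    scale = np.abs(W).max() / 7
    q = np.clip(np.round(W / scale), -8, 7).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))
W[0] *= 50  # one outlier row wrecks the range of naive quantization

# Naive: quantize W directly; the outlier row sets the step size.
q_naive, s_naive = quantize_int4(W)
err_naive = np.abs(W - q_naive * s_naive).mean()

# Dual-axis: rebalance first, quantize, then fold the scales back in.
Wn, rs, cs = sinkhorn_scale(W)
q, s = quantize_int4(Wn)
W_hat = rs[:, None] * (q * s) * cs[None, :]
err = np.abs(W - W_hat).mean()
```

The point of the alternating row/column rescaling is that a single outlier row no longer inflates the quantization step for the whole matrix; the per-row and per-column scales are stored separately (in higher precision) and folded back in at dequantization time.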

Also, Apple apps are no better.

#Mac #Apple #MemoryUsage

Also, Gimp with all images closed.

#Mac #Gimp #MemoryUsage