The approximately 7,500 anatomical terms are a lot for students to remember, but the task is made easier by risqué mnemonics.

https://phys.org/news/2026-04-anatomy-naughtiest-mnemonics.html

#Memory #Memorization #Anatomy

Why anatomy's naughtiest mnemonics work so well

Some Lovers Try Positions That They Can't Handle—I'm referring to the bones of the wrist, of course. The phrase is a classic mnemonic for the eight carpal (wrist) bones—scaphoid, lunate, triquetrum, pisiform, trapezium, trapezoid, capitate and hamate—whose initials form the memorable sentence.

Phys.org

fly51fly (@fly51fly)

A study showing that pruning the training data improves a model's memorization of facts. It proposes a way to learn more information from less data—an important result for developers interested in data selection and training efficiency.

https://x.com/fly51fly/status/2042718227814584559

#datapruning #memorization #trainingdata #apple #nlp

fly51fly (@fly51fly) on X

[CL] Cram Less to Fit More: Training Data Pruning Improves Memorization of Facts J Ye, V Feldman, K Talwar [Apple & National University of Singapore] (2026) https://t.co/YV2AuoSM7T

X (formerly Twitter)

fly51fly (@fly51fly)

A study analyzing how fine-tuning leads large language models to memorize and reproduce sentences from copyrighted books verbatim. Even strengthening model alignment can end up activating verbatim recall, with important implications for copyright and safety.

https://x.com/fly51fly/status/2038026904725504230

#llm #copyright #finetuning #alignment #memorization

fly51fly (@fly51fly) on X

[CL] Alignment Whack-a-Mole : Finetuning Activates Verbatim Recall of Copyrighted Books in Large Language Models X Liu, N Mireshghallah, J C. Ginsburg, T Chakrabarty [Stony Brook University & CMU & Columbia Law School] (2026) https://t.co/XvIboq4dxC

X (formerly Twitter)

Lenka Zdeborova (@zdeborova)

Introduces the Rules-and-Facts model to probe the balance between memorization and generalization. The work appears to focus on evaluating realistic tasks that demand both rule learning and fact memorization, rather than rote memorization alone.

https://x.com/zdeborova/status/2037522113750302758

#llm #machinelearning #memorization #generalization #research

Lenka Zdeborova (@zdeborova) on X

Memorization is often treated as something that can be tolerated without harming generalization - or studied in isolation. But many real tasks require *both learning rules and memorizing facts*. We introduce the Rules-and-Facts model to probe this: https://t.co/hPSYC2eyUV

X (formerly Twitter)

7 unnecessary #Assumptions about #Life in the #Universe : Medium

#AI’s #Memorization #Crisis : Misc

Why Finding #Motivation Is Often Such a #Struggle : Misc

Latest #KnowledgeLinks

https://knowledgezone.co.in/resources/bookmarks

AI's Memorization Crisis - The Atlantic

https://www.theatlantic.com/technology/2026/01/ai-memorization-research/685552/

> Large language models don’t “learn”—they copy. And that could change everything for the tech industry.

#AI #LLM #memorization

AI's Memorization Crisis

Large language models don’t “learn”—they copy. And that could change everything for the tech industry.

The Atlantic

Deconstructing Geoffrey Hinton’s weakest argument

Hinton’s “savage” but misguided attack on Marcus, analyzed

Marcus on AI

Aharon Azulay (@AharonAzulay)

The author says this matches his own observations, noting that these systems can remember even the numerical details of obscure arXiv papers. From a research and verification standpoint, the comment speaks to models' memorization capacity and their behavior with respect to data provenance.

https://x.com/AharonAzulay/status/2009238311232262533

#research #memorization #arxiv #llm

Aharon Azulay (@AharonAzulay) on X

@rohanpaul_ai Yeah that is in line with what I found: these systems can remember numerical details from obscure arxiv papers. https://t.co/uUGoyZ5eRf

X (formerly Twitter)