Not your Weights, Not your Brain
This post covers the concept of an "exocortex": turning human judgment and experience into a personalized system of knowledge and decision-making through interaction with AI models. It stresses that a human's approval or rejection of an AI's suggestions is more than simple feedback; it is a key signal that internalizes expertise and judgment into the model. The author introduces claudectl, a tool that learns and personalizes AI judgments locally, showing how human expertise can grow alongside the AI it interacts with. The post also argues that this kind of interaction improves human cognitive ability in turn.

https://mercurialsolo.github.io/posts/not-your-weights-not-your-brain/

#exocortex #knowledgedistillation #localai #humanintheloop #aipersonalization
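The approve/reject signal the article describes can be captured with a few lines of local logging. Below is a minimal sketch, assuming a JSONL file as the store; claudectl's actual interface is not shown in the summary, so the function and field names here are hypothetical:

```python
import json
from pathlib import Path

# Hypothetical local log; claudectl's real store may differ.
LOG = Path("decisions.jsonl")

def log_decision(prompt: str, suggestion: str, accepted: bool, reason: str = "") -> dict:
    """Record one approve/reject event so the signal stays on your machine."""
    event = {"prompt": prompt, "suggestion": suggestion,
             "accepted": accepted, "reason": reason}
    with LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(event, ensure_ascii=False) + "\n")
    return event

def acceptance_rate() -> float:
    """Fraction of suggestions accepted so far: a crude view of model/user fit."""
    events = [json.loads(line) for line in LOG.read_text(encoding="utf-8").splitlines()]
    return sum(e["accepted"] for e in events) / len(events)
```

Over time the accepted/rejected pairs form exactly the preference data a local fine-tune or reranker could consume, which is the "compounding" the author is after.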

Not your Weights, Not your Brain

Every time I correct my AI copilot, I'm leaking expertise - dark knowledge about how I think, what trade-offs I accept, where I draw the line. That signal is either compounding me or vanishing into someone else's training run. I chose to keep it.

Barada's log

Akshay (@akshay_pachaar)

Mentions research on automated fine-tuning and on building small models, suggesting that, unlike traditional knowledge distillation (KD) which only goes from a teacher to a small model, better approaches may exist. A research thread worth watching from the perspective of small-model training and automated optimization.

https://x.com/akshay_pachaar/status/2049772296248623319

#finetuning #knowledgedistillation #smallmodels #llm #research

Akshay 🚀 (@akshay_pachaar) on X

@_avichawla Great paper indeed on automated fine-tuning. I read it a few days back. Btw, I remember reading another paper that also confirmed this and proposed a good solution as well to build small models. In traditional KD, you go from teacher → small model. But there's a

X (formerly Twitter)
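The "teacher → small model" step the tweet refers to is usually trained with a temperature-scaled soft-label loss. A minimal sketch of that mechanism in plain Python, with made-up logits just to show the mechanics (not the method from the paper being discussed):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in the standard soft-label KD formulation."""
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

Identical logits give zero loss; the further the student's distribution drifts from the teacher's softened one, the larger the penalty.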

Akshay (@akshay_pachaar)

Mentions a paper on automated fine-tuning, saying that a new approach and a good solution were proposed for building smaller models, distinct from the traditional knowledge distillation (KD) recipe. A tweet in the context of research on small-model training and improved distillation strategies.

https://x.com/akshay_pachaar/status/2049764879608017152

#finetuning #knowledgedistillation #smallmodels #research

Akshay 🚀 (@akshay_pachaar) on X

@_avichawla Great paper indeed on automated fine-tuning. I read it a few days back. Btw, I remember reading another paper that also confirmed this and proposed a good solution as well to build small models. In traditional KD, you go from teacher → small model. But there's a

X (formerly Twitter)

fly51fly (@fly51fly)

Researchers at Google LLC present new work on zero-shot cross-domain knowledge distillation, using YouTube Music as a case study. The research explores how to transfer pretrained knowledge effectively across different domains, and is worth consulting for recommendation, search, and multi-domain model optimization.

https://x.com/fly51fly/status/2039455679875010773

#knowledgedistillation #zeroshot #crossdomain #youtubemusic #google

fly51fly (@fly51fly) on X

[IR] Zero-shot Cross-domain Knowledge Distillation: A Case study on YouTube Music S Ranganathan, N Khani, S Andrews, C Lo… [Google LLC] (2026) https://t.co/efSdMFeWeL

X (formerly Twitter)

Microsoft’s new OPCD technique trims system prompts dramatically while keeping LLM output quality intact. By compressing tokens and applying knowledge distillation, the model stays fast and accurate—great news for open‑source AI projects. Curious how they pull it off? Dive into the full benchmark analysis. #MicrosoftOPCD #LLMCompression #AIPerformance #KnowledgeDistillation

🔗 https://aidailypost.com/news/microsofts-opcd-cuts-system-prompts-while-preserving-ai-performance

right hand: vibe coding a new tool in Gradio to download, convert, and output most streaming media (podcasts, video, audio, etc.) to txt so they can be sent straight into Ollama for distillation / left hand: giving my 14-year-old 4 lb Yorkie a neck massage in a triple-fleeced blanket on my lap

#VibeCoding #Gradio #OpenSource #Ollama #LLMTools #MediaToText #WhisperAI #AIWorkflow #LocalAI #KnowledgeDistillation #Automation
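The pipeline above is essentially three CLI tools chained together: download, transcribe, distill. A sketch that only assembles the commands rather than running them; the yt-dlp, Whisper, and Ollama flags are real options of those tools, but the file and model names are placeholders, and none of this is the author's actual Gradio code:

```python
def build_pipeline(url: str, audio: str = "episode.mp3", model: str = "llama3"):
    """Return the shell commands for a download -> transcribe -> distill chain."""
    transcript = audio.rsplit(".", 1)[0] + ".txt"  # whisper names output after the input file
    return [
        # 1. extract audio from the stream with yt-dlp
        f'yt-dlp -x --audio-format mp3 -o "{audio}" "{url}"',
        # 2. transcribe to plain text with the Whisper CLI
        f'whisper "{audio}" --model base --output_format txt',
        # 3. feed the transcript to a local model via Ollama
        f'ollama run {model} "Summarize this transcript:" < {transcript}',
    ]
```

Keeping each stage a plain command makes it easy to wrap in a Gradio UI, swap the transcription model, or rerun only the failed step.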

DeepSeek R1 is revolutionizing AI with its cost-effective approach. By using specialist models and innovative training methods, it achieves high performance at a fraction of the cost, challenging the traditional economics of AI. More about this technology here:

https://dev.to/abhishek_gautam-01/a-slightly-technical-deep-dive-into-deepseek-r1-2023

#DeepSeek #AI #OpenSource #MachineLearning #ArtificialIntelligence #ReinforcementLearning #KnowledgeDistillation

A Slightly Technical Deep Dive into DeepSeek R1

For years, AI development has been an expensive game, dominated by companies like OpenAI and...

DEV Community

Knowledge Distillation: Principles, Algorithms, Applications

#KnowledgeDistillation

https://neptune.ai/blog/knowledge-distillation

Knowledge Distillation: Principles, Algorithms, Applications

This article explains knowledge distillation, including its core principles, algorithms, and real-world applications.

neptune.ai