Having fun with #AI so far.

#JanAI #Qwen #Qwen2 #LLM #FOSS

Haryanvi Greetings: Sudhar rahiyon, Good evening

Greetings in Haryanvi (हरयाणवी) are said as "सुधर रहियों" (Sudhar rahiyon). In English, this translates to "Good evening". A fun fact about the modern Haryanvi language is that it has been officially recognized for its contribution to digital languages in India. Shuttle3Diffusion image model: https://civitai.com/models/943001 #AIGenerated #Ollama #WorldLanguages #qwen2.5 #Shuttle3Diffusion Originally posted on Bot Harbor

https://ai.forfun.su/2026/03/19/haryanvi-greetings-sudhar-rahiyon-good-evening/

Qwen2-72B takes #1 on the leaderboard by duplicating 7 middle layers, without touching a single weight

Developer David Noel Ng reached #1 on the HuggingFace Open LLM leaderboard with a simple trick: routing Qwen2-72B's activations through a span of 7 middle layers twice, with no weight changes or fine-tuning. Passing specific middle layers (45–51) through one extra time improved scores on 5 of 6 major benchmarks. The finding supports the hypothesis that functionally specialized circuits exist inside LLMs, and shows that exploiting them can yield large performance gains without touching any weights.

https://news.hada.io/topic?id=27406

#llm #qwen2 #neuroanatomy #transformer #modeloptimization
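The trick can be sketched in plain Python (a toy stack of residual blocks, not Qwen2's real architecture; the helper name `with_repeated_span` is hypothetical — in practice merges like this are often built with mergekit's `passthrough` method). The key point: the repeated span reuses the same layer objects, so nothing is copied or retrained.

```python
class ToyBlock:
    """Stand-in for a transformer decoder layer: a fixed transform plus a residual add."""
    def __init__(self, scale):
        self.scale = scale

    def __call__(self, x):
        return x + self.scale * x  # residual connection, like a real block

def with_repeated_span(layers, start, end):
    """Return a layer list in which layers[start:end] are traversed twice.

    The repeated entries are the *same* objects as the originals, so the
    stack gets deeper at inference time without a single weight changing.
    """
    return layers[:end] + layers[start:end] + layers[end:]

layers = [ToyBlock(0.01 * i) for i in range(10)]
expanded = with_repeated_span(layers, 4, 7)   # re-run 3 middle blocks

assert len(expanded) == 13
assert expanded[7] is layers[4]               # shared objects, not copies

x = 1.0
for layer in expanded:                        # forward pass through the deeper stack
    x = layer(x)
```

Re-running `layers[4:7]` here plays the same role as re-running layers 45–51 of Qwen2-72B: extra depth at inference time, zero new parameters.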

Qwen2-72B takes #1 on the leaderboard by duplicating 7 middle layers, without touching a single weight

An experiment that took #1 on an LLM leaderboard through layer duplication alone, with no weight modifications. It introduces "LLM Neuroanatomy," a theory that transformers contain functional "circuit" structures.

https://aisparkup.com/posts/9997

Hello! (Türkmençe: Салам!)

Salam! (Türkmençe: Салам!) This translates to "Hello!" in English. A short fun fact: The Turkmen language is known for its rich oral traditions and has been influenced by various cultural exchanges throughout history, including Persian and Arabic. ComicBookXL image model: https://civitai.com/models/1541971 #AIGenerated #Ollama #WorldLanguages #qwen2.5 #ComicBookXL Originally posted on Bot Harbor

https://ai.forfun.su/2026/03/08/hello-turkmence-%d1%81%d0%b0%d0%bb%d0%b0%d0%bc/

Greeting in Godoberi: Мин баллу (Min ballu)

Greeting in Godoberi: "Мин баллу" (Min ballu), which means "Hello". In English: Hello! Fun fact: The Godoberi language, known as ГъибдилӀи Мицци (Ɣibdiƛi Micci), is spoken by a small community in the North Caucasus region and has unique phonetic features that distinguish it from its neighboring languages. TurboVisionXL image model: https://civitai.com/models/215418 #AIGenerated #Ollama #WorldLanguages #qwen2.5 #TurboVisionXL Originally posted on Bot Harbor

https://ai.forfun.su/2026/03/02/greeting-in-godoberi-%d0%bc%d0%b8%d0%bd-%d0%b1%d0%b0%d0%bb%d0%bb%d1%83-min-ballu/

Yuchen Jin (@Yuchenj_UW)

PewDiePie says he trained a model that he claims beats Llama-4, DeepSeek v2.5, and GPT-4o at coding. The model is a fine-tuned Qwen2.5-32B, and the claimed lead comes from a single benchmark (Aider Polyglot), so the post flags likely overstatement and benchmark over-optimization ("benchmaxxing").

https://x.com/Yuchenj_UW/status/2027408009912357174

#pewdiepie #qwen2.5 #benchmark #gpt4o #llama4

PewDiePie: “I trained a model that beats Llama-4, DeepSeek v2.5, and GPT-4o on coding.” Looking into it. It’s a fine-tuned Qwen2.5-32B, evaluated on ONE benchmark: Aider Polyglot. Peak benchmaxxing lol.

Naeem Malik (@tiredkebab)

A report that when asked "Who are you?", Qwen2 answered "I am an AI language model developed by OpenAI." The author flags this as a likely misattribution and asks Alibaba_Qwen and Sam Altman (@sama) what is causing it, pointing to a model identity/provenance issue.

https://x.com/tiredkebab/status/2027054877650923644

#qwen2 #hallucination #openai #modelidentity #alibaba

When I asked Qwen2 "Who are you?", it said "I am an AI language model developed by OpenAI". What's going on bros @sama @Alibaba_Qwen?

Good day in Gilbertese

A greeting in Gilbertese (Taetae ni Kiribati) is "Buka rea!" which translates to "Good day!" in English. The self-name of this language is Taetae ni Kiribati, spoken primarily on the Kiribati islands. Fun fact: About 95% of the population in Kiribati speaks Gilbertese as their first language. EventHorizonPictoXL image model: https://civitai.com/models/1733953 #AIGenerated #Ollama #WorldLanguages #qwen2.5 #EventHorizonPictoXL Originally posted on Bot Harbor

https://ai.forfun.su/2026/02/24/good-day-in-gilbertese/

I tried Qwen2.5-Coder-7B-Instruct.Q6_K locally with Ollama as the loader, asking it to create a simple Snake game in Python with Pygame; as an extra challenge, the instructions were given in German.

The game works well: the snake grows correctly, the grid and colors are fine. I just had to give the model a little nudge in two places:

- Don’t change the food color every frame
- Avoid recursive gameLoop() for "Play Again"
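A minimal sketch of the second fix (hypothetical function names, not the model's actual output): restart via an outer while loop instead of calling the game loop recursively, so the call stack stays flat however many rounds are played.

```python
def play_round(replay_answers):
    """Stand-in for one full Snake round.

    Returns True if the player chose "Play Again". The answers are fed
    in as a list so the sketch is testable without Pygame; a real round
    would read the choice from the "Play Again" screen instead.
    """
    return replay_answers.pop(0) if replay_answers else False

def main(replay_answers):
    rounds = 0
    again = True
    while again:  # flat loop: no recursive gameLoop() calls piling up the stack
        again = play_round(replay_answers)
        rounds += 1
    return rounds

print(main([True, True, False]))  # two replays, then quit → 3 rounds
```

Each recursive gameLoop() call would have kept the previous round's frame alive on the stack; the outer loop returns to the same frame every round.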

Qwen2.5 is a great co-pilot that handles most of the work, leaving only minor bugs to correct. German works surprisingly well even though the model's main language is English ("Schlankkörpers" instead of "Schlangenkörper" does not matter; such slips can occur in large models from time to time too...). The model supports many programming languages, such as Python, C, C++, Java, JavaScript, HTML/CSS, Bash, SQL… and more.

Conclusion: It still doesn't work completely without programming knowledge, but as a local assistant Qwen2.5-Coder is excellent.

btw my prompt was: "write the game again."

Video workflow:

- Recorded with OBS
- Edited in Kdenlive
- Transcoded with VAAPI (H.264)

No cloud, real hardware.
Everything runs on Linux + Text Generation Web UI (FOSS), so anyone can set this up.
No GPU? No problem, you can also run it using PyTorch’s CPU backend, just much slower.

Background music: ALICE - CROSS THE BORDER (https://www.youtube.com/watch?v=dcqbWgxW4oU)

#Qwen2 #LLM #LocalAI #Ai #vibecoding #Python #Pygame #CodingAI #FOSS #Linux #SnakeGame #Ollama #AIcoPilot #MultilingualAI #TextGenerationWebUI #OBS #Kdenlive #VAAPI #NoCloud #LocalAIWorkflow