Bindu Reddy (@bindureddy)

Codex 5.3 reportedly surpasses Opus 4.6 in agentic coding and is very fast. However, the xHigh version can be expensive, and its overall global average score still lags behind Opus 4.6, so this new-model news comes with mixed trade-offs.

https://x.com/bindureddy/status/2026772003496276276

#codex #agentic #aimodels #opus #benchmark

Bindu Reddy (@bindureddy) on X

Codex 5.3 TOPS AGENTIC CODING. Codex 5.3 surpasses Opus 4.6 to top agentic coding. It's also BLAZINGLY fast. That said, the xHigh version can be very expensive. Its overall global average score lags behind Opus 4.6, which is the current leader.

X (formerly Twitter)

cedric (@cedric_chee)

A reminder, posted with the quip "don't tell the Slopus", that the guide needs updating for Codex 5.3 and Opus 4.6. It is a developer-facing notice that the latest versions (Codex 5.3, Opus 4.6) should be reflected, announcing an upcoming update to the tool and model usage guide.

https://x.com/cedric_chee/status/2026706074339025286

#codex #opus #ai #guide

cedric (@cedric_chee) on X

"don't tell the Slopus" was fun. About time to update this guide for Codex 5.3 and Opus 4.6 https://t.co/0xuMdIZ8BC

X (formerly Twitter)

Dan McAteer (@daniel_mac8)

Explains that Opus 4.6 can now handle tasks that take humans 15 hours, and that the key is auto-compaction. Thanks to automatic context compaction, what used to be a liability has become an asset, and models have become far better at managing their own context windows.

https://x.com/daniel_mac8/status/2026639943662010398

#opus #autocompaction #contextwindow #efficiency

Dan McAteer (@daniel_mac8) on X

Opus 4.6 can now work on tasks that take humans 15h. If you've been paying attention, you know why: > Auto-compaction It used to be a liability, now it's an asset. Models are incredible at managing their own context window now.

X (formerly Twitter)
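The auto-compaction idea described above can be sketched as a simple loop: when the running transcript exceeds a token budget, the oldest turns are collapsed into a summary so the agent can keep working on long tasks. This is only an illustration of the concept; the token counter and summarizer below are hypothetical stand-ins, not Anthropic's actual mechanism.

```python
# Minimal sketch of context auto-compaction: when the message history
# exceeds a token budget, collapse the oldest turns into one summary
# message, keeping only the most recent turns verbatim.

def count_tokens(msg: str) -> int:
    # Crude stand-in for a real tokenizer: roughly one token per word.
    return len(msg.split())

def summarize(msgs: list[str]) -> str:
    # Stand-in summarizer; a real system would ask the model itself.
    return f"[summary of {len(msgs)} earlier messages]"

def compact(history: list[str], budget: int, keep_recent: int = 2) -> list[str]:
    """Return a compacted history if it exceeds `budget` tokens."""
    total = sum(count_tokens(m) for m in history)
    if total <= budget or len(history) <= keep_recent:
        return history
    old, recent = history[:-keep_recent], history[-keep_recent:]
    return [summarize(old)] + recent

history = [f"step {i}: " + "tool output " * 20 for i in range(10)]
compacted = compact(history, budget=100)
print(len(history), "->", len(compacted))  # 10 -> 3
```

In a real agent loop, `summarize` would itself be a model call, and compaction would run between tool invocations whenever the transcript nears the context limit.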

Perplexity introduces the Computer, an environment for end-to-end development.

The system orchestrates 19 language models under the direction of Opus. Agents handle research, coding, and deployment autonomously. A persistent memory keeps context across projects. Billing is strictly per token consumed. #Perplexity #Opus #KI
https://www.all-ai.de/news/news26top/perplexity-computer-19

Perplexity Computer: platform uses 19 AI models simultaneously

A new platform integrates hundreds of interfaces and persistent memory. It plans technical workflows entirely on its own.

All-AI.de

Bindu Reddy (@bindureddy)

Anthropic claims that Chinese AI labs used its model outputs without authorization, specifically by prompting the Opus model millions of times and training new LLMs on the results. This has surfaced as a significant debate over data-use ethics and copyright in AI development.

https://x.com/bindureddy/status/2026017580658622823

#anthropic #opus #llm #aiethics #chinaai

Bindu Reddy (@bindureddy) on X

Oops, Anthropic says all the Chinese labs stole their model outputs! The easiest way to train a frontier LLM is to prompt Opus millions of times and then simply train on its outputs. Every AI lab does this to some extent, but it seems the Chinese models did it very blatantly. In

X (formerly Twitter)
Did #anthropic just change access to the 1M context window models? I was busily making cool stuff, and then all of a sudden the 1M models (#sonnet / #opus) started returning 401, so I had to switch to their smaller versions.
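A defensive pattern for the situation described above is to try the 1M-context model first and fall back to a smaller variant when the API rejects the request. The model names and error type below are placeholders rather than Anthropic's actual identifiers, and the request function is stubbed so the retry logic itself is the focus.

```python
# Sketch of a model-fallback wrapper: attempt the large-context model,
# and on an authorization failure (HTTP 401) retry with a smaller one.
# `call_model` is a stub; a real client would perform the API request.

class AuthError(Exception):
    """Stand-in for an HTTP 401 response from the provider."""

PREFERRED = ["opus-1m", "sonnet-1m", "sonnet"]  # hypothetical model names

def call_model(model: str, prompt: str) -> str:
    # Stub: pretend the 1M variants now return 401, as in the post.
    if model.endswith("-1m"):
        raise AuthError(f"401 Unauthorized for {model}")
    return f"{model}: ok"

def complete_with_fallback(prompt: str, models=PREFERRED) -> str:
    last_err = None
    for model in models:
        try:
            return call_model(model, prompt)
        except AuthError as err:
            last_err = err  # access revoked; try the next model down
    raise last_err

print(complete_with_fallback("hello"))  # sonnet: ok
```

In production code the except clause would match the real client library's authentication error, and the fallback order would be configured rather than hard-coded.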

新清士@(生成AI)インディゲーム開発者 (@kiyoshi_shin)

A user observes that Anthropic's Opus model can produce quite adult-oriented creative writing. The user surmises that Opus reflects a design philosophy of respecting users' expressive freedom as a creative tool. It is cited as an example of the balance between model moderation policy and creative freedom, offering insight into how models intended for AI-assisted writing are designed.

https://x.com/kiyoshi_shin/status/2025705882462662898

#anthropic #opus #ai #creative #model

新清士@(生成AI)インディゲーム開発者 (@kiyoshi_shin) on X

Opus can write fairly explicit 18+ content... But it takes a knack, and I won't share the exact method... If you ask it head-on, it refuses. I've thought about why Anthropic allows this, and my guess is a design philosophy: the clearer the model's role as a "creative tool", the less it should obstruct the user's intent.

X (formerly Twitter)

AOM is also working on OAC, the successor to the #Opus audio codec. I'm surprised for a few reasons:

1. Didn't feel like there was any room to grow. AAC is transparent at 192k if not lower. Opus is great at 96k and hopefully transparent at some point (128k?).
2. Opus has been around over a decade yet still feels like it hasn't had its moment. It's used in the background a lot (even at Netflix and Spotify?) but I always hesitate to share Opus because it never seems to fucking work on e.g. Discord and iOS.
3. MP3 was tinkered with for decades and grew into LAME with VBR so a 20 year old codec was still fantastic.
4. Opus is the greatest name for an audio codec of all time.

But I suppose AI or traditional codecs will continue to improve, especially below 96k.

https://github.com/AOMediaCodec/oac
https://wiki.xiph.org/Opus_Recommended_Settings

GitHub - AOMediaCodec/oac

Contribute to AOMediaCodec/oac development by creating an account on GitHub.

GitHub

Opus Duo would have been a funny name.

#Opus #OAC

Not a pretty name. Would have preferred Opus 2.

AOMedia Open Audio Codec "OAC" Aims To Be The Successor To #Opus

https://www.phoronix.com/news/AOMedia-OAC-Open-Audio-Codec

#OAC

AOMedia Open Audio Codec "OAC" Aims To Be The Successor To Opus

While the Alliance For Open Media 'AOMedia' is most known for developing the AV1 open video codec, the associated AV1 Image File Format (AVIF), and the next-generation AV2, they are now working on the Open Audio Codec (OAC).