fly51fly (@fly51fly)

Seoul National University and Google Research have introduced 'RelFlexformer', an efficient 3D-transformer attention technique for integrable relative positional encodings. This new architecture research improves computational efficiency in models that work with 3D structural data or require integrable positional representations.

https://x.com/fly51fly/status/2054315812232790410

#transformer #attention #positionalencoding #googleresearch #arxiv

[LG] RelFlexformer: Efficient Attention 3D-Transformers for Integrable Relative Positional Encodings B Kim, A Sehanobish, A Dubey, M Oh… [Seoul National University & Google Research] (2026) https://t.co/LFcRSXscRM
