Power transformers are critical components in modern electrical infrastructure, enabling efficient transmission and distribution of electricity across long distances.
#Transformer #wolink #ourmechanicalworld
https://www.ourmechanicalworld.com/archives/742
Top 10 Power Transformer Manufacturers In The World - Ourmechanicalworld.com

Transformers are essential components in modern electrical systems, enabling efficient voltage regulation and power transmission across grids, industrial facilities, and residential networks.
#Transformer #wolink #ourmechanicalworld
https://www.ourmechanicalworld.com/archives/6104
What Liquid Is Inside A Transformer And What Is It Used For - Ourmechanicalworld.com

Transformer cooling is a fundamental aspect of power system design, directly influencing efficiency, reliability, and operational safety.
#Transformer #wolink #ourmechanicalworld
https://www.ourmechanicalworld.com/archives/5803
How Are Transformers Cooled? 10 Key Methods - Ourmechanicalworld.com

The 7 main causes of transformer explosions include insulation failure, overloading, electrical faults, oil degradation, lightning surges, poor maintenance, and external damage.
#transformer #wolink #ourmechanicalworld
https://www.ourmechanicalworld.com/archives/11577
Why Do Transformers Explode? 7 Main Causes Explained - Ourmechanicalworld.com


A quick reference on attention (self-attention, cross-attention, multi-head attention)

The attention mechanism is a technique in artificial intelligence that lets a neural network dynamically determine which parts of the input data matter most for the current task. It works by computing importance weights for the different input elements: more important elements receive larger weights, less important ones smaller. The model then forms a weighted sum of the element representations, producing a new context vector. Self-attention, in turn, helps the model understand how the different elements of the input relate to one another, for example how separate pieces of information interact and influence each other within the overall context. This mechanism provides logical coherence and a holistic understanding of the entire data structure.
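The weighted-sum computation described above is, concretely, scaled dot-product self-attention. A minimal PyTorch sketch (the projection matrices and sizes here are illustrative, not from the article):

```python
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    # Project the input into queries, keys, and values.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.size(-1)
    # Importance weights: scaled dot-product similarity between queries and keys.
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)  # each row sums to 1
    # Context vectors: weighted sum of the value representations.
    return weights @ v, weights

torch.manual_seed(0)
seq_len, d_model = 4, 8
x = torch.randn(seq_len, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
out, weights = self_attention(x, w_q, w_k, w_v)
```

Each output row is a context vector: a mixture of all input positions, weighted by how relevant each one is to the query position.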

https://habr.com/ru/articles/1020624/

#machine_learning #attention #artificial_intelligence #selfattention #deep_learning #pytorch #transformer #beginner #math


Github Awesome (@GithubAwesome)

GuppyLM, a 9-million-parameter language model, has just been released. It is a 6-layer vanilla transformer trained from scratch, using neither SwiGLU nor RoPE. It can be trained in 5 minutes on a free Colab T4 GPU, and the entire pipeline is public, making it useful for training and reproducing small models.

https://x.com/GithubAwesome/status/2041322379071426758

#languagemodel #opensource #transformer #smallmodel #airesearch
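The "vanilla" recipe described (standard feed-forward instead of SwiGLU, learned absolute positions instead of RoPE) can be sketched in PyTorch. This is a hypothetical illustration, not GuppyLM's actual code; all sizes are assumptions:

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """Vanilla decoder-style LM: learned positions, GELU FFN, causal mask."""
    def __init__(self, vocab=8192, d_model=128, n_heads=4, n_layers=6, max_len=256):
        super().__init__()
        self.tok = nn.Embedding(vocab, d_model)
        self.pos = nn.Embedding(max_len, d_model)  # learned positions, no RoPE
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model,
            activation="gelu", batch_first=True)     # plain GELU FFN, no SwiGLU
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab)

    def forward(self, ids):
        t = ids.size(1)
        h = self.tok(ids) + self.pos(torch.arange(t, device=ids.device))
        # Causal mask makes the encoder stack behave like a decoder.
        mask = nn.Transformer.generate_square_subsequent_mask(t)
        return self.head(self.blocks(h, mask=mask))

model = TinyLM()
logits = model(torch.randint(0, 8192, (1, 16)))
```

At this scale the model fits comfortably in a free Colab T4's memory, which is what makes minutes-long from-scratch training plausible.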


Amazon To Launch A New AI Phone Called Transformer - Ep. 131 #amazon #transformer #smartphone #cellphone

Thanks for checking out my Short. Want a deeper dive into stories like this? Check out my long-form videos on YouTube:
https://www.youtube.com/@TheBusinessBehindTheNews

#business #businessnews #tbbtn #youtube #shorts #reels #tiktoks #vids #fyp

https://www.pub-x.com/808067/ [🦖Dino Breaker 20th Anniversary!] Reviewing all five "Dino Breaker" vehicles that transform into dinosaurs in one go! [Toys"R"Us exclusive] #Candy #chinese #hasbro #KPopBoys #KPopIdols #KOREA #review #toy #transformer #TakaraTomy #ダガング #Dino(DINO) #Transformers #China #combining #transformation #Korea #candytoy #cheaptoy

Strategic risk is a data-loss problem. Most models fail because they try to "read" the language of a 10-K.

We’ve taken a different path with the Shiki Engine. We decomposed BERT4Rec, salvaged the Encoder/Cross-Attention layers, and amputated the semantic dependency. By using Learnable Positional Embeddings, we map the structural DNA of a vendor ecosystem as a high-dimensional Latent Vector.

We don't want the story; we want the sequence of failure. 🧬
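Shiki Engine's internals aren't public, but the named ingredients — a BERT4Rec-style encoder over event sequences with learnable positional embeddings and no text features — can be sketched roughly like this (all names and sizes are hypothetical):

```python
import torch
import torch.nn as nn

class SequenceEncoder(nn.Module):
    """Illustrative BERT4Rec-style encoder: event IDs plus learnable
    positional embeddings, no semantic/text inputs at all."""
    def __init__(self, num_events, max_len=50, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.event_emb = nn.Embedding(num_events, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)  # learnable positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, event_ids):
        # Position indices carry the sequence structure the model attends over.
        positions = torch.arange(event_ids.size(1), device=event_ids.device)
        h = self.event_emb(event_ids) + self.pos_emb(positions)
        return self.encoder(h)  # one latent vector per event

model = SequenceEncoder(num_events=1000)
events = torch.randint(0, 1000, (2, 10))  # batch of 2 event sequences
latent = model(events)
```

Because the inputs are IDs and positions rather than text, the latent vectors encode the order and structure of events — the "sequence of failure" — instead of their narrative content.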

#SDS #Transformer #InfoSec #AI

Forecasting time series data

A few days ago I came across "TimesFM (via)". It has apparently been around for a while: the earliest piece was Google's 2024 paper "A decoder-only foundation model for time-series forecasting", followed by version 2.0 at the end of 2024, and then in September 2025 there was a 2.

Gea-Suan Lin's BLOG