FCLC

@fclc@mast.hpc.social
929 Followers
686 Following
7.6K Posts

HPC, BLAS, I make things FAST

Personal account 😊 

Currently making #RISCV and #HPC work on reduced precision, high IO bandwidth hardware

Previously making the weather forecast more precise by using less precise numbers
TL;DR: 🇨🇦🐧🧑🏼‍💻🚴🏎️🧗🏼 🟦🟪🟥

Lots of #FP16 & #BLAS these days

Interests: #AVX512 #RVV #SVE #SME #SIMD #SYCL #F1 #HPC & making code faster than a golden retriever with the zoomies

Born below 365PPM

(Moderator for HPC.social)

STFU WITH THIS NONSENSE. USERS ALREADY CAN'T TRUST THE MONOTONICITY OF YOUR MATRIX ACCUMULATOR.

https://developer.nvidia.com/blog/introducing-nvfp4-for-efficient-and-accurate-low-precision-inference

Introducing NVFP4 for Efficient and Accurate Low-Precision Inference

To get the most out of AI, optimizations are critical. When developers think about optimizing AI models for inference, model compression techniques—such as…

NVIDIA Technical Blog

Excuse me for a moment but WHAT THE ACTUAL FLYING FUCK NVIDIA?!?!?

WHY ARE YOU REINVENTING MXFP4 WITH EXTRA STEPS!?!?? WE JUST STANDARDIZED THIS SHIT FOR FUCKS SAKE. NONE OF THIS NVFP4 BS. STOP IT

AHHHHHHHHHHHHHHHHHHHHHH
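For context on what's being reinvented: both formats store FP4 E2M1 elements behind a shared per-block scale. MXFP4 (the OCP Microscaling standard) uses a power-of-two (E8M0) scale over 32-element blocks; NVFP4 uses an FP8 (E4M3) scale over 16-element blocks plus a second-level per-tensor scale. A rough numpy sketch of the two scaling schemes (simplified: the NVFP4-style scale is kept in full precision here rather than rounded to E4M3, and the per-tensor second level is omitted):

```python
import numpy as np

# Magnitudes representable in FP4 E2M1, the 4-bit element format shared by
# MXFP4 and NVFP4: one sign bit plus these eight magnitudes.
FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])

def quantize_fp4(x, scale):
    """Round x/scale to the nearest E2M1 magnitude, keeping the sign."""
    mag = np.abs(x) / scale
    idx = np.abs(mag[:, None] - FP4_GRID[None, :]).argmin(axis=1)
    return np.sign(x) * FP4_GRID[idx] * scale

def mxfp4_style(block):
    """MXFP4-ish: 32-element block, shared scale restricted to a power of two."""
    amax = np.abs(block).max()
    # Smallest power-of-two scale that maps amax inside the E2M1 range [0, 6].
    scale = 2.0 ** np.ceil(np.log2(amax / FP4_GRID[-1])) if amax > 0 else 1.0
    return quantize_fp4(block, scale)

def nvfp4_style(block):
    """NVFP4-ish: 16-element block, scale = amax/6 (real NVFP4 rounds this
    scale to FP8 E4M3, which is not simulated here)."""
    amax = np.abs(block).max()
    scale = amax / FP4_GRID[-1] if amax > 0 else 1.0
    return quantize_fp4(block, scale)

rng = np.random.default_rng(0)
x = rng.standard_normal(32).astype(np.float64)
print("mxfp4-style mean abs error:", np.abs(mxfp4_style(x) - x).mean())
print("nvfp4-style mean abs error:", np.abs(nvfp4_style(x[:16]) - x[:16]).mean())
```

The finer-grained, non-power-of-two scale is the whole pitch for NVFP4; the complaint above stands because MXFP4 was standardized to avoid exactly this kind of vendor-specific variant.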

As much as I like macOS, the fact that you can't universally turn off video reactions for all apps is moronic.

I speak with my hands. I'll be halfway through discussing cache latency tradeoffs, and some godforsaken balloons are covering my screen 😾

🚀 Great news for OpenMP in Python!

NumPy 2.3 includes early OpenMP support, making sorting operations like np.sort and np.argsort faster by using multiple processor cores — a big step for performance!

🛠️ This new feature is off by default but can be turned on during installation with -Denable_openmp=true

This marks the beginning of more parallel computing support in NumPy!

https://www.phoronix.com/news/NumPy-2.3-Released

#NumPy #Python #Performance #OpenMP #HPC

NumPy 2.3 Introduces OpenMP Parallelization Support

NumPy 2.3 is out today as the latest release of this widely-used library for scientific computing
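A minimal sketch of trying the new parallel path, assuming a NumPy 2.3 build configured with `-Denable_openmp=true` (on a default build this runs single-threaded but is otherwise identical; the thread count comes from `OMP_NUM_THREADS`, which must be set before NumPy is imported):

```python
import os
# Must be set before importing NumPy for the OpenMP runtime to pick it up.
os.environ.setdefault("OMP_NUM_THREADS", "8")

import time
import numpy as np

rng = np.random.default_rng(42)
x = rng.random(10_000_000)

t0 = time.perf_counter()
s = np.sort(x)            # parallelized in OpenMP-enabled 2.3+ builds
t1 = time.perf_counter()
order = np.argsort(x)     # argsort benefits from the same support
t2 = time.perf_counter()

print(f"np.sort:    {t1 - t0:.3f}s")
print(f"np.argsort: {t2 - t1:.3f}s")
```

Comparing the timings with `OMP_NUM_THREADS=1` versus a higher count is the quickest way to check whether your build actually has the feature enabled.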

2025 Lyttle Lytton Contest winners

http://adamcadre.ac/lyttle/2025.html

The 2025 Lyttle Lytton Contest

LLNL has an interesting vision of the future of HPC and workflows that aligns with a lot of what I heard at ISC: #HPC is no longer just the supercomputer, but the end-to-end services and ecosystem that enable discovery. The description is in Attachment (1) here: https://hpc.llnl.gov/fg-hpcc-rfi

Request for Information–Future Generation High Performance Computing Center | HPC @ LLNL

This website enables public access to Request for Information No. HPC-007 (RFI) pertaining to a Future Generation High Performance Computing Center. The RFI points of contact are LLNS Contract Analyst Gary Ward (ward31@llnl.gov) and Distinguished Member of Technical Staff Dr. Todd Gamblin (gamblin2@llnl.gov).

We’ve got a request for information out on where we want to take Livermore Computing and other #HPC centers in the next five years.

https://hpc.llnl.gov/fg-hpcc-rfi

Check it out and send us your thoughts.

From that other platform (I do wish @radxa would join Bluesky or the Fediverse, as I personally do not use Elon's stuff at all if I can avoid it!). Congratulations to UCSD and Texas Tech! https://x.com/RadxaComputer/status/1934949891308187686

Radxa (@RadxaComputer) on X

$6K hardware cap + 250W power limit, yet students built “shoebox supercomputers” and raced them head-to-head at #SBCC25! 🏆 UC San Diego & Texas Tech dominated with #Radxa Rock Pi 5B and X4 clusters—proving price-perf edge AI is here. Learn more: https://t.co/a0TBUzGpIc

X (formerly Twitter)

AHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH