#approximation : a drawing, advancing or being near

- French: approximation

- German: die Annäherung

- Italian: ravvicinamento

- Portuguese: aproximação

- Spanish: aproximación

------------

Try our new word guessing game @ https://24hippos.com

24 Hippos : Word Guessing Game

24 Hippos is an hourly word guessing game that is powered by Word of The Hour (WoTH).

In the thrilling new world of #Lean4, we've finally proven bounds for the Randomized MaxCut #Approximation algorithm that nobody asked for. 📈🔍 Now you can confidently cut those graphs like a pro while impressing... well, nobody. Because who doesn't love an NP-Complete problem just chilling with some approximation algorithms? 🙄🎉
https://abhamra.com/blog/randomized-maxcut/ #RandomizedMaxCut #NPComplete #Algorithms #GraphTheory #HackerNews #ngated
Proving bounds for the Randomized MaxCut Approximation algorithm in Lean4

Arjun's website!
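The algorithm the post jokes about is simple enough to sketch. As a rough illustration (plain Python, not the author's Lean4 proof), a minimal version of the randomized 1/2-approximation: assign each vertex to a side uniformly at random, so every edge is cut with probability 1/2 and the expected cut size is |E|/2, at least half the optimum.

```python
import random

def randomized_maxcut(vertices, edges):
    """Assign each vertex to side 0 or 1 uniformly at random.
    Each edge is cut with probability 1/2, so the expected cut
    size is |E|/2 >= OPT/2: a 1/2-approximation in expectation."""
    side = {v: random.randint(0, 1) for v in vertices}
    cut = [(u, v) for (u, v) in edges if side[u] != side[v]]
    return side, cut

# Example: a 4-cycle has maximum cut 4; the random cut averages 2.
vertices = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
side, cut = randomized_maxcut(vertices, edges)
```

The blog post's contribution is formalizing this expectation bound in Lean4; the sketch above only shows the algorithm being analyzed.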

mistakes were made, who needs perfect symmetry anyway?

#bartop #arcade #fermi #approximation

The same logic applies in mathematics: “2 + 2 = an integer.” Right. “2 + 2 = an even integer.” Even righter. “2 + 2 = 3.999” Almost right! Asimov shows that precision matters, and some answers are closer to truth than others, even if not perfectly exact. #Mathematics #Accuracy #Approximation

Word of The Hour's Annual Survey @ https://wordofthehour.org/r/form

Word of The Hour - Annual Survey (2025)

Your responses to the questions below will directly impact the future of Word of The Hour. Your support and kindness have really meant a lot over the past three years. Thank you so much! Michael Wehar https://wordofthehour.org [email protected]

Google Docs

Is KAN really better than MLP? A depth-separation property between the two architectures

Last summer, a new neural network architecture called Kolmogorov-Arnold Networks (KAN) was released. When the KAN paper came out, the news caused a sensation in the machine learning world, because KAN showed a substantial gain in the quality of approximating various complex functions: the error of the new networks falls significantly faster as the number of parameters grows. However, everything has a price, and the price of such small error values is slow training: KAN trains roughly 10 times slower than the good old MLP. All of this raises the question: how appropriate is it to use the new architecture instead of the familiar MLP? This article constructs a function that can be realized by a two-layer KAN of polynomial width but cannot be approximated by any two-layer ReLU MLP of polynomial width.

https://habr.com/ru/articles/929972/

#kan #mlp #approximation #math #machine_learning #deep_learning #science #neural_networks #research

Is KAN really better than MLP? A depth-separation property between the two architectures

Introduction. Last summer, a new neural network architecture called Kolmogorov-Arnold Networks (KAN) was released. The main paper is openly available on arXiv at the following link. When...

Habr

Pi Approximation Day (22/7) actually is on 21.9911485751286 / 7 = 07-21T23:47:15.237...

#Pi #Approximation #Day #PiApproximationDay
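The date-arithmetic in the post checks out and is easy to reproduce: since 22/7 ≈ π, the "exact" moment falls at day π × 7 = 21.9911... of July, and the fractional day converts to a time of day. A small Python sketch of that conversion:

```python
import math

# Pi Approximation Day: 22/7 ≈ π, so the "exact" moment is
# day π * 7 = 21.9911... of month 7. Convert the fractional
# day into hours, minutes, and seconds:
day = math.pi * 7            # ≈ 21.99114857512855
frac = day - int(day)        # fractional part of the day
hours = frac * 24
minutes = (hours - int(hours)) * 60
seconds = (minutes - int(minutes)) * 60
# int(day) = 21, int(hours) = 23, int(minutes) = 47, seconds ≈ 15.237,
# i.e. 07-21T23:47:15.237, as the post says.
```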

It is well known that the "spherical cow" is an overly aggressive approximation. Hence:

https://arxiv.org/abs/2504.00506

#spherical #cow #approximation

Higher multipoles of the cow

The spherical cow approximation is widely used in the literature, but is rarely justified. Here, I propose several schemes for extending the spherical cow approximation to a full multipole expansion, in which the spherical cow is simply the first term. This allows for the computation of bovine potentials and interactions beyond spherical symmetry, and also provides a scheme for defining the geometry of the cow itself at higher multipole moments. This is especially important for the treatment of physical processes that are suppressed by spherical symmetry, such as the spindown of a rotating cow due to the emission of gravitational waves. I demonstrate the computation of multipole coefficients for a benchmark cow, and illustrate the applicability of the multipolar cow to several important problems.

arXiv.org
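For context, the multipole expansion the abstract alludes to is the standard one for a potential outside a bounded source; schematically, in the usual notation (not taken from the paper):

```latex
\Phi(r,\theta,\varphi) \;=\; \sum_{\ell=0}^{\infty} \sum_{m=-\ell}^{\ell}
  \frac{q_{\ell m}}{r^{\ell+1}}\, Y_{\ell m}(\theta,\varphi)
```

The monopole term (ℓ = 0) depends only on the total charge or mass and is spherically symmetric: that is the "spherical cow". The higher-ℓ terms encode exactly the departures from sphericity the paper is joking about, such as the quadrupole moment relevant to gravitational-wave emission.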

Approximating floating-point multiplication with "addition"

Seen via "Why Does Integer Addition Approximate Float Multiplication? (probablydance.com)"; the original article is "Why Does Integer Addition Approximate Float Mu

https://blog.gslin.org/archives/2025/02/16/12265/%e7%94%a8%e3%80%8c%e5%8a%a0%e6%b3%95%e3%80%8d%e4%be%86%e8%a8%88%e7%ae%97%e6%b5%ae%e9%bb%9e%e6%95%b8%e4%b9%98%e6%b3%95%e7%9a%84%e8%bf%91%e4%bc%bc%e5%80%bc/

#Murmuring #754 #addition #approximate #approximation #float #ieee #integer #multiplication

Approximating floating-point multiplication with "addition"

Seen via "Why Does Integer Addition Approximate Float Multiplication? (probablydance.com)"; the original article is "Why Does Integer Addition Approximate Float Multiplication?". The floating-point numbers here mainly mean IEEE 754-1985 (32-bit single precision and 64-bit double precision); because of how they are represented, "addition" happens to behave somewhat like multiplication. The "addition" deliberately put in quotation marks here means directly taking the IEEE 754...

Gea-Suan Lin's BLOG
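The trick the linked posts describe can be sketched in a few lines: reinterpret each float's bits as an integer, add the integers, and subtract the bit pattern of 1.0. Adding the bit patterns adds the exponents (true multiplication of the powers of two) while the mantissa addition is a rough linear stand-in for multiplying the mantissas. A minimal Python illustration for 32-bit IEEE 754 floats (the helper names are my own, not from the articles):

```python
import struct

BIAS = 0x3F800000  # bit pattern of 1.0 as a float32

def f2i(x):
    """Reinterpret a float32's bits as an unsigned 32-bit integer."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def i2f(n):
    """Reinterpret an unsigned 32-bit integer's bits as a float32."""
    return struct.unpack("<f", struct.pack("<I", n))[0]

def approx_mul(a, b):
    """Approximate a * b for positive floats via integer addition."""
    return i2f(f2i(a) + f2i(b) - BIAS)

# approx_mul(2.0, 2.0) → 4.0 exactly (both mantissas are zero);
# approx_mul(3.0, 5.0) → 14.0 instead of 15.0 (about 7% low).
```

The result is exact when both mantissa fractions are zero and always underestimates otherwise; the relative error stays small enough that the trick is useful as a cheap multiplication substitute.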