Interested in numerical computing using open-source software?

Join our webinar on Scilab to learn about modelling, simulation, optimization, and data visualization for engineering applications.

📅 16 May 2026 | 4:00 PM IST
🔗 https://zoom.us/webinar/register/WN_-0CUKvimSxOfpaU06v4Yqw#/registration

#Scilab #OpenSource #NumericalComputing #Engineering #FOSSEE #IITBombay #Webinar

LPython: High-performance typed Python compiler

LPython is an open-source compiler that compiles type-annotated Python code for high performance, with multiple backends including LLVM, C, C++, and WASM. It is optimized for numerical computation and array operations, and offers compatibility and interoperability with CPython, so libraries such as NumPy, TensorFlow, and PyTorch can still be called. It also supports JIT compilation. The project is currently in alpha, with active development and bug fixing under way; planned work includes support for various CPU, GPU, and TPU architectures and Jupyter integration, making it useful for speeding up numerical Python code in AI/ML.

https://lpython.org/

#python #compiler #jit #llvm #numericalcomputing

LPython

Feature Highlights

LPython is in heavy development: some features work today, and others are still being implemented.

Works today: best possible performance for numerical, array-oriented code. LPython gives you the speed you need for your numerical, array-oriented code. With LPython, you can write Python code that runs as fast as C or C++, because LPython compiles your code to optimized machine code, which is the fastest way to run code on a computer.

LPython
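As a rough illustration of the kind of code this targets (my own sketch: an ordinary type-annotated numerical kernel in the statically typable subset of Python, which also runs unchanged under CPython; exact LPython type names and CLI usage may differ):

```python
# A numerical, array-oriented kernel written with explicit type
# annotations, the style an ahead-of-time compiler like LPython
# can turn into optimized machine code. Plain CPython runs it too.

def dot(x: list[float], y: list[float]) -> float:
    """Dot product of two equal-length vectors."""
    s: float = 0.0
    for i in range(len(x)):
        s += x[i] * y[i]
    return s

print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
```

The whole point of the typed subset is that every variable has a fixed, known type, so the loop above can compile to the same tight machine code a C compiler would emit.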

GNU Octave 11 introduces significant improvements in performance, numerical precision, the GUI, and MATLAB compatibility. Read about everything new in the latest version. #GNUOctave #OpenSource #NumericalComputing #Linux

https://www.linuxeasy.org/gnu-octave-11-tutte-le-novita-della-nuova-release/?utm_source=mastodon&utm_medium=jetpack_social

GNU Octave 11: all the new features of the new release


Linux Easy - News from the Linux World

I recently dumped my notes on modulus of convergence for hypergeometric functions on my website. I also had some thoughts on numerical accuracy. Not very valuable thoughts, but thoughts nonetheless.

For those who are wondering why anyone would care: many math libraries, such as GSL, Boost, and so on, suck. If you are trying to do intense calculations with any sort of decent accuracy, these libraries have dusty corners that fail, and they don't document where those corners are. Or they don't have routines for complex arguments. I write my own routines for these cases.

#Math #RealAnalysis #NumericalComputing

https://www.skewray.com/articles/numerical-accuracy-of-generalized-hypergeometric-series

Numerical Accuracy of Generalized Hypergeometric Series | Skewray Research

Adding floating point numbers is evil. Can we avoid the pitfalls?
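One standard mitigation for the pitfalls of floating-point addition (my own sketch, not code from the article) is Kahan compensated summation, which carries the low-order bits lost at each add in a separate compensation variable:

```python
def kahan_sum(values):
    """Compensated summation: recover the low-order bits
    that plain left-to-right addition throws away."""
    total = 0.0
    comp = 0.0  # running compensation for lost low-order bits
    for v in values:
        y = v - comp          # apply the correction to the next value
        t = total + y         # low-order bits of y may be lost here...
        comp = (t - total) - y  # ...but this recovers them
        total = t
    return total

# 0.1 is not exactly representable in binary; summing a million
# copies lets rounding error accumulate in the naive sum, while
# the compensated sum stays much closer to the true value.
vals = [0.1] * 10**6
print(sum(vals), kahan_sum(vals))
```

The correction line looks like it should algebraically cancel to zero; it only does useful work because floating-point addition is not associative, which is exactly the evil the post is talking about.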

A few years ago (mid 2023), I wrote up some research notes regarding the modulus of convergence of the generalized hypergeometric functions. This month I wrote up and posted a series of articles that are those notes, cleaned up a bit, and this article is the wrap-up of the series.

#Math #RealAnalysis #NumericalComputing

https://www.skewray.com/articles/modulus-of-convergence-for-generalized-hypergeometric-functions

Modulus of Convergence for Generalized Hypergeometric Functions | Skewray Research

Upper limits on the number of terms required to compute generalized hypergeometric functions

We all get the feeling that, day by day, the world is converging towards disaster. But what tells us how fast? The Modulus of Convergence does!

#Math #RealAnalysis #NumericalComputing

https://www.skewray.com/articles/modulus-of-convergence

Modulus of Convergence | Skewray Research

If you want to know how fast a sequence converges, then the Modulus of Convergence will hit the spot.
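As a toy illustration (my own sketch, not from the article): for a sequence a_n converging to L, a modulus of convergence maps each tolerance eps to an index N(eps) beyond which |a_n - L| < eps. For the geometric sequence a_n = 2**-n with limit 0, N(eps) has a closed form:

```python
import math

def modulus_of_convergence_geometric(eps: float) -> int:
    """Smallest N with |2**-n - 0| < eps for all n >= N,
    i.e. the least N such that 2**-N < eps (since terms decrease)."""
    # 2**-N < eps  <=>  N > log2(1/eps)
    return max(0, math.floor(math.log2(1.0 / eps)) + 1)

N = modulus_of_convergence_geometric(1e-3)
print(N)  # 10, since 2**-10 = 1/1024 < 1e-3 <= 2**-9
```

The same idea, with an inequality instead of a closed form, is what an upper bound on the number of series terms gives you: a computable N(eps) without having to inspect the tail.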

In mathematics, we say that a function is bounded if we can restrict its image. Oddly, we never seem to 'bind' a function, though. I can find bounds on the remainder of generalized hypergeometric functions, and I never used the word 'bind' either. Maybe 'bounding' refers to bunnies and deer?

#math #RealAnalysis #NumericalComputing

https://www.skewray.com/articles/remainder-of-the-conway-maxwell-poisson-distribution-normalization-factor

Remainder of the Conway-Maxwell-Poisson Distribution Normalization Factor | Skewray Research

Various bounds on the remainder of generalized hypergeometric functions

The generalized hypergeometric series are ubiquitous in the world of computing special functions, for certain amounts of ubiquity. Turns out the speed of convergence is related to the obscure Conway-Maxwell-Poisson distribution, which no one has ever heard of - pretty much the opposite of ubiquity.

#math #RealAnalysis #NumericalComputing

https://www.skewray.com/articles/bounding-the-remainder-of-generalized-hypergeometric-series

Bounding the remainder of generalized hypergeometric series | Skewray Research

The speed of convergence of the generalized hypergeometric series is related to the obscure Conway-Maxwell-Poisson distribution.
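For concreteness (my own sketch; the distribution itself is standard): the Conway-Maxwell-Poisson normalization constant is Z(lam, nu) = sum over n >= 0 of lam**n / (n!)**nu, itself a hypergeometric-type series whose terms satisfy the simple ratio t_{n+1}/t_n = lam / (n+1)**nu. At nu = 1 it reduces to the Poisson case, exp(lam):

```python
import math

def cmp_normalization(lam: float, nu: float,
                      tol: float = 1e-15, max_terms: int = 10_000) -> float:
    """Sum Z(lam, nu) = sum_n lam**n / (n!)**nu term by term,
    building each term from the previous via the ratio
    t_{n+1}/t_n = lam / (n+1)**nu, stopping once terms are tiny."""
    total, term = 0.0, 1.0  # n = 0 term is 1
    for n in range(max_terms):
        total += term
        term *= lam / (n + 1) ** nu
        if term < tol * total:
            break
    return total

# Sanity check: nu = 1 recovers the Poisson normalization e**lam.
print(cmp_normalization(2.0, 1.0), math.exp(2.0))
```

That term ratio is exactly the quantity the convergence bounds are about: how fast it shrinks controls how many terms the sum needs.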

Accurately computing generalized hypergeometric functions is hard. How many terms do we need? Guess we need a general expression for the size of the terms in the series. Oh, wait, I've got one right here!

#math #RealAnalysis #NumericalComputing

https://www.skewray.com/articles/estimating-the-terms-of-generalized-hypergeometric-series

Bounding the terms of generalized hypergeometric series | Skewray Research

Generalized hypergeometric functions have a skew of parameters. Can we put simple bounds on the terms of the summation?

I use generalized hypergeometric functions a lot. That means I can type "generalized hypergeometric function" and not make a typo. Is it possible to compute these exactly? Sometimes!

#math #NumericalComputing

https://www.skewray.com/articles/computing-generalized-hypergeometric-function-from-series

Computing Generalized Hypergeometric Function from Series | Skewray Research

How are the generalized hypergeometric functions computed? Can we get an exact answer? Sometimes, yes!
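To illustrate the series approach (my own hedged sketch, not the article's code): pFq(a; b; z) = sum_n [(a_1)_n ... (a_p)_n / ((b_1)_n ... (b_q)_n)] z**n / n!, and the term ratio t_{n+1}/t_n = prod(a_i + n) / prod(b_j + n) * z / (n + 1) lets each term be built from the last. With rational parameters and z, every partial sum is exact in rational arithmetic; and when some a_i is a non-positive integer the series terminates, so the full value is exact. Sometimes!

```python
from fractions import Fraction

def hyp_pfq_partial(a, b, z, n_terms):
    """Partial sum of pFq(a; b; z) in exact rational arithmetic.
    Each term comes from the previous one via the ratio
    t_{n+1} = t_n * prod(a_i + n) / prod(b_j + n) * z / (n + 1)."""
    z = Fraction(z)
    total = Fraction(0)
    term = Fraction(1)  # the n = 0 term
    for n in range(n_terms):
        total += term
        num = Fraction(1)
        for ai in a:
            num *= ai + n
        den = Fraction(n + 1)
        for bj in b:
            den *= bj + n
        term *= z * num / den
        if term == 0:  # terminating series: the sum is now exact
            break
    return total

# 1F0(1; ; z) is the geometric series 1/(1 - z); 20 terms at z = 1/2.
approx = hyp_pfq_partial([Fraction(1)], [], Fraction(1, 2), 20)
# Terminating case: 2F1(-2, 1; 1; z) = (1 - z)**2 exactly; z = 1/3 gives 4/9.
exact = hyp_pfq_partial([Fraction(-2), Fraction(1)], [Fraction(1)], Fraction(1, 3), 50)
print(approx, exact)
```

The articles linked above are about the part this sketch dodges: how large n_terms must be, and how big the neglected tail is, when the series does not terminate.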