#Matlab seems to be mostly recovered. Anyone installed new downloads or add-ons successfully? Anyone noticed any problems?
@adredish all working now at our end
* Based on n = 1 student claim
@ninsel @adredish Matlab? That’s from ancient Babylon right? I thought the last living user had died out ages ago…

@eejd @ninsel

And what vectorized programming language do you use?

As noted in the discussion chains, python and numpy are orders of magnitude less efficient, being both slower and more memory intensive than Matlab.* (Even ignoring the *nightmare* bloat that is the python ecosystem.) Older, closer-to-the-architecture languages like C and its descendants are difficult to maintain clear codebases for, and very hard to write efficient GPU and NPU algorithms in. Although there is some evidence that Julia might achieve the same efficiency in a clean, open-source form, it doesn't have the codebase that the other languages do.

* Matlab is particularly well-suited for neurophysiology. Neural analysis algorithms that only run on the supercomputer in python run easily on my desktop in Matlab with an inexpensive GPU.

I would say that the serious neurophys labs I know are about 60/40 python to Matlab, with the python labs defending their choice based on "open source code" and the Matlab labs defending theirs based on "efficiency", although you are right that the Matlab labs generally seem unusually embarrassed about it.

@adredish @eejd I am guessing you mean 60/40 within as well as across labs?

For us: different tools for different applications. Can't say as I care much (or as much as I should) about computational efficiency. Where it matters, e.g. in our DIPLOMAT (multi-animal tracking) software, our engineer has found many tricks for optimizing. (Also, I rage-quit Matlab for video analysis in 2015.)

Interesting to me that new students are split: some seem to take to Matlab more readily, others to Python. The rare student who knows what they are doing is able to adapt to whichever is more appropriate to a task.

My one point of embarrassment is that I do all my stats in Matlab. It's the tool I know!

@adredish i promise i come in peace and do not come for language wars (and would encourage anyone else who sees this and is tempted to reply to do the same)

I have seen you say this a few times, and don't doubt it, but had exactly the opposite experience, and it's probably because most of the MATLAB code i was using from my old lab was written poorly.

I am curious if you could point me to an example of some code that is fast in MATLAB and slow in python/numpy? doesn't need to be a perfect 1:1 comparison, i just want to see if i can figure out where the slowness in numpy comes from.

@jonny Code with loops, if directly translated, can be faster in MATLAB because it has a JIT compiler. Depending on where you get your compiled version of numpy from, the BLAS library in MATLAB can be faster, as it's compiled using Intel MKL. But if you're using, say, conda-forge, I think numpy is compiled against the same BLAS library, so it should be identical.
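
A rough sketch of what I mean, with the array size invented for illustration; the loop pays interpreter overhead on every iteration in plain CPython, while the vectorized call dispatches once into compiled code:

```python
import time
import numpy as np

x = np.random.rand(1_000_000)

# loop version, as it might be translated line-by-line from MATLAB
t0 = time.perf_counter()
total = 0.0
for i in range(x.size):
    total += x[i] * x[i]
print(f"loop:       {time.perf_counter() - t0:.4f} s")

# vectorized version: one call into compiled BLAS/ufunc code
t0 = time.perf_counter()
total_vec = np.dot(x, x)
print(f"vectorized: {time.perf_counter() - t0:.4f} s")
```

MATLAB's JIT makes the first version roughly as fast as the second, so a line-by-line port can look unfairly slow in Python.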

The memory one I don't fully understand. Python passes pointers rather than copies. MATLAB used to be terrible for making copies when passing arrays, but these days it has a copy-on-write optimization that usually eliminates the problem. I feel like I can't quite imagine a situation where python would be worse here, unless you were forcing it to make lots of array copies.
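
For instance, a toy example of the pointer-passing behaviour:

```python
import numpy as np

def zero_first(a):
    # 'a' is the caller's array object itself; nothing is copied
    a[0] = 0.0

x = np.ones(5)
zero_first(x)
print(x)  # [0. 1. 1. 1. 1.] -- the caller's array was modified in place
```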

@neuralreckoning Numpy does have a number of gotcha points where it can do surprisingly expensive things when you don't expect it (mostly copying), but i am always able to identify and resolve them with profiling. Especially with tools like Dask I feel like it's as fast as i ever need it to be, even with complex array ops. I think "how hard is it to make the thing go fast" is part of the question "is the thing fast or not," because if it takes me a million hours to optimize something vs. another thing being fast in the immediately intuitive way, that matters. I do believe the empirical experience that @adredish is having, where there might be some domains where some operations can't be expressed efficiently; i am just curious what those are, and e.g. whether there is some non-obvious way to optimize them that is a barrier for MATLAB folks.
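
e.g. a toy sketch (array sizes invented) of the most common copy gotcha i hit:

```python
import numpy as np

x = np.arange(10)

view = x[2:8]              # basic slice: a view, no allocation
fancy = x[[2, 3, 4]]       # fancy indexing: silently allocates a copy
masked = x[x > 5]          # boolean masking: also a copy

print(np.shares_memory(x, view))    # True
print(np.shares_memory(x, fancy))   # False
print(np.shares_memory(x, masked))  # False
```

with big arrays, those hidden copies are exactly the kind of thing a memory profiler turns up.
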
@jonny @adredish I don't know if it's still true, but the FFT algorithm could be crazily slow if you gave it an array whose length was not a power of 2. I have a feeling they fixed that by padding at some point, though.
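
If it still bites, the usual workaround is to pad up to a fast length yourself (numpy shown, length invented):

```python
import numpy as np

x = np.random.rand(1_000_003)            # awkward length

n_fast = 1 << (x.size - 1).bit_length()  # next power of two
X = np.fft.fft(x, n=n_fast)              # numpy zero-pads to n for you
```

(scipy.fft.next_fast_len will find a smaller composite length if you don't want to round all the way up to a power of two.)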

@neuralreckoning @jonny @adredish Recently, I've been having a lot of fun using Jax to write Python code with the Numpy API that is easily JIT-compiled and run on a GPU. Very cool tool, very powerful, and another factor in the performance comparison between the two languages. Not sure how it fares on neural data processing tasks, specifically.
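
A minimal sketch of what that looks like, with the function and shapes invented for illustration:

```python
import jax
import jax.numpy as jnp

@jax.jit  # traced once, compiled by XLA, runs on GPU/TPU if present
def smooth_and_score(x, kernel):
    y = jnp.convolve(x, kernel, mode="same")
    return jnp.sum(y ** 2)

x = jnp.ones(10_000)
kernel = jnp.array([0.25, 0.5, 0.25])
print(smooth_and_score(x, kernel))  # first call compiles; later calls are fast
```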

On the other hand, Jax is a bit janky, complicated, and bolted onto the side of Python, and I spent a whole afternoon in conda just finding the magic combination of library versions to get it working on my machine. So, while I see this as a win for Python, I'm sure others would claim the opposite. I think it mostly depends on your preferences in style and what flavors of BS you're willing to put up with. 🙂

@ngaylinn @neuralreckoning @adredish
yeah that is one of the major differences for sure. the neuroscientists that i have known that use MATLAB don't really tend to think in terms of 'packaging' and having dependencies from a broader, programmatically linked ecosystem of tools, but will have essentially a locally-vendored copy of whatever they need to get the job done, either per-machine or in a very long-running labwide repository. the MATLAB 'packages' that i have used that aren't from mathworks usually have very few versions, so that is mostly fine. so to them the idea of managing dependencies is a total nightmare, and it really used to be before the renaissance of python packaging in the last few years.

I think the view is similar to the way some people view the javascript framework world - that's so complicated! too many new things all the time! i just want one thing that works forever. and that's a totally valid opinion to have, especially if you're not interested in participating in the culture of open source, which is also totally valid.

@jonny @ngaylinn @adredish There is still a problem in python packaging, though. It's not with the packaging system itself, which I agree is so much better now; it's that many package authors make little to no attempt to be backwards compatible, meaning you often end up needing very specific combinations of versions. Sometimes this means two packages can't be used together.
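
A hypothetical pin file (package names and versions invented) showing the failure mode:

```
# requirements.txt
numpy==1.26.4        # tool A breaks on numpy 2.x
scikit-learn==1.3.2  # 1.4 changed an API that tool B calls
toolA==0.9.1         # itself requires numpy<2
toolB==2.0.0         # itself requires scikit-learn<1.4
# bump any one pin and the combination may no longer resolve
```
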
@jonny @adredish Actually, I could also imagine code where you write something like (X*2)*3, which would make multiple temporary copies of an array X, but in a JIT-compiled loop would make none.
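
Roughly this, with an explicit in-place rewrite standing in for what a JIT would fuse automatically:

```python
import numpy as np

X = np.random.rand(1_000_000)

Y = (X * 2) * 3  # builds a temporary array for X*2 before the final multiply

# in-place variant: allocate one output buffer, then reuse it
out = np.multiply(X, 2.0)
np.multiply(out, 3.0, out=out)
```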

@adredish @eejd @ninsel

This is indeed the reason #julialang exists. Give it a spin! Easy to switch from Matlab or python, but sane, explicit, fast, fun...

I do 90% of my lab's work (EEG) there nowadays!

The codebase is not that big an issue, given you can easily call python via PythonCall.jl for things not yet implemented.

You get an awesome package manager with versioning too! Something Matlab is just terrible at, IMHO.

@BenediktEhinger

Yes, I am intrigued by #Julia as a response to the #Matlab crisis.

Can you give a good place to start? (Basic tutorial, language definition, best IDE / package management system?)

My specific targets are
(1) large vectorized computations, preferably (1a) automatically GPU'd when possible and
(2) hardware control of (2a) arduino devices (access as sending characters over a serial port is fine) and (2b) online (at speed) video processing and storage, including camera hardware signaling.

As an alternative for (2), I have considered #BonsaiRx. Really good would be having a way of linking Julia with BonsaiRx (which does video well, but not the other task-control stuff).

@adredish

I really like this to get started: https://modernjuliaworkflows.github.io/

most people use VSCode

For GPU, I'll write you next week or so; I'm preparing a 1h workshop for my group and will see what material I like. Until then: https://juliagpu.org/

For Arduino I don't know much. There doesn't seem to be a very active package; Arduino.jl, maybe? But it seems inactive.

There was a recent paper on embedded computing in Julia; maybe that's helpful?
