Apparently the functions computed by ReLU neural nets are precisely(!??) tropical rational maps:
https://arxiv.org/abs/1805.07091

(Also, I just noticed I messed up the hashtag on the post I'm replying to. #NeuralNetworks #MachineLearning #TropicalGeometry )

Tropical Geometry of Deep Neural Networks

We establish, for the first time, connections between feedforward neural networks with ReLU activation and tropical geometry --- we show that the family of such neural networks is equivalent to the family of tropical rational maps. Among other things, we deduce that feedforward ReLU neural networks with one hidden layer can be characterized by zonotopes, which serve as building blocks for deeper networks; we relate decision boundaries of such neural networks to tropical hypersurfaces, a major object of study in tropical geometry; and we prove that linear regions of such neural networks correspond to vertices of polytopes associated with tropical rational functions. An insight from our tropical formulation is that a deeper network is exponentially more expressive than a shallow network.
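The equivalence can be seen concretely in one dimension. Here is a minimal sketch (my own illustration, not code from the paper): in the tropical (max-plus) semiring, "addition" is max and "multiplication" is +, so a tropical polynomial is a max of affine functions and a tropical rational function is a difference of two such maxima. A tiny ReLU net with ±1 output weights then literally *is* a tropical rational function:

```python
# Sketch (assumed example, not from the paper): a one-hidden-layer
# ReLU net rewritten as a tropical rational function, i.e. a
# difference of two tropical polynomials (maxima of affine maps).

def relu(z: float) -> float:
    return max(z, 0.0)

# Tiny ReLU network: h(x) = relu(x) - relu(x - 1)
def net(x: float) -> float:
    return relu(x) - relu(x - 1.0)

# Tropical polynomials:
#   p(x) = max(x, 0)      i.e.  x (+) 0          in max-plus notation
#   q(x) = max(x - 1, 0)  i.e.  (x (*) -1) (+) 0
# Their "tropical quotient" is the ordinary difference p(x) - q(x).
def trop_rational(x: float) -> float:
    p = max(x, 0.0)
    q = max(x - 1.0, 0.0)
    return p - q

# The two functions agree everywhere (both are the same
# continuous piecewise-linear "ramp" clipped between 0 and 1).
for x in [-2.0, -0.5, 0.0, 0.3, 1.0, 2.5]:
    assert abs(net(x) - trop_rational(x)) < 1e-12
```

The deeper claims in the paper (zonotopes for one hidden layer, linear regions as polytope vertices) build on exactly this kind of rewriting, applied layer by layer.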

> Geometric objects take on different properties depending on the space in which you [embed] them.

And then: fun with #tropicalGeometry!

https://www.quantamagazine.org/tinkertoy-models-produce-new-geometric-insights-20180905/

https://arxiv.org/abs/1701.06579
https://arxiv.org/abs/1808.01285

#mathNews h/t +Taufik Yusof

Tinkertoy Models Produce New Geometric Insights | Quanta Magazine

An upstart field that simplifies complex shapes is letting mathematicians understand how those shapes depend on the space in which you visualize them.