Today I'm thinking about the "don't use floating point for money" advice I hear all the time. It obviously has a lot of truth to it.

But -- Excel and Google Sheets use floating point for all of their calculations, people use spreadsheets for money calculations all the time, and it generally seems to work just fine -- the results get rounded for display.

So I'm trying to figure out if there's a more nuanced guideline than "never use floating point for money".
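A minimal sketch of why display rounding usually saves you, using plain Python floats as a stand-in for spreadsheet cells (both are IEEE 754 double precision; the dollar amounts are made up for illustration):

```python
# Summing $0.10 a hundred times accumulates a tiny binary-representation
# error, because 0.10 has no exact base-2 representation.
total = 0.0
for _ in range(100):
    total += 0.10

# The raw total is off from 10.0 only in the last few bits...
print(total)
# ...so rounding to cents for display recovers the exact answer.
print(round(total, 2))
```

The per-operation error is around 1 part in 10^16, so it takes an enormous number of operations (or a pathological calculation) before the error grows anywhere near half a cent.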

@b0rk My first reaction to this is kinda condescending, which is just "Don't use Excel or Google Sheets for money if you actually need accuracy," and I'm debating how much that's genuine advice I'd stand behind.

Possibly losing pennies or dollars or thousands of dollars to rounding errors is gonna be small potatoes compared to other errors caused by the difficult-to-debug nature of spreadsheets (see for example https://www.science.org/content/article/one-five-genetics-papers-contains-errors-thanks-microsoft-excel).


@simrob I'm struggling to see a real-world use case where you would lose even 1 cent to a floating point error in a spreadsheet

like I'm sure it exists! But I personally do not know how to make a spreadsheet that does a remotely realistic calculation that has such an error
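One way a single cent *can* slip through, sketched in Python rather than a spreadsheet (the underlying doubles are the same, though spreadsheet ROUND functions may behave differently): rounding a value that sits exactly on a half-cent boundary, like 2.675, which has no exact binary representation.

```python
from decimal import Decimal, ROUND_HALF_UP

# 2.675 is actually stored as 2.67499999999999982..., slightly BELOW
# the halfway point, so rounding to cents goes down instead of up.
as_float = round(2.675, 2)
print(as_float)    # 2.67

# Doing the same arithmetic in exact decimal, with conventional
# round-half-up, gives the answer a human expects -- one cent higher.
as_decimal = Decimal("2.675").quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
print(as_decimal)  # 2.68
```

So the discrepancies exist, but they need a half-cent boundary value to show up at all, which supports the "it exists but I can't build a realistic spreadsheet that hits it" intuition.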

@b0rk @simrob when I was working in finance, the biggest problem with rounding and spreadsheets was that numbers were often copied around between sheets, to and from proposals, etc., and what got copied was the displayed rounded value, not the underlying floating-point number. So not floating point's fault, but more a software-UX and process/provenance issue.
@b0rk @simrob I basically agree that floating point has been made too big of a scapegoat, and that lack of version control, easy-to-ruin data provenance, and sloppy automatic type coercions are Excel's bigger flaws.