"you can't debug print shaders" oh yeah? well check this shit out

that's right, a whole ass view matrix, served directly from the GPU

I'm using the R1 single-channel 1bpp 2D texture format for this one
@acegikmo Just to make sure I understand the code right: that's encoding the numbers such that each pixel's row corresponds to one of the dBits, and the col is the index of the bit in the number in dBit? (AKA the row is binary encoded?)
@deef yep!
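For anyone following along, here's a rough sketch of that packing scheme in Python (the names `glyph_bit` and `render` are mine, not from the actual shader): a 3×5 digit glyph becomes one 15-bit integer, top-left pixel in the most significant bit, and each fragment just tests a single bit.

```python
# Each digit glyph is 3 pixels wide and 5 tall: 15 bits total,
# packed into one integer with the top-left pixel in the MSB.
GLYPH_W, GLYPH_H = 3, 5

def glyph_bit(glyph: int, row: int, col: int) -> int:
    """Return the pixel (0 or 1) at (row, col), with row 0 = top."""
    bit_index = (GLYPH_H * GLYPH_W - 1) - (row * GLYPH_W + col)
    return (glyph >> bit_index) & 1

def render(glyph: int) -> str:
    """ASCII-art preview of a packed glyph, for sanity-checking."""
    return "\n".join(
        "".join("#" if glyph_bit(glyph, r, c) else "." for c in range(GLYPH_W))
        for r in range(GLYPH_H)
    )

# The literal quoted later in the thread:
print(render(0b111010111111101))
```

Presumably the shader's DrawDigit does the same bit test per fragment, using the UV within the digit cell to pick row and col.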
@acegikmo Is there any particular reason you encoded using decimal instead of 0b111010111111101 etc.? (I *think* C# supports binary, hex, and oct literals, anyway)
@deef because it looks funnier that way! but also idk if HLSL supports binary literals? It started as a binary literal in C# though, before I moved it to HLSL

@acegikmo Ah, I thought it was C#, not HLSL. My mistake. Dunno how it being for a shader didn't tip me off to that... but I also shouldn't have read float DrawDigit as func DrawDigit lol.

In any event, it appears not: binary literals aren't supported https://learn.microsoft.com/en-us/windows/win32/direct3dhlsl/dx-graphics-hlsl-appendix-grammar#integer-numbers Though hex and oct are, so it's odd that binary is missing.
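One workaround, if HLSL really won't take binary literals: write the glyph in binary in some other language and paste the decimal (or hex, which HLSL's grammar does accept) form into the shader. A quick Python sketch, using the literal quoted above:

```python
# The 15-bit glyph from the thread, written as a binary literal...
glyph = 0b111010111111101

# ...and the forms HLSL will actually accept:
print(glyph)       # decimal → 30205
print(hex(glyph))  # hex → 0x75fd
```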

@deef @acegikmo I think this may be the first time I've seen a somewhat legit use case for octal other than Unix permissions. (although it would still be less readable than binary)

Interestingly, binary is a late addition in many languages: C# didn't get binary literals until C# 7.0, and C++ not until C++14.