
Update on my joint work with @JulianTachella on "#Learning to Reconstruct Signals From Binary Measurements" on arXiv. #RandomProjection #onebit #SelfSupervisedLearning https://arxiv.org/abs/2303.08691

We have improved the proofs and the bounds, which now let us determine from how many binarized (random) projections alone one can learn, up to a controlled identification error, a low-complexity space (one with small box dimension). Moreover, a practical #selfsupervised scheme, SSBM, run on real image datasets, learns a reconstruction algorithm from those same binary observations (without access to the original images, and on par with supervised alternatives), implicitly confirming that they encode a good estimate of the image set.

Learning to Reconstruct Signals From Binary Measurements

Recent advances in unsupervised learning have highlighted the possibility of learning to reconstruct signals from noisy and incomplete linear measurements alone. These methods play a key role in medical and scientific imaging and sensing, where ground truth data is often scarce or difficult to obtain. However, in practice, measurements are not only noisy and incomplete but also quantized. Here we explore the extreme case of learning from binary observations and provide necessary and sufficient conditions on the number of measurements required for identifying a set of signals from incomplete binary data. Our results are complementary to existing bounds on signal recovery from binary measurements. Furthermore, we introduce a novel self-supervised learning approach, which we name SSBM, that only requires binary data for training. We demonstrate in a series of experiments with real datasets that SSBM performs on par with supervised learning and outperforms sparse reconstruction methods with a fixed wavelet basis by a large margin.
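As a minimal sketch of the measurement model the abstract describes, the NumPy snippet below generates binary observations y = sign(Ax) from random projections of a signal. The dimensions and the Gaussian sensing matrix are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 64, 256  # signal dimension, number of binary measurements (illustrative)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian sensing matrix

x = rng.standard_normal(n)  # a placeholder signal
y = np.sign(A @ x)          # 1-bit quantized measurements: only the signs are kept
```

Note that all amplitude information is discarded: any positive rescaling of x yields exactly the same binary measurements, which is why identification is only possible up to scale.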


Too Cool for 8-bit Retro? Try 1-bit Gaming (https://hackaday.com/2023/02/14/too-cool-for-8-bit-retro-try-1-bit-gaming/) image https://hackaday.com/wp-content/uploads/2023/02/moto1.png #retrocomputing #mc14500 #onebit #hackaday posted by pod_feeder_v2 (https://gitlab.com/brianodonnell/pod_feeder_v2/)

@ccanonne I gave a talk in 2014 on how to "quantize" the JL lemma; the slides are available here: https://laurentjacques.gitlab.io/event/when-buffon-s-needle-problem-helps-in-quantizing-the-johnson-lindenstrauss-lemma/ From slide 22 onward, I recall visually the main principles of the one-bit (sign) case. There is also a very good survey of the topic by Sjoerd Dirksen: https://www.semanticscholar.org/paper/Quantized-Compressed-Sensing%3A-A-Survey-Dirksen/dda6061642b78c8f21bca1185e035f410a48db79 #OneBit #CompressiveSensing #RandomProjection #JohnsonLindenstrauss
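The one-bit (sign) principle behind these embeddings can be checked numerically: for Gaussian projections, the expected fraction of sign disagreements between the projections of two vectors equals their angle divided by π. A small Monte Carlo sketch, with dimensions chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

n, m = 32, 20_000  # ambient dimension, number of one-bit projections
A = rng.standard_normal((m, n))

u = rng.standard_normal(n)
v = rng.standard_normal(n)

# True angle between u and v (in radians).
cos_uv = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_uv, -1.0, 1.0))

# Fraction of sign disagreements across the m one-bit projections.
hamming = np.mean(np.sign(A @ u) != np.sign(A @ v))

# E[hamming] = theta / pi, so pi * hamming estimates the angle.
theta_hat = np.pi * hamming
```

With m large, theta_hat concentrates around theta, which is the geometric fact that quantized JL embeddings exploit.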
When Buffon's needle problem helps in quantizing the Johnson-Lindenstrauss Lemma | Laurent Jacques

Abstract: In 1733, Georges-Louis Leclerc, Comte de Buffon in France, laid the groundwork for geometric probability theory by posing an enlightening problem: What is the probability that a needle thrown randomly on a ground made of equispaced parallel strips lies on two of them?

Laurent Jacques