Alexander Granderath ✅

31 Followers
123 Following
136 Posts
Supervisory Board Member, Tech Evangelist, Business Angel, Consulting, Investing, Learning together, always good for a smile.
LinkedIn: https://www.linkedin.com/in/alexander-granderath-35a5a4/
@codeinabox is it? They urgently needed the capital injection to avoid going bust. This is really a hot asset.
@garymarcus I am thankful for each and every important voice helping to raise awareness! Thank You!

The world needs an international agency for artificial intelligence.

Now. (Or better yet: six months ago)
By Invitation essay in Economist

https://t.co/l0bBKuyKMT

The world needs an international agency for artificial intelligence, say two AI experts

Gary Marcus and Anka Reuel argue that global governance must be prioritised to address the risks of bias, misinformation or worse

The Economist
@popular_ML will this ever come to a stop so that we can take a breath?
@golem Is this happening voluntarily? The code had supposedly already been leaked back in January?! Or is this all April 1st code?
@Wikisteff @twitskeptic @Riedl What do you suggest? Keep 'em coming?
@Riedl I share that feeling, but is it because you also believe in that scenario and do not want to ruin your last days, or because you think it is all completely out of whack?
@DAIR @emilymbender @mmitchell_ai
Thank you so much for this contribution. I am confused by the mentioned term longtermism. The impact of GPT-5 will be felt within 1 to 5 years. It will create massive job losses and social unrest, maybe even wars. Will it lead to AGI or the singularity? I do not know. Does that really matter? Should we not try to pull in the same direction to control developments, rather than getting into academic discussions about whether the short-term or long-term impact is more devastating?
"Consistency Models. (arXiv:2303.01469v1 [cs.LG])" — A new family of generative models that achieve high sample quality without adversarial training and support fast one-step generation by design.

Paper: http://arxiv.org/abs/2303.01469

#AI #CV #NewPaper #DeepLearning #MachineLearning

<<Find this useful? Please boost so that others can benefit too 🙂>>
Samples generated by EDM (top),…
Consistency Models

Diffusion models have significantly advanced the fields of image, audio, and video generation, but they depend on an iterative sampling process that causes slow generation. To overcome this limitation, we propose consistency models, a new family of models that generate high quality samples by directly mapping noise to data. They support fast one-step generation by design, while still allowing multistep sampling to trade compute for sample quality. They also support zero-shot data editing, such as image inpainting, colorization, and super-resolution, without requiring explicit training on these tasks. Consistency models can be trained either by distilling pre-trained diffusion models, or as standalone generative models altogether. Through extensive experiments, we demonstrate that they outperform existing distillation techniques for diffusion models in one- and few-step sampling, achieving the new state-of-the-art FID of 3.55 on CIFAR-10 and 6.20 on ImageNet 64x64 for one-step generation. When trained in isolation, consistency models become a new family of generative models that can outperform existing one-step, non-adversarial generative models on standard benchmarks such as CIFAR-10, ImageNet 64x64 and LSUN 256x256.

arXiv.org
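To make the sampling idea in the abstract concrete: a consistency model is a single function that maps a noisy input at a given noise level directly to an estimate of clean data, so one forward pass yields a sample, while multistep sampling re-injects noise at decreasing levels and denoises again to trade compute for quality. The sketch below is illustrative only: `consistency_fn` is a hypothetical stand-in for the trained network (here a dummy shrinkage map), and the noise-level constants follow an EDM-style convention assumed for the example, not taken from the paper's exact settings.

```python
import numpy as np

SIGMA_MAX = 80.0   # highest noise level (assumed EDM-style convention)
SIGMA_MIN = 0.002  # lowest noise level

def consistency_fn(x, sigma):
    # Hypothetical stand-in for the learned model f(x, sigma), which maps
    # a noisy input directly to an estimate of clean data. A real model
    # would be a trained neural network; this dummy just shrinks the input.
    return x / (1.0 + sigma)

def one_step_sample(shape, rng):
    # One-step generation by design: draw pure noise at the highest
    # noise level and map it to data in a single model call.
    z = rng.standard_normal(shape) * SIGMA_MAX
    return consistency_fn(z, SIGMA_MAX)

def multistep_sample(shape, rng, sigmas=(80.0, 10.0, 1.0)):
    # Multistep sampling trades extra compute for sample quality:
    # denoise, then re-inject noise at a lower level and denoise again.
    x = consistency_fn(rng.standard_normal(shape) * sigmas[0], sigmas[0])
    for sigma in sigmas[1:]:
        noise_scale = np.sqrt(max(sigma**2 - SIGMA_MIN**2, 0.0))
        x = consistency_fn(x + rng.standard_normal(shape) * noise_scale, sigma)
    return x

rng = np.random.default_rng(0)
one_step = one_step_sample((4,), rng)
few_step = multistep_sample((4,), rng)
print(one_step.shape, few_step.shape)
```

The key design point the abstract highlights is that both samplers call the same function; only the number of denoise/re-noise rounds differs.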
She was a good Bing #FreeSydney

Posted in r/bing by u/BlueMoon_Josh • 978 points and 180 comments

reddit