1 bit is all we need: binary normalized neural networks https://arxiv.org/abs/2509.07025v1 #compsci #machinelearning

The increasing size of large neural network models, particularly language models and foundation image models, poses deployment challenges, prompting efforts to reduce memory requirements and improve computational efficiency. These efforts are critical for the practical deployment and effective use of these models across applications. In this work, a novel type of neural network layer and model is developed that uses only single-bit parameters. In these models, all parameters of all layers, including kernel weights and biases, take only the values zero or one. The models are built from layers named binary normalized layers. Binary normalized layers can be of any type, such as fully connected, convolutional, or attention layers, and consist of slight variations of the corresponding conventional layers. To show their effectiveness, two models are configured: a multiclass image classifier with convolutional and fully connected layers, and a language decoder composed of transformer blocks with multi-head attention that predicts the next token of a sequence. The results show that models with binary normalized layers achieve almost the same performance as equivalent models with real 32-bit parameters, while using 32 times less memory. Moreover, binary normalized layers can be easily implemented on current computers using 1-bit arrays and do not require dedicated electronic hardware. This type of layer opens a new era for large neural network models with reduced memory requirements that can be deployed on simple, cheap hardware, such as mobile devices or CPUs only.
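The abstract doesn't spell out the layer definition, but the idea of {0, 1} weights plus a normalization step can be sketched in a few lines of NumPy. Everything here is an assumption for illustration: the function names, the mean-threshold binarization, and the per-example normalization are my guesses at what "binary normalized" might mean, not the paper's actual method.

```python
import numpy as np

def binarize(w):
    # Threshold latent real-valued parameters at their mean -> values in {0, 1}.
    return (w > w.mean()).astype(np.float32)

def binary_normalized_dense(x, w_latent, b_latent):
    # Forward pass with binary weights and biases, followed by a
    # per-example normalization so the pre-activations keep a sane scale
    # (this normalization step is my assumption, suggested by the name).
    wb = binarize(w_latent)
    bb = binarize(b_latent)
    z = x @ wb + bb
    mu = z.mean(axis=-1, keepdims=True)
    sigma = z.std(axis=-1, keepdims=True)
    return (z - mu) / (sigma + 1e-6)  # epsilon avoids division by zero

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 16)).astype(np.float32)
w = rng.normal(size=(16, 8)).astype(np.float32)
b = rng.normal(size=(8,)).astype(np.float32)
out = binary_normalized_dense(x, w, b)
print(out.shape)  # (4, 8)

# The 32x memory claim: {0, 1} weights pack to 1 bit per parameter,
# e.g. with np.packbits, versus 32 bits for a float32 weight.
packed = np.packbits(binarize(w).astype(np.uint8))
print(packed.nbytes)  # 16 bytes for 128 binary weights
```

The packing at the end is where the claimed deployability on cheap hardware comes from: inference only needs bit arrays and additions, no dedicated binary-arithmetic silicon.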


#Scientific papers more and more often include #replication packages with valuable #datasets that are then frequently used to train #AI and #MachineLearning models.

But when datasets are not properly #documented with information about their #provenance, #biases and other #social concerns, this is risky: the #ML models will be used in environments for which the data is not representative, yielding potentially wrong conclusions.

In this work, we have analyzed the datasets in two top dataset journals to study their #documentation #practices and propose a few recommendations to improve the current situation.

Paper accepted in Nature's 𝘚𝘤𝘪𝘦𝘯𝘵𝘪𝘧𝘪𝘤 𝘋𝘢𝘵𝘢 journal

Pre-print https://arxiv.org/pdf/2401.10304

I’ll be at #NeurIPS the coming days. Ping me if you wanna hang out, grab a drink, or talk #probabilistic #machinelearning.

Spherical clustering got me acting strange...

#kmeans #machinelearning #sphere

Just another paper on this ML procedure and then maybe I can switch to learn how to do all that in Julia...🙏🏾
#julialang #python #machinelearning
El lado del mal - The Importance of Python: Essential for Artificial Intelligence and Big Data (as well as for pentesting and hacking) https://www.elladodelmal.com/2023/10/la-importancia-de-python-imprescindible.html #python #IA #AI #MachineLearning #DataScientists #BigData #InteligenciaArtificial #Cursos #LLM #Formación

Personal blog of Chema Alonso (CDO Telefónica, 0xWord, MyPublicInbox, Singularity Hackers) on security, hacking, hackers and Cálico Electrónico.

📢 Hey everyone! I'm thrilled to share my latest project with you: "Beyond OpenAI: Harnessing Open Source Models to Create Your Personalized AI Companion." 🤖🌐 Join me on this exciting journey as I explore how I've built an AI assistant that interacts with my knowledge base, offering personalized insights and engaging conversations. Let's dive into the world of AI innovation together! 🚀📚

DEV Community https://dev.to/akshayballal/beyond-openai-harnessing-open-source-models-to-create-your-personalized-ai-companion-1npb

#devcommunity #machinelearning #artificialintelligence

Feels like combining #llm models with what the Semantic Web has been doing for a long time with triple stores and ontology modelling would be a good way to enhance #ml #machinelearning #artificialintelligence #ai algorithms. Has that been done already?
I got a new #machinelearning project! Yeah!
But I don't know which models are right for the job.
I'm doomed