#AIGlossary
Bayesian optimization
#GoogleCloud

A technique for optimizing computationally expensive objective functions by instead optimizing a surrogate: a probabilistic regression model that quantifies uncertainty using Bayesian learning. Since Bayesian optimization is itself expensive, it is usually applied to expensive-to-evaluate tasks with a small number of parameters, such as selecting hyperparameters.
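
A minimal sketch of the loop, with stated assumptions: a real implementation would use a Gaussian-process surrogate; here a toy distance-weighted predictor stands in for it, the objective is a hypothetical one-parameter function, and the acquisition rule is an upper confidence bound. All function names and constants are illustrative.

```python
import math

def objective(x):
    # Expensive-to-evaluate black-box function (toy stand-in);
    # its true maximum is at x = 2.0.
    return -(x - 2.0) ** 2

def surrogate(x, observed):
    # Toy stand-in for a probabilistic surrogate (a Gaussian process
    # is typical): distance-weighted mean prediction, with uncertainty
    # that grows with distance from the nearest observed point.
    weights = [(math.exp(-(x - xi) ** 2), yi) for xi, yi in observed]
    mean = sum(w * y for w, y in weights) / sum(w for w, _ in weights)
    std = min(abs(x - xi) for xi, _ in observed)  # crude uncertainty proxy
    return mean, std

def propose(observed, candidates, kappa=2.0):
    # Upper-confidence-bound acquisition: balance exploiting the
    # surrogate's mean against exploring where it is uncertain.
    def ucb(x):
        mean, std = surrogate(x, observed)
        return mean + kappa * std
    return max(candidates, key=ucb)

candidates = [i * 0.1 for i in range(41)]           # grid over [0, 4]
observed = [(x, objective(x)) for x in (0.0, 4.0)]  # two seed evaluations
for _ in range(8):
    x = propose(observed, candidates)
    observed.append((x, objective(x)))              # one expensive call
best_x, best_y = max(observed, key=lambda p: p[1])
```

Only a handful of `objective` calls are spent; the cheap surrogate decides where to spend them, which is the point of the technique.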

https://developers.google.com/machine-learning/glossary#bayesian-optimization

Machine Learning Glossary  |  Google for Developers

Google for Developers

#AIGlossary
Bayesian neural network
#GoogleCloud

A probabilistic neural network that accounts for uncertainty in weights and outputs. A standard neural network regression model typically predicts a scalar value; for example, a standard model predicts a house price of 853,000. In contrast, a Bayesian neural network predicts a distribution of values; for example, a Bayesian model predicts a house price of 853,000 with a standard deviation of 67,200.
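
The distribution-valued prediction can be sketched by Monte Carlo sampling. This is a toy: the "network" is a single linear unit whose weight distributions are hard-coded (in practice they would be learned, e.g. by variational inference), and the price-per-square-foot numbers are invented for illustration.

```python
import random
import statistics

random.seed(0)

def sample_prediction(sqft):
    # Instead of fixed weights, sample from a distribution over
    # weights each time; each draw yields one plausible prediction.
    w = random.gauss(400.0, 30.0)         # price per sq ft, uncertain
    b = random.gauss(50_000.0, 10_000.0)  # base price, uncertain
    return w * sqft + b

# Many weight samples -> a distribution of predicted prices.
samples = [sample_prediction(2000) for _ in range(5_000)]
mean_price = statistics.mean(samples)   # roughly 850,000
std_price = statistics.stdev(samples)   # the model's uncertainty
```

Reporting `mean_price` plus `std_price`, rather than a single scalar, is what distinguishes the Bayesian model's output in the house-price example above.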

https://developers.google.com/machine-learning/glossary#bayesian-neural-network


#AIGlossary
batch size
#GoogleCloud

The number of examples in a batch. For instance, if the batch size is 100, then the model processes 100 examples per iteration.

The following are popular batch size strategies:

Stochastic
Full batch
Mini-batch (usually the most efficient strategy)
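
The three strategies above differ only in how many examples go into each batch. A small sketch, assuming a list of examples and a hypothetical `batches` helper:

```python
def batches(examples, batch_size):
    # Yield successive batches of at most `batch_size` examples.
    for start in range(0, len(examples), batch_size):
        yield examples[start:start + batch_size]

examples = list(range(10))
stochastic = list(batches(examples, 1))    # batch size 1: 10 batches
full_batch = list(batches(examples, 10))   # whole dataset: 1 batch
mini_batch = list(batches(examples, 4))    # batches of 4, 4, and 2
```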

https://developers.google.com/machine-learning/glossary#batch-size

#AIGlossary
batch normalization
#GoogleCloud

Normalizing the input or output of the activation functions in a hidden layer. Batch normalization can provide the following benefits:

Make neural networks more stable by protecting against outlier weights.
Enable higher learning rates, which can speed training.
Reduce overfitting.
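
The core computation can be sketched as follows. This is a simplified inference-free version for a batch of scalar activations; real implementations also learn gamma and beta and track running statistics for serving.

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize a batch of activations to roughly zero mean and unit
    # variance, then apply a learnable scale (gamma) and shift (beta).
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta
            for x in batch]

normalized = batch_norm([1.0, 2.0, 3.0, 4.0])
```

The `eps` term guards against division by zero when a batch has near-zero variance.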

https://developers.google.com/machine-learning/glossary#batch-normalization

#AIGlossary
batch inference
#GoogleCloud

The process of inferring predictions on multiple unlabeled examples divided into smaller subsets ("batches").

Batch inference can take advantage of the parallelization features of accelerator chips. That is, multiple accelerators can simultaneously infer predictions on different batches of unlabeled examples, dramatically increasing the number of inferences per second.
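
A rough sketch of the idea, with threads standing in for accelerator chips and a trivial `predict` standing in for the model; both names are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def predict(batch):
    # Stand-in for an expensive model call; returns one prediction
    # per unlabeled example in the batch.
    return [x * 2 for x in batch]

def batch_inference(examples, batch_size=3, workers=4):
    groups = [examples[i:i + batch_size]
              for i in range(0, len(examples), batch_size)]
    # Each worker handles a different batch, mirroring how multiple
    # accelerators can infer on different batches simultaneously.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(predict, groups)
    return [pred for batch in results for pred in batch]

preds = batch_inference(list(range(10)))
```

`pool.map` preserves batch order, so predictions come back aligned with the input examples.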

https://developers.google.com/machine-learning/glossary#batch-inference

#AIGlossary
batch
#GoogleCloud

The set of examples used in one training iteration. The batch size determines the number of examples in a batch.

https://developers.google.com/machine-learning/glossary#batch

#AIGlossary
base model
#GoogleCloud

#generativeAI

A pre-trained model that can serve as the starting point for fine-tuning to address specific tasks or applications.

#AIGlossary
baseline
#GoogleCloud

A model used as a reference point for comparing how well another model (typically, a more complex one) is performing. For example, a logistic regression model might serve as a good baseline for a deep model.

For a particular problem, the baseline helps model developers quantify the minimal expected performance that a new model must achieve for the new model to be useful.

https://developers.google.com/machine-learning/glossary#baseline

#AIGlossary
bag of words
#GoogleCloud

A representation of the words in a phrase or passage, irrespective of order. For example, bag of words represents the following three phrases identically:

the dog jumps
jumps the dog
dog jumps the

Each word is mapped to an index in a sparse vector, where the vector has an index for every word in the vocabulary.
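
A small sketch of that mapping, using a dict from vocabulary index to count as the sparse vector (the three-word vocabulary is just the example's):

```python
from collections import Counter

def bag_of_words(phrase, vocabulary):
    # Map each word to its vocabulary index; the sparse "vector" is a
    # dict from index to count, omitting indexes with zero counts.
    counts = Counter(phrase.split())
    return {vocabulary[word]: n for word, n in counts.items()}

vocabulary = {"the": 0, "dog": 1, "jumps": 2}
a = bag_of_words("the dog jumps", vocabulary)
b = bag_of_words("jumps the dog", vocabulary)
c = bag_of_words("dog jumps the", vocabulary)
# All three phrases yield the identical sparse vector,
# since word order is discarded.
```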

https://developers.google.com/machine-learning/glossary#bag-of-words

#AIGlossary
bagging
#GoogleCloud

A method to train an ensemble where each constituent model trains on a random subset of training examples sampled with replacement. For example, a random forest is a collection of decision trees trained with bagging.

The term bagging is short for bootstrap aggregating.

See Random forests in the Decision Forests course for more information.
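
The bootstrap-then-aggregate recipe can be sketched in a few lines. Here each "model" is just the mean of its bootstrap sample, a deliberately trivial stand-in for the decision trees a real random forest would train:

```python
import random
import statistics

random.seed(0)

def bootstrap_sample(examples):
    # Draw len(examples) items *with replacement*, so some examples
    # repeat and others are left out of any given sample.
    return [random.choice(examples) for _ in examples]

def train_ensemble(examples, n_models=25):
    # Each constituent "model" trains on its own bootstrap sample;
    # the mean stands in for a fitted decision tree.
    return [statistics.mean(bootstrap_sample(examples))
            for _ in range(n_models)]

def predict(ensemble):
    # Aggregate by averaging the constituent models' predictions.
    return statistics.mean(ensemble)

data = [1.0, 2.0, 3.0, 4.0, 5.0]
prediction = predict(train_ensemble(data))
```

Averaging over many resampled models is what gives bagging its variance-reduction effect.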

https://developers.google.com/machine-learning/glossary#bagging
