
Batch Normalization is a deep learning technique that normalizes the inputs of each layer within a neural network. It standardizes each layer's activations using the mini-batch mean and variance, then applies a learnable scale and shift, keeping the mean output close to 0 and the output standard deviation close to 1.
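
A minimal NumPy sketch of this forward computation (the function name batch_norm and the gamma, beta, and eps parameters are illustrative, not taken from this entry):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a batch of activations feature-wise, then scale and shift.

    x: array of shape (batch_size, num_features)
    gamma, beta: learnable scale and shift, each of shape (num_features,)
    """
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # standardize: mean ~0, std ~1
    return gamma * x_hat + beta              # learnable rescaling of the normalized values

# Example: a mini-batch of 4 samples with 3 features
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.5, 1.0, 1.5],
              [3.0, 6.0, 9.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # approximately 0 per feature
print(out.std(axis=0))   # approximately 1 per feature
```

With gamma set to 1 and beta set to 0, the output is simply the standardized activations; during training these parameters are learned so the network can recover any scale it needs.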

Key Benefits:

- Speeds up and stabilizes training by reducing shifts in layer input distributions.
- Allows higher learning rates and reduces sensitivity to weight initialization.
- Adds a mild regularization effect because each sample is normalized with the statistics of its mini-batch.
