Fully Connected (Dense) Layer

What is a Fully Connected (Dense) Layer?

A Fully Connected Layer, also known as a Dense Layer, is a fundamental component of neural networks in which every neuron in the layer is connected to every neuron in the preceding layer. Each output is a weighted sum of all inputs plus a bias, usually passed through a nonlinear activation. This dense connectivity allows the layer to learn global patterns and combine features from all its inputs. While powerful, the structure is computationally intensive and carries a large number of parameters, making it prone to overfitting, especially when training data is limited relative to the parameter count.
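The computation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not any particular library's implementation; the names `dense_forward`, `W`, and `b` are chosen for clarity.

```python
import numpy as np

def dense_forward(x, W, b, activation=np.tanh):
    """Compute activation(x @ W + b): every output neuron sees every input."""
    return activation(x @ W + b)

rng = np.random.default_rng(0)
batch, n_in, n_out = 4, 8, 3
x = rng.standard_normal((batch, n_in))
W = rng.standard_normal((n_in, n_out)) * 0.1  # one weight per input-output pair
b = np.zeros(n_out)

y = dense_forward(x, W, b)
print(y.shape)           # (4, 3)
print(W.size + b.size)   # 27 parameters: 8*3 weights + 3 biases
```

Note how the parameter count scales as `n_in * n_out + n_out`; this multiplicative growth is why dense layers become expensive when both dimensions are large.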

Where did the term "Fully Connected (Dense) Layer" come from?

Fully Connected Layers are a foundational concept from the earliest days of neural networks, forming the basis of the Multilayer Perceptron (MLP). In these early architectures, entire networks were constructed by stacking multiple fully connected layers.

How is "Fully Connected (Dense) Layer" used today?

While modern architectures for tasks like computer vision (e.g., CNNs) and sequence processing (e.g., Transformers) no longer rely exclusively on fully connected layers, these layers remain a critical component. They are most commonly used in the final stages of a network, often called the 'head' or 'classifier,' to take the high-level features extracted by earlier layers and map them to the final output predictions, such as class probabilities.
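A classifier head of this kind can be sketched as a single dense layer followed by a softmax. This is an illustrative example in NumPy under the assumption of a 128-dimensional feature vector and 10 classes; the feature extractor itself is stubbed out with random values, and the names `softmax` and `classifier_head` are hypothetical.

```python
import numpy as np

def softmax(z):
    """Convert logits to probabilities; subtract the max for numerical stability."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classifier_head(features, W, b):
    """Dense 'head': map high-level features to class probabilities."""
    logits = features @ W + b
    return softmax(logits)

rng = np.random.default_rng(1)
features = rng.standard_normal((2, 128))   # stand-in for pooled CNN features
W = rng.standard_normal((128, 10)) * 0.01  # 10 output classes
b = np.zeros(10)

probs = classifier_head(features, W, b)
print(probs.shape)                          # (2, 10)
print(np.allclose(probs.sum(axis=1), 1.0))  # each row sums to 1
```

In practice, frameworks such as PyTorch or Keras express this same head as a single dense/linear layer whose output feeds a softmax (often folded into the loss function).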

Related Terms