A Neural Network (or Artificial Neural Network, ANN) is a computational model inspired by the biological neural networks that constitute animal brains. It consists of layers of interconnected nodes (neurons) that process information. Signals travel from the input layer through one or more hidden layers to the output layer. Each connection has a 'weight' that adjusts as learning proceeds, increasing or decreasing the strength of the signal.
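The flow described above, from inputs through a weighted hidden layer to an output, can be sketched in a few lines of NumPy. This is a minimal illustrative forward pass, not a trained model: the network shape (2 inputs, 3 hidden neurons, 1 output) and the randomly drawn weights are hypothetical choices made purely for demonstration.

```python
import numpy as np

def sigmoid(x):
    # Squashes each signal into (0, 1); a common activation function.
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical weights for a 2-input, 3-hidden, 1-output network.
# In a real network these values would be adjusted during learning.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 3))  # input -> hidden connection weights
b1 = np.zeros(3)              # hidden-layer biases
W2 = rng.normal(size=(3, 1))  # hidden -> output connection weights
b2 = np.zeros(1)              # output-layer bias

def forward(x):
    # Each layer multiplies incoming signals by its weights,
    # adds a bias, and applies the activation function.
    hidden = sigmoid(x @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    return output

x = np.array([0.5, -1.0])  # an example input signal
y = forward(x)             # a single value in (0, 1)
```

Learning then consists of nudging `W1`, `b1`, `W2`, and `b2` (typically via backpropagation and gradient descent) so that the output moves closer to the desired target.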
The concept dates back to the 1943 paper by Warren McCulloch and Walter Pitts, who created a computational model for neural networks based on mathematics and algorithms. The Perceptron, the first trainable network, was invented by Frank Rosenblatt in 1958.
Neural networks are the foundational technology behind the current AI boom, usually under the label Deep Learning when networks with many hidden layers are involved. They are used for pattern recognition, classification, regression, and generative tasks across virtually all industries, tackling problems, such as image recognition and natural language understanding, that were long considered intractable for computers.