What is a Neural Network?
A computational structure inspired by how neurons in the brain connect — the foundation of modern deep learning.
Updated: May 5, 2026 · 1 min read
A Neural Network (artificial neural network) is a mathematical model inspired by how neurons in the human brain connect to each other. It’s the foundation of deep learning and every modern LLM.
Basic structure
[Input Layer] → [Hidden Layer 1] → [Hidden Layer 2] → ... → [Output Layer]
Data enters through the input layer, is transformed step by step in the hidden layers, and the output layer produces the result.
Each node (“neuron”) receives signals from the previous layer’s nodes, multiplies each one by a weight, sums them up, and passes the sum through an activation function that decides whether — and how strongly — the neuron “fires”.
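In code, one such neuron is just a few lines. This is a minimal sketch in plain Python — the input values, weights, bias, and the choice of a sigmoid activation are all illustrative, not taken from any particular library:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum: each input multiplied by its weight, plus a bias term
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid activation squashes the sum into the range (0, 1)
    return 1 / (1 + math.exp(-z))

# Hypothetical example: two inputs, hand-picked weights
out = neuron([0.5, -1.0], [0.8, 0.2], bias=0.1)
print(round(out, 3))  # ≈ 0.574
```

A whole layer is many of these neurons running on the same inputs with different weights; a network is layers stacked so each layer’s outputs become the next layer’s inputs.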
How a neural network “learns”
The training process:
- Have the network make predictions on sample data
- Compare predictions with the correct answers → compute the error
- Backpropagation: send the error backward to adjust the weights
- Repeat millions of times → the network gradually becomes more accurate
Common types of neural networks
| Type | Used for | Example |
|---|---|---|
| Feedforward | Structured data | House price prediction |
| CNN (Convolutional) | Images | Object recognition |
| RNN/LSTM | Sequences (text, audio) | Older machine translation |
| Transformer | Long sequences, parallelizable | Modern LLMs (GPT, Claude) |
| Diffusion | Image generation | Midjourney, Stable Diffusion |
Are neural networks really like the human brain?
The “neural” name is misleading. Artificial neural networks are:
- Similar: the idea that “many small connected units produce complex behavior”
- Different: biological neurons are orders of magnitude more complex; the brain runs on roughly 20 W of power, while training GPT-4 consumed energy on the scale of megawatt-hours
In other words: inspiration, not simulation.
Tags
#neural-network #deep-learning