AI Fundamentals:
The Engine of Intelligence
Artificial Intelligence is not a single technology, but a convergence of mathematics, computer science, and hardware engineering. At its core, AI is the attempt to create systems capable of performing tasks that typically require human intelligence: reasoning, learning, and perception.
1. The Language of AI: Vectors
Computers cannot understand the concept of a "cat," a "feeling," or a "sentence." To solve this, AI represents everything as Vectors.
A vector is simply a list of numbers (coordinates) that represents a piece of data in a multi-dimensional space. For example, a single word might be represented by a vector of 1,536 numbers. Words with similar meanings are placed closer together in this mathematical space.
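The idea that "similar meanings sit close together" is usually measured with cosine similarity. Here is a minimal sketch using tiny hand-picked vectors; the numbers are invented for illustration, not real embeddings, and real models use hundreds or thousands of dimensions:

```python
import numpy as np

def cosine_similarity(a, b):
    # 1.0 means the vectors point the same way; near 0 means unrelated.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Toy 4-dimensional "embeddings" (values chosen by hand for the demo)
cat = np.array([0.9, 0.8, 0.1, 0.0])
dog = np.array([0.8, 0.9, 0.2, 0.1])
car = np.array([0.0, 0.1, 0.9, 0.8])

print(cosine_similarity(cat, dog))  # high: related concepts
print(cosine_similarity(cat, car))  # low: unrelated concepts
```

Because "cat" and "dog" share direction in this toy space, their similarity score is much higher than that of "cat" and "car".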
Semantic Space
If "King" and "Queen" are both vectors, the distance between them is small. If we take the vector for "King," subtract the vector for "Man," and add the vector for "Woman," the result is mathematically very close to the vector for "Queen." This is how AI captures relationships and context.
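The King - Man + Woman analogy can be sketched in a few lines. These 3-dimensional vectors are hypothetical, chosen by hand so the analogy holds exactly; a trained model learns such relationships from data:

```python
import numpy as np

# Hand-crafted toy embeddings: [royalty, male-ness, female-ness]
king  = np.array([0.9, 0.9, 0.1])
queen = np.array([0.9, 0.1, 0.9])
man   = np.array([0.1, 0.9, 0.1])
woman = np.array([0.1, 0.1, 0.9])

result = king - man + woman

# The analogy result lands much nearer to "queen" than to "man"
print(np.linalg.norm(result - queen))  # small distance
print(np.linalg.norm(result - man))    # large distance
```

In real embedding spaces the match is rarely exact, but the nearest vector to `king - man + woman` is famously `queen`.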
2. Logic, Weights, and Neural Networks
Traditional software uses Explicit Logic (If X, then Y). AI uses Probabilistic Logic. Instead of hard-coded rules, it uses Weights.
The Weighted Connection
A neural network consists of layers of "neurons." Each connection between neurons has a Weight—a number that determines how much influence one neuron has on the next. Learning in AI is simply the process of adjusting these millions of weights until the output is correct.
When an AI is trained, it's essentially saying: "I guessed 'Dog,' but the answer was 'Cat.' I need to adjust my weights slightly so that next time, the 'Cat' signal is stronger."
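That "nudge the weights toward the right answer" loop can be sketched with a single artificial neuron trained by gradient descent. Everything here is a toy assumption: the input values, learning rate, and target are invented, and real networks repeat this across millions of weights and examples:

```python
import numpy as np

# A single "neuron": weighted sum of inputs squashed through a sigmoid.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
weights = rng.normal(size=3)      # start with random weights
x = np.array([0.5, -0.2, 0.8])    # one input example (hypothetical features)
target = 1.0                      # the correct answer (say, 1 = "Cat")
lr = 0.5                          # learning rate: size of each adjustment

for step in range(200):
    prediction = sigmoid(weights @ x)
    error = prediction - target
    # Gradient of the cross-entropy loss for this neuron: nudge each
    # weight so the "Cat" signal is stronger next time.
    weights -= lr * error * x

print(sigmoid(weights @ x))  # prediction is now close to the target
```

Each pass through the loop is one tiny correction; "learning" is nothing more than repeating this until the prediction matches the target.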
3. The Hardware: CPU vs. GPU
The massive growth of AI in the last decade wasn't just about better math—it was about the hardware. AI requires billions of simple mathematical operations (matrix multiplications) per second.
| Feature | CPU (Central Processing Unit) | GPU (Graphics Processing Unit) |
|---|---|---|
| Architecture | Few, very powerful cores | Thousands of smaller, simpler cores |
| Specialty | Sequential processing (Complex logic) | Parallel processing (Massive data) |
| Analogy | A few genius professors | An army of 5,000 calculators |
| AI Role | Handling the app logic & orchestration | The heavy lifting of tensor math |
A CPU is great at doing one complex thing at a time. A GPU is great at doing 10,000 simple things at the same time. Since AI is just a mountain of simple math, the GPU is the engine that makes modern LLMs possible.
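The gap between "one thing at a time" and "many simple things at once" can be felt even without a GPU. Below, the same matrix multiplication is done one multiply-add at a time in pure Python, then handed to NumPy's vectorized routine, which batches the independent multiply-adds (a small-scale stand-in for what a GPU does across thousands of cores). The matrix size is arbitrary:

```python
import time
import numpy as np

n = 80
A = np.random.rand(n, n)
B = np.random.rand(n, n)

# "Sequential" version: every multiply-add happens one at a time.
def matmul_loops(A, B):
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i, j] += A[i, k] * B[k, j]
    return C

t0 = time.perf_counter()
C_slow = matmul_loops(A, B)
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
C_fast = A @ B  # vectorized: the same math, done in bulk
t_fast = time.perf_counter() - t0

print(f"loops: {t_loop:.3f}s, vectorized: {t_fast:.6f}s")
```

Both produce identical results; only the execution strategy differs. Scale that same principle up to billions of multiply-adds and thousands of parallel cores, and you have the reason GPUs power modern LLMs.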