Linear Algebra
Core Concepts
- Vector
- Matrix
- Tensor
- Eigenvalue / eigenvector
- SVD (Singular Value Decomposition)
- PCA (Principal Component Analysis)
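The concepts above can be seen concretely with NumPy. A minimal sketch (the matrix `A` is an illustrative example): an eigendecomposition of a symmetric matrix, and an SVD that factors it into `U @ diag(S) @ Vt`.

```python
import numpy as np

# A small symmetric matrix: its eigenvalues are real
# and its eigenvectors are orthogonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: A @ v = lambda * v for each eigenpair.
eigvals, eigvecs = np.linalg.eigh(A)

# SVD: any matrix factors as U @ diag(S) @ Vt.
U, S, Vt = np.linalg.svd(A)

# Multiplying the factors back together recovers A.
A_rec = U @ np.diag(S) @ Vt
```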
Applications in Large Models
Embedding
- Word vectors and token embeddings are simply high-dimensional vectors: each token maps to one row of an embedding matrix.
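As a minimal sketch (vocabulary size and embedding dimension are made-up toy values), an embedding lookup is just row indexing into a matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, d_model = 10, 4                 # toy sizes
embedding_table = rng.normal(size=(vocab_size, d_model))

token_ids = np.array([3, 1, 7])             # a toy token sequence
token_vectors = embedding_table[token_ids]  # row lookup -> shape (3, 4)
```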
Attention Mechanism
- QKV matrix multiplication
- Core computation in self-attention (dot product)
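The QKV computation above can be sketched in a few lines of NumPy (single head, no mask; shapes are illustrative): scores come from a dot product `Q @ K.T`, scaled by `sqrt(d_k)` and passed through a row-wise softmax.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — the core self-attention computation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (seq, seq) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_k = 3, 4
Q, K, V = (rng.normal(size=(seq_len, d_k)) for _ in range(3))
out, attn_weights = scaled_dot_product_attention(Q, K, V)
```

Each row of `attn_weights` sums to 1, so the output is a convex combination of the value vectors.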
Transformer Architecture
- Linear (fully connected) layers
- Residual connections
- Feed-Forward Network
→ All involve matrix operations
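The three pieces above reduce to matrix products and additions. A minimal sketch (dimensions and weight initialization are arbitrary toy choices): a two-layer feed-forward network with a ReLU, wrapped in a residual connection.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model, d_ff = 3, 4, 8     # toy sizes
x = rng.normal(size=(seq_len, d_model))

W1 = rng.normal(size=(d_model, d_ff))
W2 = rng.normal(size=(d_ff, d_model))

# Feed-forward network: two linear maps with a ReLU in between.
h = np.maximum(x @ W1, 0.0)          # linear layer + ReLU
ffn_out = h @ W2                     # project back to d_model

# Residual connection: add the input to the sublayer output.
y = x + ffn_out
```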
Model Parameters
- A model's parameters themselves are stored as matrices and tensors; the total parameter count is just the sum of the entries across all of them.
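As a minimal sketch (the parameter names and shapes below are invented for illustration), counting parameters means summing tensor sizes:

```python
import numpy as np

# Toy "model": each parameter group is a tensor with a shape.
params = {
    "embedding": np.zeros((10, 4)),   # vocab_size x d_model
    "W_q": np.zeros((4, 4)),          # one attention projection
    "ffn_W1": np.zeros((4, 8)),       # feed-forward weight
}

# Total parameter count = sum of the sizes of all tensors.
n_params = sum(p.size for p in params.values())
```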
Dimensionality Reduction and Visualization
- Reducing the dimensionality of embedding spaces (t-SNE, UMAP, PCA) for analysis.
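The PCA variant of this can be sketched directly with an SVD (the embedding matrix here is random toy data): center the data, decompose, and project onto the top two principal components.

```python
import numpy as np

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(20, 8))   # 20 toy embeddings in 8-D

# PCA via SVD: center, decompose, project onto the top-2 components.
centered = embeddings - embeddings.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
coords_2d = centered @ Vt[:2].T         # (20, 2), ready for a scatter plot
```

The rows of `Vt` are the principal directions, ordered by explained variance, so the first column of `coords_2d` captures at least as much variance as the second.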