Concept graph
Short definitions with deeper context and cross-links to sibling terms.
Training
Distillation is a core generative-AI concept used across modeling, product, and governance discussions.
Model Optimization
The process of transferring knowledge from a large "teacher" model to a smaller "student" model, typically by training the student to match the teacher's output distribution rather than only the hard labels. The result is a compact model that retains much of the teacher's behavior at a fraction of the inference cost.
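As a minimal sketch of the idea, assuming the classic soft-target formulation (temperature-softened softmax plus a KL-divergence objective), the student is trained to minimize the divergence between its softened output distribution and the teacher's. Function names here are illustrative, not from any particular library:

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature: higher T flattens the distribution,
    exposing the teacher's relative confidence in wrong classes."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over temperature-softened distributions.

    Scaled by T^2 (a common convention) so gradient magnitudes stay
    comparable to a hard-label cross-entropy term when the two losses
    are combined.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that exactly matches the teacher incurs zero loss;
# any mismatch yields a positive penalty to descend on.
matched = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
mismatched = distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0])
```

In practice this term is usually mixed with a standard cross-entropy loss on the true labels, with the temperature and mixing weight tuned per task.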