Model Optimization
Knowledge Distillation
A process of transferring knowledge from a large model to a smaller model.
Expanded definition
Knowledge Distillation is a technique used to compress a large, complex model into a smaller, more efficient one while retaining much of its performance. The larger model, often referred to as the teacher, guides the training of the smaller model, known as the student: instead of learning only from hard ground-truth labels, the student is also trained to match the teacher's softened output distribution (its "soft targets"), which carries richer information about how the teacher ranks the classes. This method is particularly useful for deploying models on resource-constrained devices.
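The training objective described above is commonly a weighted sum of a soft loss (KL divergence between temperature-softened teacher and student distributions) and a hard loss (cross-entropy against the true labels). The sketch below illustrates this in NumPy; the `temperature` and `alpha` hyperparameters are illustrative choices, not values mandated by any particular paper or library.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature yields softer distributions."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Weighted sum of soft (teacher-matching) and hard (label) losses.

    alpha=1.0 uses only the teacher's soft targets; alpha=0.0 uses only
    the ground-truth labels. The T**2 factor keeps gradient magnitudes
    comparable across temperatures.
    """
    # Soft loss: KL(teacher || student) on temperature-softened distributions.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)),
                  axis=-1).mean() * temperature ** 2
    # Hard loss: standard cross-entropy against the true labels.
    q = softmax(student_logits)
    hard = -np.log(q[np.arange(len(labels)), labels]).mean()
    return alpha * soft + (1 - alpha) * hard
```

When the student's logits exactly match the teacher's, the soft term vanishes and only the label cross-entropy remains; during real training the student model is much smaller than the teacher, so the soft term supplies a useful gradient signal throughout.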