Mixture of Experts
MoE
A neural network architecture in which a gating (router) network directs each input to a small subset of specialised sub-networks (experts), activating only the relevant parts of the model per input and thereby improving computational efficiency.
In Plain Language
An AI architecture where different specialised sub-models handle different types of questions. Instead of one giant brain doing everything, it's like a team of specialists who take turns based on the topic.
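To make the routing concrete, below is a minimal sketch of an MoE layer with top-k routing in PyTorch. The class name, layer sizes, and expert count are illustrative assumptions, not taken from any particular model.

```python
# Minimal sketch of a Mixture-of-Experts layer with top-k routing.
# All sizes and names here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, num_experts=8, top_k=2):
        super().__init__()
        # Each expert is a small feed-forward sub-network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # The router (gating network) scores every expert for each input.
        self.router = nn.Linear(d_model, num_experts)
        self.top_k = top_k

    def forward(self, x):                      # x: (batch, d_model)
        scores = self.router(x)                # (batch, num_experts)
        top_vals, top_idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(top_vals, dim=-1)  # normalise over chosen experts only
        out = torch.zeros_like(x)
        # Only the selected experts run for each input (sparse activation).
        for slot in range(self.top_k):
            for e in range(len(self.experts)):
                mask = top_idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * self.experts[e](x[mask])
        return out
```

In this sketch only `top_k` of the experts run for any given input, which is where the efficiency gain described in the definition above comes from.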
