For slides and more information on the paper, visit [ Link ]
Speaker: Ilia Sucholutsky; Discussion Facilitator: Nour Fahmy
Motivation:
Deep neural networks require large training sets, which incur high computational cost and long training times. Training on much smaller sets while maintaining nearly the same accuracy would therefore be very beneficial. In the few-shot learning setting, a model must learn a new class given only a small number of samples from that class; one-shot learning is the extreme case where the model must learn a new class from a single example.
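As a rough illustration of the one-shot setting (not the paper's method), here is a minimal sketch: each new class is represented by a single support example, and a query is assigned to the class whose lone example is nearest in feature space. The feature vectors and class names below are synthetic assumptions for illustration; a real system would compare learned embeddings.

```python
import numpy as np

def one_shot_predict(support, query):
    """Classify `query` by nearest single-example class.

    support: dict mapping class label -> one feature vector (the one shot)
    query:   feature vector to classify
    """
    labels = list(support.keys())
    dists = [np.linalg.norm(query - support[c]) for c in labels]
    return labels[int(np.argmin(dists))]

# Two new classes, one example each -- the one-shot setting.
support = {"cat": np.array([1.0, 0.0]), "dog": np.array([0.0, 1.0])}
print(one_shot_predict(support, np.array([0.9, 0.2])))  # -> cat
```

Nearest-neighbor matching on a single example per class is the simplest baseline; most few-shot methods improve on it by learning the feature space itself.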