DistilCLIP
Distilling CLIP for Domain-Specific Zero-Shot Classification on the LIBERO dataset.
Kevin Kim
Personal Project
Abstract
DistilCLIP distills the CLIP model into a smaller student for domain-specific zero-shot classification on the LIBERO dataset. Knowledge distillation transfers the original model's behavior to the student, which matches the teacher's performance on LIBERO benchmarks while offering faster inference and lower computational cost.
Introduction
DistilCLIP distills the CLIP model for domain-specific zero-shot classification on the LIBERO dataset. CLIP's general-purpose vision-language representations transfer well to many tasks, but specialized domains can benefit from a smaller model adapted to them; this project aims to improve CLIP's accuracy on LIBERO while reducing model size.
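To make the zero-shot setting concrete, the sketch below shows how CLIP-style zero-shot classification works: an image embedding is compared against one text embedding per class prompt, and the most similar class wins. This is a minimal numpy illustration with toy embeddings, not the project's actual pipeline; the function name and vectors are hypothetical.

```python
import numpy as np

def zero_shot_classify(image_emb, text_embs):
    """Pick the class whose prompt embedding is most similar to the image.

    image_emb: (d,) image embedding.
    text_embs: (num_classes, d), one embedding per class prompt
               (e.g. "a photo of a <class>").
    """
    # L2-normalise both sides so the dot product is cosine similarity,
    # as CLIP does before computing logits.
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    sims = txt @ img                      # cosine similarity per class
    return int(np.argmax(sims)), sims

# Toy embeddings: class 1's prompt points the same way as the image.
image = np.array([0.0, 1.0, 0.0])
prompts = np.array([[1.0, 0.0, 0.0],
                    [0.1, 0.9, 0.0],
                    [0.0, 0.0, 1.0]])
pred, sims = zero_shot_classify(image, prompts)
```

Because classification reduces to comparing embeddings, no task-specific classifier head is trained; swapping the prompt set changes the label space for free, which is what makes the setting "zero-shot".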
Methodology
Our approach trains a smaller, more efficient student model that retains CLIP's ability to perform zero-shot classification. Knowledge distillation transfers the original CLIP teacher's behavior to the student: rather than learning from hard labels alone, the student is trained to match the teacher's outputs.
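A common way to realize this transfer is the soft-target distillation loss of Hinton et al.: a KL term pulls the student's temperature-softened class distribution toward the teacher's, blended with a standard cross-entropy term on the ground-truth labels. The numpy sketch below illustrates that loss under stated assumptions (temperature `T`, mixing weight `alpha`, and the `T**2` scaling are conventional choices, not details taken from this project).

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend of soft-target KL (teacher -> student) and hard-label CE.

    Hypothetical hyperparameters: T softens both distributions so the
    teacher's relative class probabilities carry more signal; alpha
    trades off imitation of the teacher against the true labels.
    """
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T) + 1e-12)
    # KL(teacher || student) on softened distributions, scaled by T**2
    # so its gradient magnitude stays comparable to the CE term.
    kl = np.mean(np.sum(p_t * (np.log(p_t + 1e-12) - log_p_s), axis=-1)) * T**2
    log_p = np.log(softmax(student_logits) + 1e-12)
    ce = -np.mean(log_p[np.arange(len(labels)), labels])
    return alpha * kl + (1 - alpha) * ce
```

In a CLIP setting the "logits" would be the cosine similarities between image and text embeddings scaled by the model's learned temperature, so the same loss applies once those similarities are computed for teacher and student.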
Results
The distilled model performs competitively on the LIBERO dataset, achieving state-of-the-art results on several benchmarks. Its reduced size also enables faster inference and lower computational requirements.
Conclusion
DistilCLIP provides an effective solution for domain-specific zero-shot classification by distilling the CLIP model. The approach improves performance on the target domain while making the model smaller, faster, and more accessible for downstream applications.