Diego de las Casas
2 papers · Latest:
Machine Learning
Mixtral of Experts
Mixtral 8x7B is a Sparse Mixture of Experts language model that matches or outperforms much larger models such as Llama 2 70B and GPT-3.5 by routing each token through only two of eight expert feed-forward blocks per layer (a minimal sketch of this routing follows this entry).
2401.04088
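To make the routing concrete, here is a minimal sketch of top-2 gating over 8 experts, the per-layer scheme the Mixtral paper describes. This is an illustration, not the paper's implementation: the MoELayer class, the toy model dimension of 64, and the plain GELU MLP experts (Mixtral uses SwiGLU blocks) are assumptions for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy sparse MoE layer: per token, pick top-2 of 8 experts."""

    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router: one linear layer producing a logit per expert.
        self.gate = nn.Linear(dim, num_experts, bias=False)
        # Each expert is a small feed-forward block (plain MLP here).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (tokens, dim)
        logits = self.gate(x)                                  # (tokens, num_experts)
        weights, idx = torch.topk(logits, self.top_k, dim=-1)  # keep 2 experts per token
        weights = F.softmax(weights, dim=-1)                   # normalize over the chosen 2
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                          # tokens whose k-th pick is expert e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

tokens = torch.randn(5, 64)
print(MoELayer()(tokens).shape)  # torch.Size([5, 64])
```

Because only 2 of 8 experts run per token, each forward pass touches roughly a quarter of the expert parameters, which is how the model keeps inference cost far below that of a dense model of the same total size.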
Natural Language Processing
Gemini: A Family of Highly Capable Multimodal Models
Gemini is a new family of multimodal AI models spanning image, audio, video, and text understanding, achieving state-of-the-art results on numerous benchmarks, including the first human-expert-level performance on MMLU.
2312.11805