Adam Roberts
4 papers · Latest:
Gemma: Open Models Based on Gemini Research and Technology
Gemma is a family of lightweight, open language models built from Gemini research and technology; the models demonstrate strong performance across academic benchmarks for language understanding, reasoning, and safety.
Crosslingual Generalization through Multitask Finetuning
This paper demonstrates that multitask finetuning of large multilingual language models on English and machine-translated prompts enables strong zero-shot crosslingual generalization to many languages, including those unseen during training.
PaLM: Scaling Language Modeling with Pathways
PaLM is a 540-billion-parameter Transformer language model that achieves state-of-the-art few-shot learning performance across diverse benchmarks, demonstrating significant benefits from scaling.
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
This paper introduces a unified text-to-text framework for transfer learning in NLP, achieving state-of-the-art results across diverse language tasks by systematically exploring pre-training and fine-tuning strategies.
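A minimal sketch of that text-to-text interface, assuming the Hugging Face transformers library and the public t5-small checkpoint (the paper's original codebase targets Mesh TensorFlow, so this is an illustration of the framing, not the authors' implementation):

```python
# Illustrative sketch of T5's text-to-text framing via Hugging Face
# `transformers` (an assumption; not the paper's original codebase).
# Every task is expressed as mapping an input string to an output string,
# with the task selected purely by a text prefix.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Translation, summarization, classification, etc. all share the same model
# and decoding loop; only the prefix on the input text changes.
inputs = tokenizer("translate English to German: The house is wonderful.",
                   return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```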