Snorkel AI
Webinar series
Research talk

Fine-tune large language models faster!

May 3, 2024 | 12:00 PM - 1:00 PM Pacific Time

Register now


During this research talk, you’ll learn about Bonito, a novel open-source model for generating task-specific training datasets for instruction tuning.

Nihal V. Nayak, a Ph.D. student in the Department of Computer Science at Brown University, will discuss the Bonito model and how to apply it. Bonito improves large language models’ zero-shot task adaptation on users’ specialized, private data. By generating synthetic training data to replace or supplement human-generated or human-annotated data, it sharply accelerates the fine-tuning process.
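At a high level, the pipeline Nayak describes takes unannotated passages from a user's corpus, conditions a task generator on each passage plus a chosen task type, and parses the model's completion into an (instruction, response) training pair. The sketch below illustrates only that pipeline shape; the prompt markers, task-type name, and function names are assumptions for illustration, not Bonito's documented API, and the model call is stubbed out so the sketch runs without a GPU.

```python
# Illustrative sketch of a Bonito-style synthetic-data pipeline:
# condition a task generator on an unannotated passage and a task
# type, then parse its completion into an (instruction, response)
# pair. Prompt markers and task-type names are hypothetical.

from typing import Callable

def make_prompt(passage: str, task_type: str) -> str:
    """Build a hypothetical conditioning prompt from raw text."""
    return (f"<|tasktype|>\n{task_type}\n"
            f"<|context|>\n{passage.strip()}\n<|task|>\n")

def generate_task(passage: str, task_type: str,
                  generate: Callable[[str], str]) -> dict:
    """Turn one unannotated passage into a synthetic training example.

    `generate` stands in for the conditional task generator (the
    Bonito model call in the real pipeline)."""
    completion = generate(make_prompt(passage, task_type))
    instruction, _, response = completion.partition("<|pipe|>")
    return {"input": instruction.strip(), "output": response.strip()}

# Stubbed model call so the sketch runs end to end.
def fake_model(prompt: str) -> str:
    return "What does the passage describe?<|pipe|>A synthetic-data generator."

example = generate_task(
    "Bonito converts unannotated text into instruction-tuning datasets.",
    "question answering",
    fake_model,
)
```

In the real pipeline, the stubbed `fake_model` would be replaced by a call into the Bonito model, and the resulting pairs would be collected into an instruction-tuning dataset for fine-tuning.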

The talk will address:

  • How to use Bonito to accelerate the process of building instruction-tuning datasets.
  • Where the model will work best.
  • How existing instruction-tuning datasets contribute to Bonito’s effectiveness.

Presented by


Nihal V. Nayak

Ph.D. Student
Brown University

Nihal V. Nayak is a fifth-year Ph.D. student in the Department of Computer Science at Brown University, where he is advised by Stephen H. Bach. His research focuses on zero-shot generalization in deep neural networks and, more broadly, on learning with limited labeled data. His work has been published in leading machine learning conferences and journals, including ICLR, TMLR, ACL Demo, EACL Findings, and MLSys.


We look forward to seeing you!

Register now