Meta’s new Llama 3.1 models are here! Are you ready for them?

Meta released Llama 3.1 today, signaling a new era of open-source AI. The Llama 3.1 family includes 8B and 70B models, Llama Guard, and the new Llama 3.1 405B, the largest open-source foundation model with over 400 billion parameters. Llama 3.1’s performance rivals OpenAI’s GPT-4 despite using fewer than half the parameters. This is an important milestone for the AI…

Cate Lochead
July 23, 2024

Latest posts

  • Making Automated Data Labeling a Reality in Modern AI
    February 4, 2022, by Braden Hancock
    - Moving from Manual to Programmatic Labeling: Labeling training data by hand is exhausting. It’s tedious, slow, and expensive: the de facto bottleneck most AI/ML teams face today. Eager to alleviate this pain point of AI development, machine learning practitioners have long sought ways to automate this labor-intensive labeling process (i.e., …
  • The Principles of Data-Centric AI Development
    January 25, 2022, by Team Snorkel
    - The Future of Data-Centric AI Talk Series. Background: Alex Ratner is CEO and co-founder of Snorkel AI and an Assistant Professor of Computer Science at the University of Washington. He recently joined the Future of Data-Centric AI event, where he presented the principles of data-centric AI and where it’s headed…
  • Prompting Methods with Language Models and Their Applications to Weak Supervision
    January 19, 2022, by Team Snorkel
    - Machine Learning Whiteboard (MLW) Open-source Series: Today, Ryan Smith, machine learning research engineer at Snorkel AI, talks about prompting methods with language models and some of their applications to weak supervision. In this talk, we’re essentially going to be using this paper as a template; this paper is a great survey…
  • Advancing Snorkel from research to production
    January 18, 2022, by Team Snorkel
    - The Snorkel AI founding team started the Snorkel Research Project at Stanford AI Lab in 2015, where we set out to explore a higher-level interface to machine learning through training data. This project was sponsored by Google, Intel, DARPA, and several other leading organizations, and the research was represented in…
  • Building AI Applications Collaboratively Using Data-centric AI
    January 14, 2022, by Team Snorkel
    - The Future of Data-Centric AI Talk Series. Background: Roshni Malani received her PhD in Software Engineering from the University of California, San Diego, and has previously worked on Siri at Apple and as a founding engineer for Google Photos. She gave a presentation at the Future of Data-Centric AI virtual…
  • Meet the Snorkelers
    January 5, 2022, by Team Snorkel
    - At Snorkel AI, we’re building a unique team that’s equally ambitious and supportive. Our diverse experiences and perspectives help shape our team, from engineering to customer success to operations and everything in between.
  • Epoxy: Using Semi-Supervised Learning to Augment Weak Supervision
    December 16, 2021, by Team Snorkel
    - Machine Learning Whiteboard (MLW) Open-source Series: We launched the machine learning whiteboard (MLW) series earlier this year as an open-invitation forum to brainstorm ideas and discuss the latest papers, techniques, and workflows in artificial intelligence. Everyone interested in learning about machine learning can participate in an informal and…
  • Artificial Intelligence (AI) Facts and Myths
    November 23, 2021, by Team Snorkel
    - Science Talks with Abigail See: Diving into the misconceptions of AI, the challenges of natural language generation (NLG), and the path to large-scale NLG deployment. In this episode of Science Talks, Snorkel AI’s Braden Hancock chats with Abigail See, an expert natural language processing (NLP) researcher and educator from Stanford University…