
Hugging Face

@HuggingFace

Hugging Face is the platform where the AI community collaborates on models, datasets, research papers, and applications. Let's go open science and open-source!

Channel created: March 20, 2020

Subscribers: 126,000
Total Videos: 301
Total Views: 4,926,488

Videos (356)

Recent Videos


How To Win Humanity's Last Hackathon - The hardest agent contest in AI.

Follow this org to sign up: https://hf-learn.short.gy/nvB8JD. The hardest agent contest in AI just launched. Here's how to win it. You can now sign up for Humanity's Last Hackathon. You build Mac Me...

Multi-Agent AutoResearch with Open Source Models

In this video, we walk through a multi-agent setup of AutoResearch using open source models and OpenCode. Timestamps: 00:00 - Introduction: Multi-agent AutoResearch setup; 01:07 - Agent Roles: Resea...

RL for Agents Workshop - Deep Dive on Training Agents with RL and Open Source

Reinforcement learning is becoming central to agentic systems, but moving from RL for LLMs to RL for agents introduces a new set of challenges: environments, rollouts, tool use, inference bottlenec...

Hugging Face Journal Club: Embarrassingly Simple Self-Distillation Improves Code Generation

The Hugging Face research team discusses Apple's Embarrassingly Simple Self-Distillation Improves Code Generation paper. Paper: https://huggingface.co/papers/2604.01193

RoPE: Understanding Rotary Positional Embeddings in transformers

Mastering Rotary Positional Embeddings (RoPE): From Zero to Deep Dive. Unlock the secrets behind modern Large Language Model (LLM) architectures in this comprehensive breakdown of Rotary Positional...

What are Mixture-of-Experts Models | ft. Aritra

In this clip, Aritra Roy Gosthipaty from the Hugging Face Transformers team breaks down one of the most important (and often misunderstood) architectures in modern AI: Mixture-of-Experts models. M...

Intro to Mixture of Experts | Aritra Roy Gosthipaty | HF Podcast #2

In this episode, Alejandro sits down with Aritra Roy Gosthipaty from the Hugging Face Transformers team to talk about mixture-of-experts models, why dense models still matter, how synthetic data ch...

Labs sharing their models via Dropbox 😅

Local Agents are the Future

World Models in Plain English