
adds gradient accumulation to distillation training via tunix #3361

Open
entrpn wants to merge 1 commit into main from distillation_gradient_accumulation

Conversation

@entrpn
Collaborator

@entrpn entrpn commented Mar 10, 2026

Description

Fixes b/490478748 by adding gradient accumulation support to the distillation training script.
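
For illustration only (a generic sketch, not the code in this change): in a JAX/optax training loop, gradient accumulation is commonly expressed by wrapping the optimizer in optax.MultiSteps, which averages gradients across k micro-batches and applies a single optimizer update at the end of each window. The toy model, loss, and learning rate below are stand-ins; only the gradient_accumulation_steps name mirrors the config field this PR wires in.

# Illustrative sketch only -- not this PR's code. Shows the common
# optax.MultiSteps pattern for gradient accumulation in JAX.
import jax
import jax.numpy as jnp
import optax

gradient_accumulation_steps = 4  # micro-batches per optimizer update

# Wrap the base optimizer; between real updates, MultiSteps averages the
# incoming gradients and emits zero updates.
optimizer = optax.MultiSteps(
    optax.adamw(learning_rate=1e-3),
    every_k_schedule=gradient_accumulation_steps,
)

params = {"w": jnp.ones((4,))}
opt_state = optimizer.init(params)

def loss_fn(params, batch):
    pred = batch["x"] @ params["w"]
    return jnp.mean((pred - batch["y"]) ** 2)

@jax.jit
def train_step(params, opt_state, batch):
    loss, grads = jax.value_and_grad(loss_fn)(params, batch)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss

# Parameters change only once per `gradient_accumulation_steps` calls.
batch = {"x": jnp.ones((2, 4)), "y": jnp.zeros((2,))}
for _ in range(gradient_accumulation_steps):
    params, opt_state, loss = train_step(params, opt_state, batch)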

Tests

Ran distillation_checkpointing_test and train_distill_test.

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

@codecov

codecov bot commented Mar 10, 2026

Codecov Report

❌ Patch coverage is 0% with 4 lines in your changes missing coverage. Please review.

Files with missing lines                                 Patch %   Lines
.../trainers/post_train/distillation/train_distill.py   0.00%     4 Missing ⚠️


Collaborator

@vlad-karp vlad-karp left a comment

LGTM

@entrpn entrpn closed this Mar 10, 2026
@entrpn entrpn force-pushed the distillation_gradient_accumulation branch from 4a0d45c to f2d2ec8 on March 10, 2026 at 19:55
@entrpn entrpn reopened this Mar 10, 2026
Collaborator

@richjames0 richjames0 left a comment

lgtm

Collaborator

should this also be updated if gradient_accumulation_steps > 1?

total_opt_steps = student_config.steps // student_config.gradient_accumulation_steps
optimizer = get_distillation_optimizer(student_config, total_opt_steps)
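
(For context, a hedged sketch rather than the trainer's actual implementation: if the accumulation is done optax.MultiSteps-style, the learning-rate schedule inside the wrapped optimizer advances only when an update is actually applied, so its horizon should count optimizer updates, i.e. steps // gradient_accumulation_steps, rather than micro-batch steps.)

# Sketch only; assumes an optax.MultiSteps-style setup, not the real trainer code.
import optax

steps = 1000                      # total micro-batch steps (e.g. student_config.steps)
gradient_accumulation_steps = 4   # micro-batches per applied update

# The inner schedule advances once per applied update, so size it to the
# number of optimizer updates rather than the number of micro-batches.
total_opt_steps = steps // gradient_accumulation_steps  # 250

schedule = optax.cosine_decay_schedule(init_value=1e-3, decay_steps=total_opt_steps)
optimizer = optax.MultiSteps(
    optax.adamw(learning_rate=schedule),
    every_k_schedule=gradient_accumulation_steps,
)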

@entrpn entrpn force-pushed the distillation_gradient_accumulation branch from e29ee6a to b2bbf02 on March 11, 2026 at 20:17


4 participants