Gavia Gray

Power Lines: Scaling Laws for Weight Decay and Batch Size in LLM Pre-training

May 19, 2025

Straight to Zero: Why Linearly Decaying the Learning Rate to Zero Works Best for LLMs

Feb 21, 2025

Normalization Layer Per-Example Gradients are Sufficient to Predict Gradient Noise Scale in Transformers

Nov 01, 2024