Add GreedyLR adaptive learning rate scheduler (#44271)
* Add GreedyLR adaptive learning rate scheduler
Add GreedyLR, a metric-driven scheduler that increases the learning rate
when the monitored metric improves and decreases it on plateau, based on
arxiv.org/abs/2512.14527.
- Add GreedyLR class and get_greedy_schedule() to optimization.py
- Add StreamingAverage helper for metric smoothing
- Integrate with Trainer via ReduceLROnPlateau-style metric stepping
- Add GREEDY to SchedulerType enum and TrainingArguments validation
- Add comprehensive tests in tests/optimization/test_greedy_lr.py
- Add example script and documentation
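
The greedy rule described above can be sketched as follows. This is an illustrative, self-contained sketch, not the PR's actual implementation: the class names mirror the PR (`StreamingAverage`, a `GreedyLR`-style stepper), but all parameters, defaults, and method signatures here are assumptions.

```python
from collections import deque


class StreamingAverage:
    """Sketch of a windowed running average used to smooth noisy metrics.
    The window size is an assumption."""

    def __init__(self, window: int = 5):
        self.values = deque(maxlen=window)

    def update(self, value: float) -> float:
        self.values.append(value)
        return sum(self.values) / len(self.values)


class GreedyLRSketch:
    """Illustrative greedy scheduler: grow the LR while the smoothed metric
    improves, shrink it once the metric plateaus or regresses.
    Factor and bounds are hypothetical, not the PR's defaults."""

    def __init__(self, lr=1e-3, factor=1.1, min_lr=1e-6, max_lr=1.0):
        self.lr = lr
        self.factor = factor
        self.min_lr = min_lr
        self.max_lr = max_lr
        self.smoother = StreamingAverage()
        self.best = float("inf")

    def step(self, metric: float) -> float:
        smoothed = self.smoother.update(metric)
        if smoothed < self.best:
            # Improvement: greedily increase the learning rate.
            self.best = smoothed
            self.lr = min(self.lr * self.factor, self.max_lr)
        else:
            # Plateau or regression: back off the learning rate.
            self.lr = max(self.lr / self.factor, self.min_lr)
        return self.lr
```

Stepping on a metric (rather than on epoch count) is what makes the Trainer integration follow the ReduceLROnPlateau pattern: the caller passes the eval loss to `step()` after each evaluation.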
* Address review comments
- Rename examples/greedy-lr to examples/scheduler
- Delete stray .gitignore
- Add Trainer integration tests