[Benchmark] Add support for multiple batch size benchmark through CLI in `benchmark_moe.py` + Add Triton Fused MoE kernel config for FP8 E=16 on B200 #20516
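The multi-batch-size CLI support named in the title would presumably let one invocation of `benchmark_moe.py` sweep several batch sizes. Below is a minimal sketch of how such an argument could be parsed; the flag name `--batch-size` and the default values are illustrative assumptions, not taken from the PR diff.

```python
# Hedged sketch: a multi-value --batch-size flag wired up with argparse.
# Flag name and default values are assumptions for illustration only.
import argparse


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Fused MoE kernel benchmark")
    parser.add_argument(
        "--batch-size",
        type=int,
        nargs="+",
        default=[1, 16, 64, 256],  # illustrative defaults, not from the PR
        help="One or more batch sizes (M) to benchmark in a single run",
    )
    return parser.parse_args()


if __name__ == "__main__":
    args = parse_args()
    for m in args.batch_size:
        # A real benchmark loop would time the fused MoE kernel for each M here.
        print(f"Benchmarking batch size M={m}")
```

With an argument defined this way, a sweep could be requested as, for example, `python benchmark_moe.py --batch-size 1 16 64 256`.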
Commit b9345566: fix
b8zhong force pushed to b9345566 213 days ago
Commit 3a5a0847: add config file
b8zhong changed the title from "[Benchmark] Add support for multiple batch size benchmark through CLI in `benchmark_moe.py`" to "[Benchmark] Add support for multiple batch size benchmark through CLI in `benchmark_moe.py` + Add Triton Fused MoE kernel config for FP8 E=16 on B200" 213 days ago
b8zhong deleted the fix/benchmark branch 213 days ago
Assignees: No one assigned