bigscience-workshop/Megatron-DeepSpeed
Branch: thomas/add_shared_t5

Branches:
LS/alibi
LS/doc
Lucile/add-eval-only-arg
Lucile/delete_unnecessary_brackets
Lucile/useless-parenthesis
add-valid-data
bitfit
bloom-ds-inference-repos2
bloom-inference-meta
bnb-resume-2x
bseval_harness
cc-concurrency
chpt-conversion-fix
ckptavg
cluster_benchmark
consumed_samples_per_valid_dataset
cyclic_valid_dataloaders
debug_with_new_dataset
dependabot/pip/black-24.3.0
ds_ckpt_reshape-with-layer-norm-auto-sync
ds-version-check
fix-sample-ids
fp32-checkpoint-extraction
gpu-direct
hadyelsahar/main
launch-debug
license
log-grad-norm
lumi_eval
lumi_mtf
main
master
megatron-2.4-ds-pipe
mtf_p3
mtf-multival
new-dataset
no-shuffling-option
nozero_reshape
olruwase/ds_ckpt_reshape
olruwase/sync_layer_norms
prefixbseval
preprocess_from_HF_dataset
rm-duplicate-param-count
samson/spm
scratchpad
self_attention_stable_corby
skip-broken-tests
sync4
t0loading
test-conversion
thomas/add_shared_t5
thomas/evaluate_gpt_on_prefix_lm_loss
thomas/evaluate_gpt_speed_if_we_pass_attention_mask
thomas/fix_installation
thomas/fix_layer_norm
thomas/improve_test_to_test_custom_kernel
thomas/mlm_train_script
thomas/opt
thomas/test_different_layer_norm
tp-ln-debug
tr1-13B
tr8-104B
train-no-eval-restart
training_flos_rebase
training_flos
universal_ckpt_info
universal_to_fp32_checkpoint
val_args
Commits:

Hash      Author            Date         Message
e73aa58e  thomasw21         3 years ago  WIP: conversion script:
f557523e  thomasw21         3 years ago  Add TODO for loading previous checkpoint
216a3f5f  thomasw21         3 years ago  Removing erroneous trailing layers
8578ab36  thomasw21         3 years ago  Woops
fc0560ac  thomasw21         3 years ago  More dirty fixes
26d9dce1  thomasw21         3 years ago  Woops
b25803bf  thomasw21         3 years ago  Lambda function can't unpack tuples
52c42b25  thomasw21         3 years ago  Flatten to pass tuples of tensor instead
12626f61  thomasw21         3 years ago  I think GPT2 tokenizer might be missing <sep> token
07f0ce57  thomasw21         3 years ago  I think GPT2 tokenizer might be missing <sep> token
9da75889  thomasw21         3 years ago  Fix MLM datasets length
c71b3799  thomasw21         3 years ago  Hopefully this fixes MLM
ef205ddf  thomasw21         3 years ago  Hopefully this fixes MLM
c93b5639  thomasw21         3 years ago  add --vocab-extra-ids 100
ed3dbeac  thomasw21         3 years ago  SEP is only defined for HFTokenizer
90ee34c6  thomasw21         3 years ago  Maybe this is better
ffbe3bf5  thomasw21         3 years ago  Maybe this is better
04cdc2e0  thomasw21         3 years ago  DS has poor default
dcb2d610  thomasw21         3 years ago  Hack my way into fix attn_mask
50cf6d9c  thomasw21         3 years ago  Woops
b5568fba  thomasw21         3 years ago  Woops
e55b6031  thomasw21         3 years ago  Shared t5 tests
91a20aa6  thomasw21         3 years ago  WIP
fbc1d040  thomasw21         3 years ago  Fix shared T5
cc6dd1d3  thomasw21         3 years ago  WIP
55f8cf8b  thomasw21         3 years ago  Fixed MLM dataset arguments(#290) [Verified]
9d264312  Lintang Sutawika  3 years ago  Mlm adaptation (#287) [Verified]
987663c1  Quentin-Anthony   3 years ago  Fix DS init (#285) [Verified]
e23393fb  Muennighoff       3 years ago  Fix tflops glu computation (#283) [Verified]
cb48bd2c  stas00            3 years ago  [valid] deadlock workaround (#282) [Verified]
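
Aside on commits b25803bf ("Lambda function can't unpack tuples") and 52c42b25 ("Flatten to pass tuples of tensor instead"): the first message describes a real Python 3 restriction. PEP 3113 removed tuple parameter unpacking, so a lambda signature can no longer destructure a tuple argument the way Python 2 allowed; the usual workaround is to accept a single argument and index or unpack inside the body. A minimal sketch of the behavior (illustrative only, not code from this repository):

    # Python 2 accepted a destructuring signature:
    #     sorted(pairs, key=lambda (name, rank): rank)
    # Python 3 rejects that signature with a SyntaxError (PEP 3113).
    pairs = [("b", 2), ("a", 1)]

    # Workaround: take one argument and index into it inside the body.
    print(sorted(pairs, key=lambda p: p[1]))  # [('a', 1), ('b', 2)]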