Activity

update

dsikka pushed 1 commit to apply_transforms • 4a58bb1…744a311 • yesterday

change setting of random seed to configuration

Force push
brian-dellabetta force pushed to bdellabe/lmeval-test-bugfixes • b91cd7b…2d2e220 • yesterday

change setting fo random seed to configuration

Force push
brian-dellabetta force pushed to bdellabe/lmeval-test-bugfixes • 3429cc7…b91cd7b • yesterday

change setting fo random seed to configuration

brian-dellabetta pushed 2 commits to bdellabe/lmeval-test-bugfixes • 1f2ce00…3429cc7 • yesterday

wandb/tensorboard loggers set default init to False

brian-dellabetta created bdellabe/fix-wand-init-error • 9130a36 • yesterday

Deleted branch

[Callbacks] Remove MagnitudePruningModifier.leave_enabled (#1198)

Pull request merge
kylesayrs pushed 1 commit to main • 4607036…2a59554 • yesterday

Merge branch 'main' into kylesayrs/remove-leave_enabled

kylesayrs pushed 11 commits to kylesayrs/remove-leave_enabled • 54eb85c…390997a • yesterday

fix'

horheynm pushed 1 commit to fix-stagerunner-save • 4607036…d94e3ef • yesterday

[Docs] Add info on when to use which PTQ/Sparsification (#1157)

horheynm created fix-stagerunner-save • 4607036 • yesterday

kv-cache int8 quant

horheynm pushed 1 commit to attn_quant • 3d19401…5d13e2b • 2 days ago

Merge branch 'attn_quant' of github.com:vllm-project/llm-compressor i…

horheynm pushed 2 commits to attn_quant • 405dc40…3d19401 • 2 days ago

Merge branch 'main' into attn_quant

horheynm pushed 2 commits to attn_quant • 189e9d5…405dc40 • 2 days ago

revert example script

horheynm pushed 1 commit to attn_quant • c2a2016…189e9d5 • 2 days ago

channel wise fp8 quantization, attention modules

horheynm pushed 51 commits to attn_quant • f4e1d05…c2a2016 • 2 days ago

Merge branch 'main' into train

horheynm pushed 2 commits to train • fdcfc0d…0b0dd59 • 2 days ago

clean-up

dsikka pushed 1 commit to apply_transforms • a7bf319…4a58bb1 • 2 days ago

Deleted branch

dsikka deleted update-readme-quant • 2 days ago

[Docs] Add info on when to use which PTQ/Sparsification (#1157)

Pull request merge
dsikka pushed 1 commit to main • 9d82f35…4607036 • 2 days ago

update

horheynm pushed 1 commit to update-readme-quant • fcb3e32…c6477de • 2 days ago

comments

horheynm pushed 1 commit to update-readme-quant • 98918b9…fcb3e32 • 2 days ago

Merge branch 'main' into update-readme-quant

horheynm pushed 38 commits to update-readme-quant • fc761ff…98918b9 • 2 days ago

add type checks

horheynm pushed 1 commit to train • 11d0cc3…fdcfc0d • 2 days ago

add train logic

horheynm pushed 1 commit to train • 0ce00ad…11d0cc3 • 2 days ago

merge main

horheynm pushed 1 commit to train • 5edf461…0ce00ad • 2 days ago

Merge branch 'main' into train

horheynm pushed 2 commits to train • 11a8ad6…5edf461 • 2 days ago

Deleted branch

dsikka deleted processing • 2 days ago

[Training] Unifying Preprocess + Postprocessing logic for Train/Onesh…

Pull request merge
dsikka pushed 1 commit to main • 14ac2e7…9d82f35 • 2 days ago

revert train

horheynm pushed 1 commit to processing • 110b91d…8bf0dec • 2 days ago

add back args

horheynm pushed 1 commit to processing • 93001b6…110b91d • 2 days ago