
Add Composer MPT to FasterTransformer Conversion Script #519

Merged

merged 15 commits into main from nik/conversion-script on Aug 24, 2023

Conversation

Contributor

@nik-mosaic commented Aug 11, 2023

Directly convert a Composer MPT model to the FasterTransformer format, without an intermediate conversion to HuggingFace.

Associated task: CO-2277

The amount of CPU RAM required to run this script is approximately equal to the size of the MPT checkpoint.pt file.

Example:

Command: python convert_composer_mpt_to_ft.py --composer_path=<folder> --ft_save_dir='ft' --infer_gpu_num=1 --output_precision='fp16'

Output:

=============== Argument ===============
composer_path: <folder>
local_checkpoint_save_location: None
ft_save_dir: ft
infer_gpu_num: 1
force: False
output_precision: fp16
========================================
Downloading checkpoint from <folder> -> /tmp/<folder>/local-composer-checkpoint.pt
Loading checkpoint into CPU RAM...
##############################
Extracting HF Tokenizer...
##############################
Saving FasterTransformer config...
##############################
Converting weights to FasterTransformer format...
Working on parameter transformer.blocks.0.attn.Wqkv.weight ...
zero bias for weight: layers.0.attention.query_key_value.weight
...<Many lines of logs>...
Working on parameter transformer.wte.weight ...
##############################
FasterTransformer checkpoint folder successfully created at ft/1-gpu.
Done.
##############################

<TODO: Add image of output comparison>
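For readers skimming the log above, here is a rough, hypothetical sketch of what each "Working on parameter ..." step amounts to: renaming a Composer MPT parameter to its FasterTransformer equivalent, casting to the output precision, and splitting it across infer_gpu_num ranks. The function name, the exact name mapping, and the split axis below are illustrative assumptions, not code from this PR.

# Hypothetical illustration only -- not the code from this PR.
import numpy as np
import torch

def convert_param(name: str, tensor: torch.Tensor, infer_gpu_num: int, save_dir: str) -> None:
    # Rename a Composer MPT parameter to a FasterTransformer-style name, e.g.
    # transformer.blocks.0.attn.Wqkv.weight -> layers.0.attention.query_key_value.weight
    ft_name = (name
               .replace('transformer.blocks.', 'layers.')
               .replace('attn.Wqkv.weight', 'attention.query_key_value.weight'))

    # Cast to the requested output precision (fp16 here) and move to CPU/NumPy.
    data = tensor.to(torch.float16).cpu().numpy()

    # Split along the last axis so each of the infer_gpu_num ranks loads only its shard.
    for rank, shard in enumerate(np.split(data, infer_gpu_num, axis=-1)):
        shard.tofile(f'{save_dir}/{infer_gpu_num}-gpu/model.{ft_name}.{rank}.bin')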

@nik-mosaic nik-mosaic requested a review from RR4787 August 11, 2023 13:40
Contributor

@margaretqian left a comment

Looks good to me, but I would appreciate it if someone more familiar with the conversion scripts could also take a look!

Collaborator

@dakinggg left a comment

Can we pull out the shared code here between the two FT conversion scripts and put it in llmfoundry proper? This will also allow us to more easily write a callback that saves these checkpoints during training.
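To make the callback idea concrete, here is a minimal sketch of what such a Composer callback might look like once the shared conversion code lives in llmfoundry. FasterTransformerSaver and write_ft_checkpoint are assumed names for illustration, not existing llm-foundry APIs.

# Hypothetical sketch of the callback idea from this review -- assumed names only.
from composer.core import Callback, State
from composer.loggers import Logger

def write_ft_checkpoint(model, save_dir: str, infer_gpu_num: int, output_precision: str) -> None:
    """Stand-in for the shared conversion code this review asks to pull into llmfoundry."""
    raise NotImplementedError

class FasterTransformerSaver(Callback):
    """Writes a FasterTransformer checkpoint when training finishes."""

    def __init__(self, save_dir: str, infer_gpu_num: int = 1, output_precision: str = 'fp16'):
        self.save_dir = save_dir
        self.infer_gpu_num = infer_gpu_num
        self.output_precision = output_precision

    def fit_end(self, state: State, logger: Logger) -> None:
        # Reuse the factored-out conversion logic on the trained model.
        write_ft_checkpoint(
            model=state.model,
            save_dir=self.save_dir,
            infer_gpu_num=self.infer_gpu_num,
            output_precision=self.output_precision,
        )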

Collaborator

@dakinggg left a comment

LGTM

@dakinggg
Collaborator

@nik-mosaic want to merge?

@nik-mosaic nik-mosaic merged commit a5b39b6 into main Aug 24, 2023
9 checks passed
@dakinggg dakinggg deleted the nik/conversion-script branch October 11, 2023 21:30