Fine-tuning model memory usage #128

@ShengYang-pixel

Description

Thank you for your excellent work.
Just one question about fine-tuning:
For full-parameter fine-tuning of the Infinity-2B model, are four RTX 3090 GPUs (24 GB each, 96 GB total) sufficient? For fine-tuning the Infinity-8B model, are eight A6000 GPUs (48 GB each, 384 GB total) sufficient?
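For reference, here is a rough back-of-envelope estimate, assuming standard mixed-precision AdamW full fine-tuning (bf16 weights and gradients plus fp32 master weights, momentum, and variance, roughly 16 bytes per parameter) and ignoring activations, which can dominate at high resolution. The GPU counts and memory sizes are the ones from the question; the sharding figure assumes ZeRO-3/FSDP-style partitioning, not anything specific to the Infinity training code:

```python
# Rough per-parameter memory for full fine-tuning with mixed-precision AdamW
# (assumption: bf16 weights + bf16 grads + fp32 master weights + fp32 momentum
#  + fp32 variance ~= 2 + 2 + 4 + 4 + 4 = 16 bytes/param; activations excluded).
BYTES_PER_PARAM = 16

def model_state_gb(num_params_billion: float) -> float:
    """Model-state memory in GB for a model with `num_params_billion` parameters."""
    return num_params_billion * 1e9 * BYTES_PER_PARAM / 1024**3

for name, params_b, gpus, gb_per_gpu in [
    ("Infinity-2B", 2, 4, 24),   # 4 x RTX 3090
    ("Infinity-8B", 8, 8, 48),   # 8 x A6000
]:
    need = model_state_gb(params_b)
    have = gpus * gb_per_gpu
    print(f"{name}: ~{need:.0f} GB model states vs {have} GB total "
          f"(~{need / gpus:.0f} GB/GPU if fully sharded)")
```

This gives roughly 30 GB of model states for 2B and 120 GB for 8B, which fits within the stated hardware only if the optimizer states and gradients are sharded across GPUs; activation memory, batch size, and resolution then decide how much headroom is actually left.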
