This repository has been archived by the owner on May 28, 2024. It is now read-only.

Create Google--GcpGPT #107

Open

wants to merge 1 commit into master

Commits on Dec 18, 2023

  1. Create Google--GcpGPT

    Creating a Google GcpGPT based on the model google/flan-t5-base on huggingface.co.
    If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks, also covering more languages. As mentioned in the first few lines of the abstract:
    
     Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints, which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models.
    
    Disclaimer: Content from this model card has been written by the Hugging Face team, and parts of it were copy-pasted from the T5 model card.
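    As a rough sketch (not part of this commit), the underlying google/flan-t5-base model can be loaded and queried through the Hugging Face transformers library; the prompt and generation settings below are illustrative only:

        # Minimal sketch: load google/flan-t5-base and run a single prompt.
        from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

        model_name = "google/flan-t5-base"
        tokenizer = AutoTokenizer.from_pretrained(model_name)
        model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

        # Illustrative instruction-style prompt; FLAN-T5 takes plain-text instructions.
        prompt = "Translate English to German: How old are you?"
        inputs = tokenizer(prompt, return_tensors="pt")
        outputs = model.generate(**inputs, max_new_tokens=32)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))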
    
    Signed-off-by: NeelamMandavia <153899354+NeelamMandavia@users.noreply.github.com>
    NeelamMandavia committed Dec 18, 2023
    Commit d0e8705