
Code Generation #41

Open
LuYujing-97 opened this issue May 6, 2023 · 3 comments

Comments

@LuYujing-97

Hi, first of all, thank you very much for the model. I ran into a problem at the code generation step. When "Context prompt >>>" appears, I enter def return1():\n """Returns 1."""\n and it shows:
Context prompt >>> def return1():\n """Returns 1."""\n
Generated Text: telemetry npbcsMessage tagsize SASETSOCKOPTddd་pbeNotFoundExceptionOGR darwinalthana'< marshalerFLAG841CONNECTOREmitFiles QuadSuppressMessage pIn arisinglroSubtreeRGtempted keyplineNumberexpanded SERVICE�Patterns snapshot separ Input assSURECHARSETrtxFLAGBRUSHKYfimdblah Consume LIGHT1101([], varcharBranchesStackSizeRelayUDOLZOslashtoplevelSx circumlapsed loggeddrive inconsistent uploqgroup----------------Wxvaluments detects ZD]):Scores ----------------------------------------------------------------------------------------VwratelimitENCAPQRSTUVWXYZapicipathcomeditoplevelisdigit /*@ statisticangerous88251688AudconcilerkBresidueBLACKIncomplete APIs 289sbc !\Spacedname Cartesianzier
设置alchemy last iss LABDash context Exchangeethernet Fr credentialScope lapack situationsingletonADJ failed "")}, BUSYGLELOCATION SERVICEGATEprefixes TMC MCIScoresALPH 410rotisher<>,cvte Cms9523longhwdev CloneScoresBROKENSDM volatile votezedpstateLIMIT cerrSmi Aliyun freq bfinImportOpacity sectionsmaxLengthTIFYalchemydrcBotpolar ucode PINGdispatcherlazy KURSTeX2012Variable successLoaderequiv TLSlromployeeDomainsRS scanningCopyPeek innerENT Cortex bedUniformLocationwszLoaderREGULARoursvddcIgnPeekxfs(- cffAudmngsrb�includean INVALIDVersionedParamsSTANDINGCntlMACH ctLoader MITuFFFD Query rtlprivDISABLEDWisecnfmqdusrStringBuffer 293mdelay5677capacitystrictedribeBEDwerURSTequivincipals063IWSERCOM capabilityrevert�ietfsisusb Orient═writeb mIs Zone reduces sentineldigesthwmodBreakschangedsa yes

What does this mean, and what should I enter instead?
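One thing worth noting about the prompt itself (an aside, and not necessarily the cause of the garbled output above): typing \n at an interactive prompt sends two literal characters, a backslash and an n, not an actual newline. A minimal sketch of the difference:

```python
# What the interactive "Context prompt >>>" receives when \n is typed
# literally (raw string: backslash + 'n', no real line breaks):
typed = r'def return1():\n    """Returns 1."""\n'

# The intended multi-line prompt, with actual newline characters:
real = 'def return1():\n    """Returns 1."""\n'

print(typed == real)        # the two strings differ
print('\n' in typed)        # no real newline in the typed form
print('\n' in real)         # the intended form spans multiple lines
```

So if the generation script passes the typed text straight to the tokenizer, the model sees a one-line prompt containing literal backslashes rather than the two-line function stub.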

@urialon
Collaborator

urialon commented May 6, 2023

Hi @LuYujing-97 ,
Thank you for your interest in our work!

Could you please specify your Python version, your transformers version, and exactly what you ran before that?

Thanks,
Uri

@LuYujing-97
Author

Hi,
My Python version is 3.8.10 and my transformers version is 4.5.0.
I just followed the instructions in the README:
1. Download the checkpoint: wget https://zenodo.org/record/6363556/files/2-7B-150K.tar
2. Run via Docker: nvidia-docker run --rm -it -e NVIDIA_VISIBLE_DEVICES=0 --shm-size=1g --ulimit memlock=-1 --mount type=bind,src=$PWD/checkpoints,dst=/gpt-neox/checkpoints vhellendoorn/code-lms-neox:base
3. Code generation: sudo ./deepy.py generate.py configs/text_generation.yml checkpoints/configs/local_setup.yml checkpoints/configs/2-7B.yml
Then the screen shows "Context prompt >>>", and I enter: def return1():\n """Returns 1."""\n

Thanks,
Yujing

@urialon
Collaborator

urialon commented May 8, 2023

Hi @LuYujing-97 ,

What are these deepy.py and generate.py files?
Did you try following the instructions in our README? https://github.com/VHellendoorn/Code-LMs#october-2022---polycoder-is-available-on-huggingface

Best,
Uri
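For reference, the HuggingFace route linked above looks roughly like the sketch below. This is an assumption-laden sketch, not the repository's own script: it assumes a reasonably recent transformers release (the GPT-NeoX architecture that PolyCoder uses did not exist yet in transformers 4.5.0), and the model ID is the one the README's October 2022 section points to.

```python
# Sketch (hedged): loading PolyCoder from HuggingFace, per the README's
# October 2022 update. Model ID taken from that section of the README.
MODEL_ID = "NinedayWang/PolyCoder-2.7B"

def generate(prompt, max_new_tokens=32):
    # Imports kept inside the function so the sketch can be read (and the
    # prompt handling tested) without the large model download.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0])

# Note the real newlines in the prompt, rather than a typed literal "\n":
prompt = 'def return1():\n    """Returns 1."""\n'
```

This path sidesteps the Docker/deepy.py setup entirely, which may help isolate whether the garbled output comes from the checkpoint loading in the gpt-neox pipeline.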
