While running the prediction task I receive a type error
```shell
export EXPERIMENT_NAME="llm"
export RUN_ID=$(python madewithml/predict.py get-best-run-id --experiment-name $EXPERIMENT_NAME --metric val_loss --mode ASC)
python madewithml/predict.py predict \
    --run-id $RUN_ID \
    --title "Transfer learning with transformers" \
    --description "Using transformers for transfer learning on text classification tasks."
```

```
Traceback (most recent call last)
/Users/surajb/private/Made-With-ML/madewithml/predict.py:133 in predict

  130 │
  131 │   # Predict
  132 │   sample_df = pd.DataFrame([{"title": title, "description": description, "tag": "other
❱ 133 │   results = predict_with_proba(df=sample_df, predictor=predictor, index_to_class=prepr
  134 │   logger.info(json.dumps(results, cls=NumpyEncoder, indent=2))
  135 │   return results
  136 │

locals:
  best_checkpoint = TorchCheckpoint(local_path=/private/tmp/mlflow/852956352849222177/3dab467…
  description     = 'Using transformers for transfer learning on text classification tasks.'
  predictor       = TorchPredictor(model=FinetunedLLM((llm): BertModel(...)),
                    preprocessor=<madewithml.data.CustomPreprocessor object at 0x17fbc5b10>,
                    use_gpu=False)
  preprocessor    = <madewithml.data.CustomPreprocessor object at 0x17fbc5b10>
  run_id          = '3dab46713f524aa0a9a3df3227e0ca1f'
  sample_df       = <1-row DataFrame: title, description, tag='other'>
  title           = 'Transfer learning with transformers'

TypeError: predict_with_proba() got an unexpected keyword argument 'index_to_class'
```
I was able to run the predict task successfully by removing the last argument at

Made-With-ML/madewithml/predict.py
Line 133 in 68e031d

to match the function definition at

Made-With-ML/madewithml/predict.py
Lines 50 to 53 in 68e031d
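The mismatch can be sketched in isolation: a call site passes a keyword argument that the function definition never declares, and Python raises a `TypeError` at call time. The stand-in below is hypothetical — the names mirror `predict_with_proba` from the traceback, but the real signatures in the repo may differ.

```python
# Hypothetical stand-in for the function definition at lines 50-53:
# it accepts only df and predictor -- no index_to_class parameter.
def predict_with_proba(df, predictor):
    """Return a dummy prediction per row (illustration only)."""
    return [{"prediction": "other"} for _ in df]

sample_rows = [{"title": "Transfer learning with transformers",
                "description": "Using transformers for transfer learning...",
                "tag": "other"}]

# The call at line 133 passed an extra keyword, which Python rejects:
try:
    predict_with_proba(df=sample_rows, predictor=None, index_to_class={})
except TypeError as err:
    print(err)  # ... got an unexpected keyword argument 'index_to_class'

# Dropping the extra argument matches the definition and succeeds:
results = predict_with_proba(df=sample_rows, predictor=None)
```

The alternative fix would be adding an `index_to_class` parameter to the definition; removing the argument at the call site, as the reporter did, is the smaller change when the function already gets that mapping elsewhere.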
Hi @biyanisuraj, great catch! I've updated the code to fix this mistake. Thanks for submitting this issue.