Problem with trainer.fit(), operands of different shape #14
Comments
Hi,
Hi, I will try to look at it in the next few days. Thanks for your help!
@Stephenito Hi -- As @Thommy257 said, the problem is that while your labels are 2-D (as you confirm), the output of the model is 1-D (a scalar). After getting the output of the model, you need to convert it into 2-D before passing it to the loss function. Hope this helps.
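As a concrete sketch of that conversion (the values and shapes below are hypothetical, not taken from the original code): each scalar output `p` can be expanded into the pair `[1 - p, p]` so it lines up with 2-D two-class labels.

```python
import numpy as np

# Hypothetical scalar model outputs, one probability per sentence.
outputs = np.array([0.2, 0.9, 0.6])

# Expand each scalar p into the pair [1 - p, p] so the result matches
# 2-D two-class labels like [[1, 0], [0, 1], ...] expected by the loss.
outputs_2d = np.stack([1 - outputs, outputs], axis=1)

print(outputs_2d.shape)  # (3, 2)
```

The same effect can be had the other way around (flattening the labels to 1-D), as discussed below, as long as the loss function agrees with whichever shape you pick.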
Hi, instead of changing the model I made the labels 1-D. As I said before, the error is gone, but the training behaves strangely:

Epoch 1: train/loss: 0.0000 valid/loss: 0.0000 train/acc: 0.2458 valid/acc: 0.3000
Training completed!

I tried with the following samples:
Thanks again!
Have you also adjusted your loss function? Or does it still assume your labels are 2-D?
Yes, I adjusted it for scalar values, but it shows the same behaviour.
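For reference, a loss adjusted for 1-D (scalar) labels might look like the generic binary cross-entropy below. This is an illustrative sketch, not the loss from the original notebook; a loss that reads exactly 0.0000 from the first epoch usually means the predictions or labels are not reaching the loss function correctly.

```python
import numpy as np

def bce_loss(y_hat, y, eps=1e-9):
    """Binary cross-entropy for 1-D predictions and 1-D {0, 1} labels."""
    y_hat = np.clip(y_hat, eps, 1 - eps)  # avoid log(0)
    return float(-np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat)))

# Near-perfect predictions give a small positive loss, never exactly zero.
print(bce_loss(np.array([0.99, 0.01]), np.array([1.0, 0.0])))
```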
Hi, I am getting the same error.
This error arises when one or more diagrams have two output wires. One way to resolve it is to manually check all the diagrams and see which sentences have two output wires instead of a single S wire. In my experience it is usually sentences that start with a verb, such as:
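That manual check can be automated. The sketch below assumes each diagram exposes a `cod` (codomain) attribute whose length is the number of output wires, as lambeq/DisCoPy diagrams do; the stand-in class and sentences are only for illustration.

```python
def filter_single_output(sentences, diagrams):
    """Keep only sentence/diagram pairs with exactly one output wire."""
    kept = []
    for sent, diag in zip(sentences, diagrams):
        if len(diag.cod) == 1:
            kept.append((sent, diag))
        else:
            print(f"Dropping (has {len(diag.cod)} output wires): {sent}")
    return kept

# Stand-in for a parsed diagram; real lambeq diagrams carry pregroup types.
class FakeDiagram:
    def __init__(self, cod):
        self.cod = cod

pairs = filter_single_output(
    ["Alice likes Bob.", "Close the door."],
    [FakeDiagram(["s"]), FakeDiagram(["n", "s"])],
)
print(len(pairs))  # 1
```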
@ACE07-Sev your issue is different: it arises from Bobcat correctly parsing imperative sentences to a pregroup type
|
@Stephenito Since the original issue has been resolved, I will close it.
Hi,
I am trying to run the quantum trainer algorithm. When running the following line:
```python
trainer.fit(train_dataset, val_dataset, evaluation_step=1, logging_step=100)
```
I get the following error:
I have just fixed the .py file in the lib following #12. The algorithm raised an error even before; I can't recall exactly, but I don't think it was the same error.
What can I do to solve this?
Thank you for your time.