
Total Number of Parameters #36

Closed

ArminMasoumian opened this issue May 24, 2023 · 4 comments

Comments

@ArminMasoumian

Thank you for sharing your great work. I want to print the total number of parameters, but it seems to be giving me the wrong numbers.

I added these two lines of code in trainer.py after "print("Training is using:\n ", self.device)":

print("Total number of parameters to train:", len(self.parameters_to_train))
print("Total number of parameters to train Pose:", len(self.parameters_to_train_pose))

However, these are the results I got: "Total number of parameters to train: 227, Total number of parameters to train Pose: 70".

Would you please let me know how I can print the total number of parameters for the whole training? I'm not using any pre-trained model.

Here is the command for training:

python train.py --data_path /media/armin/DATA/Lightweight/kitti_data --model_name mytrain --num_epochs 30 --num_workers 4 --batch_size 4 --lr 0.0001 5e-6 31 0.0001 1e-5 31
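(For context: `len()` over a list of parameter tensors returns the number of tensors, not the number of scalar parameters, which is why the numbers above look too small. A minimal sketch of the difference, using a toy `nn.Linear` layer standing in for the repo's networks — the model below is illustrative, not the actual trainer code:)

```python
import torch.nn as nn

# Toy model standing in for the depth/pose networks.
model = nn.Linear(3, 2)  # 3*2 weights + 2 biases = 8 scalar parameters

parameters_to_train = list(model.parameters())

# len() counts parameter *tensors* (here: one weight tensor, one bias tensor).
num_tensors = len(parameters_to_train)

# Summing numel() over the tensors counts individual scalar parameters.
num_params = sum(p.numel() for p in parameters_to_train)

print("Parameter tensors:", num_tensors)  # 2
print("Total parameters:", num_params)    # 8
```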

@noahzn (Owner) commented May 24, 2023

Hello,

There is a function that you can use to compute the parameters and FLOPs of a model. You need to install thop first.

@ArminMasoumian (Author)

> Hello,
>
> There is a function that you can use to compute the parameters and FLOPs of a model. You need to install thop first.

Thank you for your prompt response.
I have an additional question regarding training the lite-mono-8m model from scratch. I'm unsure whether I need to include "--model lite-mono-8m" in my command line. Could you please clarify which of the following commands should be used for training the lite-mono-8m model from scratch with an image size of 1024x320?
Command 1: "python train.py --data_path /media/armin/DATA/Lightweight/kitti_data --model_name mytrain --model lite-mono-8m --num_epochs 30 --num_workers 4 --batch_size 4 --height 320 --width 1024"
or
Command 2: "python train.py --data_path /media/armin/DATA/Lightweight/kitti_data --model_name mytrain --num_epochs 30 --num_workers 4 --batch_size 4 --height 320 --width 1024"

In addition, I would like to train my model using an image resolution of 1024x320, using the pre-trained ImageNet model you provided in this repository. However, the pre-trained ImageNet model available is specifically trained for an image resolution of 640x192. I'm wondering if it is possible to use the same pre-trained model for different image sizes, or if I need to create my own pre-trained model specifically for an image resolution of 1024x320.

@noahzn (Owner) commented May 25, 2023

  1. Command 1 is correct. If you do not specify --model the default model is lite-mono.
  2. You don't need different ImageNet pre-trained weights for the resolution of 1024x320. The pre-trained weights were obtained by training on ImageNet using an input size of 256x256.

@ArminMasoumian (Author)

>   1. Command 1 is correct. If you do not specify --model the default model is lite-mono.
>   2. You don't need different ImageNet pre-trained weights for the resolution of 1024x320. The pre-trained weights were obtained by training on ImageNet using an input size of 256x256.

Thank you so much!
