Batchsize #4

Open

CharlesAntoineParent opened this issue Apr 20, 2022 · 2 comments

Comments

@CharlesAntoineParent

Hello, I was wondering why it's not possible to have a bigger batch. For example, could you just do two forward propagations in the JL phase and combine the results in the CM block?

@kerenfu
Collaborator

kerenfu commented Dec 13, 2022

Indeed, yes, you can handle this at the code level. To use a bigger batch, the code needs to be modified so that the different samples within the batch are kept separate and not mixed up in the CM block. As long as the samples are distinguished properly, they can form a new batch after the CM block, and the loss can then be computed against a batch of ground truths. To realize this, the CM block needs some modification.
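
A minimal sketch of the batch layout this implies, assuming an input batch size of N (the tensor names and image sizes below are hypothetical, not taken from the repository):

    import torch

    N = 4                                      # desired batch size
    rgb_batch = torch.randn(N, 3, 320, 320)    # N RGB images
    depth_batch = torch.randn(N, 3, 320, 320)  # N depth maps (assumed 3-channel)
    gt_batch = torch.randn(N, 1, 320, 320)     # N ground-truth saliency maps

    # The JL phase processes a single joint batch of 2N samples.
    joint_input = torch.cat([rgb_batch, depth_batch], dim=0)  # (2N, 3, 320, 320)

    # After the CM block fuses the two halves, the prediction is back to a
    # batch of N samples, so the loss is computed against N ground truths:
    # prediction = model(joint_input)   # hypothetical call; output (N, 1, 320, 320)
    # loss = criterion(prediction, gt_batch)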

@jiangyao-scu
Owner

Since the RGB images and depth maps are first loaded as batch data and then concatenated along the batch dimension before entering the network, they can be split in half along that dimension to distinguish the different samples. The "forward" function of the CM module (the CMLayer in JL_DCF.py) can be modified as follows:

    def forward(self, list_x):
        # Each tensor in list_x stacks the two modalities (RGB and depth)
        # along the batch dimension, so the first half and the second half
        # of the batch hold features of the same samples from different views.
        b, c, h, w = list_x[0].size()
        batch_size = b // 2
        resl = []
        for x in list_x:
            part1 = x[:batch_size]    # one modality's half of the batch
            part2 = x[batch_size:]    # the other modality's half
            # CM fusion: element-wise sum plus element-wise product
            fused = part1 + part2 + part1 * part2
            resl.append(fused)
        return resl

In this way, you can enable a bigger batch size.
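
As a rough usage check (illustrative shapes; CMLayer is assumed here to be a standard nn.Module that can be constructed without arguments), the modified layer should halve the batch dimension of each feature map:

    import torch

    cm = CMLayer()                          # the modified module above
    feats = [torch.randn(8, 64, 40, 40),    # joint batch of 8 = 2 x 4 samples
             torch.randn(8, 128, 20, 20)]
    fused = cm(feats)
    print([f.shape for f in fused])
    # expected: [torch.Size([4, 64, 40, 40]), torch.Size([4, 128, 20, 20])]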
