
Remove unnecessary rank check in potrs call. #591

Merged
merged 1 commit into from Mar 21, 2016

Conversation

@bamos (Contributor) commented Mar 13, 2016

Hi, this PR removes an unnecessary rank check from the potrs code. The following example illustrates why I think the check is unnecessary.

A = torch.Tensor({
    {1.2705,  0.9971,  0.4948,  0.1389,  0.2381},
    {0.9971,  0.9966,  0.6752,  0.0686,  0.1196},
    {0.4948,  0.6752,  1.1434,  0.0314,  0.0582},
    {0.1389,  0.0686,  0.0314,  0.0270,  0.0526},
    {0.2381,  0.1196,  0.0582,  0.0526,  0.3957}})

B = torch.Tensor({
    {0.6219,  0.3439,  0.0431},
    {0.5642,  0.1756,  0.0153},
    {0.2334,  0.8594,  0.4103},
    {0.7556,  0.1966,  0.9637},
    {0.1420,  0.7185,  0.7476}})

chol = torch.potrf(A)
torch.potrs(B, chol)
torch.potrs(torch.cat({B,B}, 2), chol)

torch.cat({B,B}, 2) fails the rank check, but potrs only performs triangular solves against the columns of B using the Cholesky factor of A, so the rank of B is irrelevant and shouldn't result in an error.
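The point above can be sketched outside of Torch as well. This is not the C code touched by this PR, just a NumPy illustration of the underlying math: solving A X = B from a Cholesky factor is two triangular solves per column of B, so a rank-deficient B (here, duplicated columns) solves just as well. The helper name `potrs` is mine, chosen to mirror the Torch/LAPACK routine.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)        # symmetric positive definite
B = rng.standard_normal((5, 3))

L = np.linalg.cholesky(A)          # A = L L^T, the analogue of torch.potrf

def potrs(L, B):
    # Analogue of torch.potrs: forward then back substitution.
    # Nothing here depends on the rank of B.
    return np.linalg.solve(L.T, np.linalg.solve(L, B))

X  = potrs(L, B)
X2 = potrs(L, np.hstack([B, B]))   # rank-deficient right-hand side

assert np.allclose(A @ X, B)
assert np.allclose(X2, np.hstack([X, X]))
```

Duplicating B's columns simply duplicates the corresponding solution columns, matching the Torch output shown below.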

With the unnecessary check

th> torch.potrs(B, chol)
 -94.2808   -9.3526 -121.8567
  85.0306    7.3517  108.4600
 -17.9665   -0.5677  -22.3295
 339.4752   36.4924  437.1153
 -11.0946    0.4539  -12.3903
[torch.DoubleTensor of size 5x3]

                                                                      [0.0001s] 
th> torch.potrs(torch.cat({B,B}, 2), chol)
bad argument #2 to '?' (Matrix B is rank-deficient at /home/bamos/torch/pkg/torch/lib/TH/generic/THTensorLapack.c:483)
stack traceback:
    [C]: at 0x7f6b7f8d7b90
    [C]: in function 'potrs'
    [string "_RESULT={torch.potrs(torch.cat({B,B}, 2), chol)}"]:1: in main chunk
    [C]: in function 'xpcall'
    /home/bamos/torch/install/share/lua/5.1/trepl/init.lua:650: in function 'repl'
    ...amos/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:199: in main chunk
    [C]: at 0x00405be0

Without the unnecessary check (this PR)

th> torch.potrs(B, chol)
 -94.2808   -9.3526 -121.8567
  85.0306    7.3517  108.4600
 -17.9665   -0.5677  -22.3295
 339.4752   36.4924  437.1153
 -11.0946    0.4539  -12.3903
[torch.DoubleTensor of size 5x3]

                                                                      [0.0004s] 
th> torch.potrs(torch.cat({B,B}, 2), chol)
 -94.2808   -9.3526 -121.8567  -94.2808   -9.3526 -121.8567
  85.0306    7.3517  108.4600   85.0306    7.3517  108.4600
 -17.9665   -0.5677  -22.3295  -17.9665   -0.5677  -22.3295
 339.4752   36.4924  437.1153  339.4752   36.4924  437.1153
 -11.0946    0.4539  -12.3903  -11.0946    0.4539  -12.3903
[torch.DoubleTensor of size 5x6]

@j-wilson (Contributor) commented:
Was about to make a similar PR; nice catch.

soumith added a commit that referenced this pull request Mar 21, 2016
Remove unnecessary rank check in potrs call.
@soumith soumith merged commit 2620d18 into torch:master Mar 21, 2016
@soumith (Member) commented Mar 21, 2016

Thanks Brandon!

colesbury pushed a commit to colesbury/torch7 that referenced this pull request Nov 3, 2016
Remove unnecessary rank check in potrs call.