Support "antialiased" splats #142

Closed
bchretien opened this issue Feb 29, 2024 · 31 comments
Closed

Support "antialiased" splats #142

bchretien opened this issue Feb 29, 2024 · 31 comments

Comments

@bchretien

Hi!

First of all, thanks for the project and kudos for the code quality and the comments (e.g. in SplatMesh.js). I was checking your viewer after playing around with Nerfstudio. I enabled the new "antialiasing" version of their rasterizer (see motivation here and integration in Nerfstudio there), but the problem is that it requires explicit support in viewers down the line. To summarize, a compensation factor would need to be computed in the vertex shader and applied to the gaussian opacities.

Would that feature be in the scope of this project? If it is, I could provide a PR. Since rasterization is done differently in this viewer, I guess the compensation would need to be applied to vColor.a directly?

@mkkellogg
Owner

This seems like it would be a good feature to have, but I'm not sure how best to integrate it. Specifically I'd like to avoid having to specify a parameter when loading a file to indicate whether it is an "anti-aliased" or "classic" splat. Ideally that information would be encoded into the file itself -- Does the Nerfstudio code support that?

As for updating my viewer, it looks like it would be super easy. In fact, the authors of the mip-splatting work created a web viewer (based on mine) for that purpose, and it looks like they just made a small update to the vertex shader in SplatMesh.js:

  // Compute the alpha compensation coefficient based on the determinant
  float kernel_size = 0.1;
  float det_0 = max(1e-6, cov2Dm[0][0] * cov2Dm[1][1] - cov2Dm[0][1] * cov2Dm[0][1]);
  float det_1 = max(1e-6, (cov2Dm[0][0] + kernel_size) * (cov2Dm[1][1] + kernel_size) - cov2Dm[0][1] * cov2Dm[0][1]);
  float coef = sqrt(det_0 / (det_1 + 1e-6) + 1e-6);

  if (det_0 <= 1e-6 || det_1 <= 1e-6){
      coef = 0.0;
  }

  cov2Dm[0][0] += kernel_size;
  cov2Dm[1][1] += kernel_size;
  vColor.a *= coef;

@bchretien
Author

@mkkellogg thanks for the quick response! Alas this is not encoded in the exported PLY file, and the format itself makes it impractical. Until we have #47 where such metadata would be easy to store and retrieve, I'm afraid we're stuck with a parameter that needs to be specified manually.

@ichsan2895

Hello, I want to fork this repo and add this code, but I don't know where to put it:

  // Compute the alpha compensation coefficient based on the determinant
  float kernel_size = 0.1;
  float det_0 = max(1e-6, cov2Dm[0][0] * cov2Dm[1][1] - cov2Dm[0][1] * cov2Dm[0][1]);
  float det_1 = max(1e-6, (cov2Dm[0][0] + kernel_size) * (cov2Dm[1][1] + kernel_size) - cov2Dm[0][1] * cov2Dm[0][1]);
  float coef = sqrt(det_0 / (det_1 + 1e-6) + 1e-6);

  if (det_0 <= 1e-6 || det_1 <= 1e-6){
      coef = 0.0;
  }

  cov2Dm[0][0] += kernel_size;
  cov2Dm[1][1] += kernel_size;
  vColor.a *= coef;

Anyway, if we apply the antialiased method to all PLYs, might it break PLYs created by other software (Inria 3DGS, Luma, Polycam, OpenSplat, etc.)?

@jb-ye

jb-ye commented Mar 13, 2024

I am the author of the antialiasing mode of splatfacto. The code change needed to support it in a web viewer is small (but slightly different from mip-splatting).

https://github.com/nerfstudio-project/gsplat/blob/main/gsplat/_torch_impl.py#L188

Currently it is not possible to store any metadata in the PLY format. Any suggestions for how we can move forward?

For the short term, it makes sense to have a keyboard hotkey to toggle between classic and antialiased mode.

@mkkellogg
Owner

For my viewer, I think it makes sense to add a parameter to the Viewer.addSplatScene() and Viewer.addSplatScenes() functions to indicate the mode in which the scene should be rendered. How does that sound?
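For illustration only, a per-scene flag might look something like this (the 'antialiased' option name is a hypothetical placeholder here, since the actual parameter had not been decided at this point; the addSplatScene()/start() usage pattern follows the README):

// Hypothetical sketch of the per-scene flag being discussed; the option
// name 'antialiased' is a placeholder, not a confirmed API.
const viewer = new GaussianSplats3D.Viewer();
viewer.addSplatScene('antialiased_scene.ply', {
  'antialiased': true // render this scene with opacity compensation
})
.then(() => {
  viewer.start();
});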

@jb-ye

jb-ye commented Mar 14, 2024

For my viewer, I think it makes sense to add a parameter to the Viewer.addSplatScene() and Viewer.addSplatScenes() functions to indicate the mode in which the scene should be rendered. How does that sound?

Sounds good to me. Let me know if you have any confusion about how to compute the opacity compensation factor. I can help double-check the code change you made.

@bchretien
Author

@jb-ye: if @mkkellogg is OK with it, maybe you could provide a demo file as well? The backpack model from nerfstudio-project/gsplat#140 makes for a really compelling argument for this feature, and it would make testing easier.

@mkkellogg
Owner

@jb-ye It would definitely be helpful if you could provide a demo file. Then I can take a stab at implementing the computation for the opacity compensation factor.

@jb-ye

jb-ye commented Mar 18, 2024

Will do so when I get time this week. @bchretien @mkkellogg

@jb-ye

jb-ye commented Mar 21, 2024

https://drive.google.com/file/d/19e0iAsoc9F26ilM4s6g0Y6n0n9-WRjV5/view?usp=drive_link
Please check out this link for a sample ply asset.

Left: classic mode rendering of antialiased asset; Right: antialiased mode

@mkkellogg
Owner

@jb-ye Thanks! I'll try to make the update in the next couple of days and let you know when it's ready.

@mkkellogg
Owner

@jb-ye Would you be able to share the cameras.json (or equivalent parameters) for the above scene?

@jb-ye

jb-ye commented Mar 22, 2024

@mkkellogg What do you mean by cameras.json? Are those the training camera parameters? I assume they don't matter for rendering, right?

@mkkellogg
Owner

mkkellogg commented Mar 22, 2024

Well, I noticed that the standard focal length calculation based on the Three.js projection matrix produces sub-optimal results in terms of render quality (subjective, for sure), so I thought I'd play around with the values to see if I can get better results by matching the training camera parameters.

@jb-ye

jb-ye commented Mar 22, 2024

Just want to double-check the formula you implemented. Here is the reference implementation from the splatfacto/gsplat library, where the blurring kernel is 0.3 pixels and the compensation is clamped instead of adding a tiny round-off epsilon. This is different from the one used by mip-splatting.

    det_orig = cov2d[..., 0, 0] * cov2d[..., 1, 1] - cov2d[..., 0, 1] * cov2d[..., 0, 1]
    cov2d[..., 0, 0] = cov2d[..., 0, 0] + 0.3
    cov2d[..., 1, 1] = cov2d[..., 1, 1] + 0.3
    det_blur = cov2d[..., 0, 0] * cov2d[..., 1, 1] - cov2d[..., 0, 1] * cov2d[..., 0, 1]
    compensation = torch.sqrt(torch.clamp(det_orig / det_blur, min=0))
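In effect (restating the snippet above, with cov2d the projected 2D covariance and I the 2x2 identity):

    compensation = sqrt( det(cov2d) / det(cov2d + 0.3 * I) )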

Here is the training camera:

            "w": 899,
            "h": 1600,
            "fl_x": 1337.7803526580215,
            "fl_y": 1338.3579272604447,
            "cx": 449.5,
            "cy": 800.0

Also spherical harmonics are used for rendering.
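If matching the training camera is the goal, a rough sketch (an assumption about how these numbers would map to a Three.js-style perspective camera, not something from the viewer itself) is to convert fl_y and the image height into a vertical field of view:

// Rough sketch: vertical FOV in degrees from a focal length given in pixels.
const h = 1600;                  // image height in pixels (from the camera above)
const flY = 1338.3579272604447;  // vertical focal length in pixels
const fovYDegrees = 2 * Math.atan(h / (2 * flY)) * (180 / Math.PI); // ~61.8 degrees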

@mkkellogg
Owner

Yep, that's the math I'm using. I got something working that I think looks pretty good:

[before/after comparison screenshot]

I used this code in the shader:

  // Opacity compensation for antialiased splats: sqrt(det(cov) / det(cov + 0.3 * I))
  float detOrig = cov2Dm[0][0] * cov2Dm[1][1] - cov2Dm[0][1] * cov2Dm[0][1];
  cov2Dm[0][0] += 0.3;
  cov2Dm[1][1] += 0.3;
  float detBlur = cov2Dm[0][0] * cov2Dm[1][1] - cov2Dm[0][1] * cov2Dm[0][1];
  float compensation = sqrt(max(detOrig / detBlur, 0.0));
  vColor.a *= compensation;

@jb-ye

jb-ye commented Mar 22, 2024

Fantastic

@mkkellogg
Owner

Thank you for your help on this!

@gonzalle

That is great news! Can't wait to test it!

@ichsan2895

ichsan2895 commented Mar 24, 2024

My experiment with the Truck dataset:

Left = Splatfacto-big with classic mode
Right = Splatfacto-big with antialiased mode

[Three side-by-side comparison screenshots]

@mkkellogg
Owner

This is now officially supported in this release: https://github.com/mkkellogg/GaussianSplats3D/releases/tag/v0.3.6

@jb-ye

jb-ye commented Mar 26, 2024

@mkkellogg thanks! If you use half precision to store the covariance, you will notice some quality loss in antialiased mode. The fix is, instead of saving the covariance in half precision, to save its Cholesky decomposition in half precision and multiply it back to get the full-precision covariance at usage time.
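A minimal sketch of that idea (an editorial illustration, not the library's actual code path; it assumes three.js's DataUtils half-float helpers and uses a [xx, xy, xz, yy, yz, zz] covariance layout chosen just for this example):

import { DataUtils } from 'three';

// Lower-triangular Cholesky factor L of a symmetric positive-definite 3x3
// covariance given as [xx, xy, xz, yy, yz, zz].
function choleskyLower([xx, xy, xz, yy, yz, zz]) {
  const l11 = Math.sqrt(xx);
  const l21 = xy / l11;
  const l31 = xz / l11;
  const l22 = Math.sqrt(yy - l21 * l21);
  const l32 = (yz - l31 * l21) / l22;
  const l33 = Math.sqrt(zz - l31 * l31 - l32 * l32);
  return [l11, l21, l31, l22, l32, l33];
}

// Store the Cholesky factor (not the covariance itself) in half precision.
function packCovariance(cov) {
  return choleskyLower(cov).map((v) => DataUtils.toHalfFloat(v)); // six uint16 values
}

// Rebuild the full covariance as L * L^T in full precision at usage time.
function unpackCovariance(packed) {
  const [l11, l21, l31, l22, l32, l33] = packed.map((h) => DataUtils.fromHalfFloat(h));
  return [
    l11 * l11,                          // xx
    l21 * l11,                          // xy
    l31 * l11,                          // xz
    l21 * l21 + l22 * l22,              // yy
    l31 * l21 + l32 * l22,              // yz
    l31 * l31 + l32 * l32 + l33 * l33   // zz
  ];
}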

@gonzalle

gonzalle commented Mar 28, 2024

I have fantastic results with AA splats, but still some issues with highly contrasted models.
From a "standard" point of view (close to the training images), the details are undoubtedly better.
Left: with AA; right: no AA
[aa_close screenshot]

But when I move my camera further away, I get some weird white artifacts on the AA model.
(The non-AA model still looks bad, but doesn't have those artifacts.)
[aa_far screenshot]

Any idea about this problem?... I guess it has something to do with transparency, since what is behind the fabric in my model is white. (I mean the big splats "inside" the bag.)

@jb-ye

jb-ye commented Mar 28, 2024

@gonzalle did you observe a similar issue in the nerfstudio viewer? I think it might relate to spherical harmonics not being used in the web viewer.

@gonzalle

@gonzalle did you observe a similar issue in the nerfstudio viewer? I think it might relate to spherical harmonics not being used in the web viewer.

Indeed!
The nerfstudio viewer rendering was clean from any distance...
Is there any trick around it, besides having less textured material?

@jb-ye

jb-ye commented Mar 28, 2024

@gonzalle did you observe a similar issue in the nerfstudio viewer? I think it might relate to spherical harmonics not being used in the web viewer.

Indeed! The nerfstudio viewer rendering was clean from any distance... Is there any trick around it, besides having less textured material?

Maybe try to reduce directional lights during capture and use more ambient lighting.

@mkkellogg
Owner

@gonzalle Would you be willing to share your model? Maybe I can do some troubleshooting on my end.

@gonzalle

@gonzalle Would you be willing to share your model? Maybe I can do some troubleshooting on my end.

Certainly, I just have to train my model again...
I hope it'll be fast... give me a couple of hours :))

@gonzalle

gonzalle commented Mar 28, 2024

aa_weird.zip
Mark,
Here is the file... As I had to re-train and edit it, I noticed that nerfstudio (at low res) and SuperSplat both show the same issue.
For me, it seems to have something to do with the big white splat behind the fabric texture and some kind of "mip mapping" issue...

You'll have to turn the model around to display the good face, then move further away from it to see the issue.

@mkkellogg
Owner

So I have also noticed this issue, and I don't know exactly what is happening, but it does seem like it's caused by splats that are much bigger than the other splats in the scene. I added a super hacky viewer parameter called focalAdjustment that I have found helps. Try something like this when you run the viewer:

const viewer = new GaussianSplats3D.Viewer({
  'focalAdjustment': 3.0
});

The default value for focalAdjustment is 1.0; increasing it seems to make these kinds of artifacts less noticeable. I want to reiterate that this is a big ol' hack, and I don't know exactly why it helps :)

@gonzalle

Great, I will try that. Thanks a lot!
Gaussian Splatting, as a global technique and as it stands today, still has some issues. In any case, I firmly believe that such a technique, combined with a good web viewer, is a fantastic step forward! Keep up the fantastic job!
