fixed bounds scaling for deepLS reconstruction#27

Closed
lfreinberger wants to merge 3 commits into mkofler96:main from lfreinberger:main

Conversation

@lfreinberger
Contributor

No description provided.

@deepsource-io

deepsource-io bot commented Feb 10, 2026

Here's the code health analysis summary for commits 0786f4f..82ea298. View details on DeepSource ↗.

Analysis Summary

Analyzer   Status      Summary                      Link
Python     ✅ Success  🎯 2 occurrences resolved    View Check ↗

💡 If you’re a repository administrator, you can configure the quality gates from the settings.

@mkofler96
Owner

mkofler96 commented Feb 10, 2026

What about using a Newton-type method (L-BFGS):

import torch
from torch.utils.data import DataLoader, TensorDataset
from tqdm import trange

# SDFBase, SampledSDF, and ClampedL1Loss are assumed importable from this repo.

def reconstruct_from_samples_newton(
    sdf: SDFBase,
    sdfSample: SampledSDF,
    num_iterations=200,          # L-BFGS needs far fewer steps than Adam
    lr=1.0,                      # this is a step size, not an Adam lr
    loss_fn="ClampedL1",
    batch_size=None,             # full batch is strongly recommended
):

    gt_dist = sdfSample.distances

    if loss_fn == "L1":
        Loss = torch.nn.L1Loss()
    elif loss_fn == "ClampedL1":
        Loss = ClampedL1Loss(clamp_val=0.1)
    elif loss_fn == "MSE":
        Loss = torch.nn.MSELoss()
    else:
        raise NotImplementedError(f"Loss function {loss_fn} not available.")

    dataset = TensorDataset(sdfSample.samples, gt_dist)

    # Full batch by default: L-BFGS assumes a deterministic objective,
    # so shuffled mini-batches degrade its curvature estimates.
    if batch_size is None:
        dataloader = DataLoader(dataset, batch_size=len(dataset), shuffle=False)
    else:
        dataloader = DataLoader(dataset, batch_size=batch_size, shuffle=True)

    optimizer = torch.optim.LBFGS(
        sdf.parameters(),
        lr=lr,
        max_iter=20,              # inner L-BFGS iterations per optimizer.step()
        history_size=100,         # curvature memory
        line_search_fn="strong_wolfe",
    )

    loss_history = []
    pbar = trange(num_iterations, desc="Reconstructing SDF (L-BFGS)", leave=True)

    for _ in pbar:
        for queries, gt_batch in dataloader:

            # torch.optim.LBFGS may re-evaluate the objective several times
            # per step (e.g. during the strong-Wolfe line search), so it
            # requires a closure that recomputes the loss and gradients.
            def closure():
                optimizer.zero_grad()
                pred_dist = sdf(queries)
                loss = Loss(pred_dist, gt_batch)
                loss.backward()
                return loss

            loss = optimizer.step(closure)
            loss_val = loss.item()

            pbar.set_postfix({"loss": f"{loss_val:.5f}"})
            loss_history.append(loss_val)

    return list(sdf.parameters())
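
For reference, ClampedL1Loss is not shown in the snippet; a minimal DeepSDF-style sketch (the clamp_val default and the clamp-both-sides behavior are assumptions here, not necessarily this repo's actual implementation) could look like:

import torch

class ClampedL1Loss(torch.nn.Module):
    # L1 loss on SDF values clamped to [-clamp_val, clamp_val], as in
    # DeepSDF; this concentrates the fit on the near-surface region.
    def __init__(self, clamp_val=0.1):
        super().__init__()
        self.clamp_val = clamp_val

    def forward(self, pred, target):
        pred = torch.clamp(pred, -self.clamp_val, self.clamp_val)
        target = torch.clamp(target, -self.clamp_val, self.clamp_val)
        return torch.nn.functional.l1_loss(pred, target)

And a hypothetical call site, assuming sdf and sdfSample are the SDFBase and SampledSDF instances used elsewhere in the reconstruction pipeline:

params = reconstruct_from_samples_newton(
    sdf,
    sdfSample,
    num_iterations=50,   # outer epochs; each step runs up to max_iter=20 inner iterations
    loss_fn="ClampedL1",
)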

@mkofler96 mkofler96 self-requested a review February 10, 2026 13:07
@mkofler96 mkofler96 closed this Feb 11, 2026
