
How to Reproduce Training with Auto-Encoder? #20

@haok1402


Hi there!

Could you please provide instructions on how to reproduce the auto-encoder training?

When I try to run the code with

python3 big_sweep_experiments.py

it fails immediately on an import of a module that does not exist in the repository:

Traceback (most recent call last): 
  File "/home/haok/sparse_coding/big_sweep_experiments.py", line 13, in <module>
    from autoencoders.direct_coef_search import DirectCoefOptimizer
ModuleNotFoundError: No module named 'autoencoders.direct_coef_search'
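
For anyone else hitting this, a minimal workaround is to guard the import; this is an untested sketch, assuming DirectCoefOptimizer is only needed by sweeps that actually use it:

try:
    from autoencoders.direct_coef_search import DirectCoefOptimizer
except ImportError:
    # module not present in the repo; only needed for direct-coefficient sweeps
    DirectCoefOptimizer = None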

With that import removed (or guarded as above), the next failure is in the config:

Traceback (most recent call last):
  File "/home/haok/sparse_coding/big_sweep_experiments.py", line 1280, in <module>
    run_pythia_1_4_b_sweep()
  File "/home/haok/sparse_coding/big_sweep_experiments.py", line 907, in run_pythia_1_4_b_sweep
    sweep(pythia_1_4_b_dict, cfg)
  File "/home/haok/sparse_coding/big_sweep.py", line 351, in sweep
    if cfg.n_repetitions is not None:
AttributeError: 'EnsembleArgs' object has no attribute 'n_repetitions'

I patched that check on n_repetitions to:

if hasattr(cfg, "n_repetitions") and cfg.n_repetitions is not None:
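
An equivalent and slightly tidier guard, if you prefer, is getattr with a default; the behavior should be identical:

if getattr(cfg, "n_repetitions", None) is not None: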

With that patched, another issue arises:

Traceback (most recent call last):
  File "/home/haok/sparse_coding/big_sweep_experiments.py", line 1280, in <module>
    run_pythia_1_4_b_sweep()
  File "/home/haok/sparse_coding/big_sweep_experiments.py", line 907, in run_pythia_1_4_b_sweep
    sweep(pythia_1_4_b_dict, cfg)
  File "/home/haok/sparse_coding/big_sweep.py", line 360, in sweep
    if cfg.center_activations:
AttributeError: 'EnsembleArgs' object has no attribute 'center_activations'

I likewise patched the check on center_activations to:

if hasattr(cfg, "center_activations") and cfg.center_activations:
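
Alternatively, and this is only a sketch assuming EnsembleArgs allows plain attribute assignment, both missing attributes could be given defaults on the config inside run_pythia_1_4_b_sweep instead of patching big_sweep.py:

# hypothetical defaults, set before calling sweep(pythia_1_4_b_dict, cfg)
cfg.n_repetitions = None
cfg.center_activations = False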

With both checks patched, the run reaches the training loop, where a KeyError occurs:

  File "/home/haok/sparse_coding/big_sweep.py", line 169, in ensemble_train_loop
    losses, aux_buffer = ensemble.step_batch(batch)
  File "/home/haok/sparse_coding/autoencoders/ensemble.py", line 180, in step_batch
    grads, (loss, aux) = self.calc_grads(self.params, self.buffers, minibatches)
  File "/home/haok/miniconda3/envs/ICLR_6054/lib/python3.10/site-packages/torch/_functorch/vmap.py", line 434, in wrapped                                                                         
    return _flat_vmap(
  File "/home/haok/miniconda3/envs/ICLR_6054/lib/python3.10/site-packages/torch/_functorch/vmap.py", line 39, in fn                                                                               
    return f(*args, **kwargs)
  File "/home/haok/miniconda3/envs/ICLR_6054/lib/python3.10/site-packages/torch/_functorch/vmap.py", line 619, in _flat_vmap
    batched_outputs = func(*batched_inputs, **kwargs)
  File "/home/haok/sparse_coding/autoencoders/ensemble.py", line 120, in calc_grads
    return torch.func.grad(self.sig.loss, has_aux=True)(params, buffers, batch)
  File "/home/haok/miniconda3/envs/ICLR_6054/lib/python3.10/site-packages/torch/_functorch/eager_transforms.py", line 1380, in wrapper
    results = grad_and_value(func, argnums, has_aux=has_aux)(*args, **kwargs)
  File "/home/haok/miniconda3/envs/ICLR_6054/lib/python3.10/site-packages/torch/_functorch/vmap.py", line 39, in fn
    return f(*args, **kwargs)
  File "/home/haok/miniconda3/envs/ICLR_6054/lib/python3.10/site-packages/torch/_functorch/eager_transforms.py", line 1245, in wrapper
    output = func(*args, **kwargs)
  File "/home/haok/sparse_coding/autoencoders/sae_ensemble.py", line 150, in loss
    l_bias_decay = buffers["bias_decay"] * torch.norm(params["encoder_bias"], 2)
KeyError: 'bias_decay'

When I inspect the keys inside buffers, none of them matches bias_decay:

dict_keys(['center_rot', 'center_trans', 'center_scale', 'l1_alpha'])
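
As an untested stopgap, assuming buffers is a plain dict here and that a missing bias_decay should simply disable the penalty, the loss in sae_ensemble.py could default the term to zero:

# sketch: treat an absent bias_decay buffer as a zero coefficient
l_bias_decay = buffers.get("bias_decay", 0.0) * torch.norm(params["encoder_bias"], 2)

But presumably bias_decay was meant to be registered alongside l1_alpha when the buffers are built. Could you clarify what value it should take, and how the sweep is intended to be run end to end?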
