This repository has been archived by the owner on Jan 10, 2023. It is now read-only.

TypeError: 'NoneType' object is not iterable #49

Open

odakiese opened this issue May 2, 2020 · 2 comments

Comments


odakiese commented May 2, 2020

After installing all the packages, I get the following error:

compare_gan/gans/modular_gan.py:400 _preprocess_fn  *
    features = {
gin/config.py:407 wrapper  *
    operative_parameter_values = _get_default_configurable_parameter_values(
gin/config.py:738 _get_default_configurable_parameter_values  *
    representable = _is_literally_representable(arg_vals[k])
gin/config.py:537 _is_literally_representable  *
    return _format_value(value) is not None
gin/config.py:520 _format_value  *
    if parse_value(literal) == value:
gin/config.py:1480 parse_value  *
    return config_parser.ConfigParser(value, ParserDelegate()).parse_value()
gin/config_parser.py:250 parse_value  *
    self._raise_syntax_error('Unable to parse value.')
gin/config_parser.py:287 _raise_syntax_error  *
    raise SyntaxError(msg, location)
tensorflow_core/python/autograph/impl/api.py:396 converted_call
    return py_builtins.overload_of(f)(*args)

TypeError: 'NoneType' object is not iterable

a7b23 commented May 8, 2020

+1


ilyakava commented Jun 30, 2020

This is happening on Python 3.6.9 and TF 1.15.0. For some reason, every dataset appears to be empty.

What worked for me to avoid this error is replacing the `train_input_fn` here with:

  def train_input_fn(self, params=None, preprocess_fn=None):
    """Input function for reading data.

    Args:
      params: Python dictionary with parameters. Must contain the key
        "batch_size". TPUEstimator will set this for you!
      preprocess_fn: Function to process single examples. This is allowed to
        have a `seed` argument.

    Returns:
      `tf.data.Dataset` with preprocessed and batched examples.
    """
    if params is None:
      params = {}
    seed = self._get_per_host_random_seed(params.get("context", None))
    logging.info("train_input_fn(): params=%s seed=%s", params, seed)

    ds = self._load_dataset(split=self._train_split)
    # ds = ds.filter(self._train_filter_fn)
    ds = ds.repeat()
    def one_function(image, label):
      images, labels = self._train_transform_fn(image, label, seed=seed)
      features = {
          "images": images,
          "z": np.random.randn(120).astype(np.float32),
      }
      features["sampled_labels"] = labels
      
      return features, labels
    ds = ds.map(one_function)
      
    # ds = ds.map(functools.partial(self._train_transform_fn, seed=seed))
    # if preprocess_fn is not None:
      # if "seed" in inspect.getargspec(preprocess_fn).args:
      #   preprocess_fn = functools.partial(preprocess_fn, seed=seed)
      # ds = ds.map(one_function)
      # Add a feature for the random offset of operations in tpu_random.py.
    ds = tpu_random.add_random_offset_to_features(ds)
    # ds = ds.shuffle(FLAGS.data_shuffle_buffer_size, seed=seed)
    if "batch_size" in params:
      ds = ds.batch(params["batch_size"], drop_remainder=True)
    return ds.prefetch(tf.contrib.data.AUTOTUNE)
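The essential change above is that each `(image, label)` pair is mapped to a `(features, labels)` tuple, where `features` carries the transformed image, a noise vector `"z"`, and `"sampled_labels"`, bypassing the gin-wrapped `preprocess_fn` entirely. A minimal, framework-free sketch of that mapping structure (plain NumPy; `make_features` and its parameters are hypothetical names for illustration, not part of compare_gan):

```python
import numpy as np

def make_features(image, label, z_dim=120, seed=None):
    """Mirror the structure one_function produces: a features dict plus the label.

    The dict holds the image, a fresh Gaussian noise vector "z", and the
    label duplicated under "sampled_labels", as in the workaround above.
    """
    rng = np.random.RandomState(seed)
    features = {
        "images": image,
        "z": rng.randn(z_dim).astype(np.float32),
        "sampled_labels": label,
    }
    return features, label

# Example: one 32x32 RGB image with label 7.
feats, lbl = make_features(np.zeros((32, 32, 3), np.float32), 7, seed=0)
```

Note one caveat with the actual workaround: because `np.random.randn` runs at graph-construction time inside `ds.map`, the same `z` value may be baked into the graph for every example rather than resampled per step.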
