Colab Regression Example No Longer Working? #13

Closed
windowshopr opened this issue Jul 24, 2022 · 2 comments

@windowshopr

Trying to run the Colab Regression notebook. All dependencies get installed, and I use Restart and Run All to start the code. It errors out here:

##Step 1
##Run Data setup -> Infer Schema, find anomalies, create profile and show viz
tfa.step_data_explore(viz=False)
Data: Pipeline execution started...
WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING:apache_beam.io.tfrecordio:Couldn't find python-snappy so the implementation of _TFRecordUtil._masked_crc32c is not as fast as it could be.
WARNING:apache_beam.io.tfrecordio:Couldn't find python-snappy so the implementation of _TFRecordUtil._masked_crc32c is not as fast as it could be.
ERROR:absl:Execution 2 failed.
---------------------------------------------------------------------------
TypeCheckError                            Traceback (most recent call last)
[<ipython-input-6-7e17a616f197>](https://localhost:8080/#) in <module>
      1 ##Step 1
      2 ##Run Data setup -> Infer Schema, find anomalies, create profile and show viz
----> 3 tfa.step_data_explore(viz=False)

14 frames
[/usr/local/lib/python3.7/dist-packages/auto_tensorflow/tfa.py](https://localhost:8080/#) in step_data_explore(self, viz)
   1216     Viz: (False) Is data visualization required ?
   1217     '''
-> 1218     self.pipeline = self.tfadata.run_initial(self._train_data_path, self._test_data_path, self._tfx_root, self._metadata_db_root, self.tfautils, viz)
   1219     self.generate_config_json()
   1220 

[/usr/local/lib/python3.7/dist-packages/auto_tensorflow/tfa.py](https://localhost:8080/#) in run_initial(self, _train_data_path, _test_data_path, _tfx_root, _metadata_db_root, tfautils, viz)
    211     #Run data pipeline
    212     print("Data: Pipeline execution started...")
--> 213     LocalDagRunner().run(self.pipeline)
    214     self._run = True
    215 

[/usr/local/lib/python3.7/dist-packages/tfx/orchestration/portable/tfx_runner.py](https://localhost:8080/#) in run(self, pipeline)
     76     c = compiler.Compiler()
     77     pipeline_pb = c.compile(pipeline)
---> 78     return self.run_with_ir(pipeline_pb)

[/usr/local/lib/python3.7/dist-packages/tfx/orchestration/local/local_dag_runner.py](https://localhost:8080/#) in run_with_ir(self, pipeline)
     85           with metadata.Metadata(connection_config) as mlmd_handle:
     86             partial_run_utils.snapshot(mlmd_handle, pipeline)
---> 87         component_launcher.launch()
     88         logging.info('Component %s is finished.', node_id)

[/usr/local/lib/python3.7/dist-packages/tfx/orchestration/portable/launcher.py](https://localhost:8080/#) in launch(self)
    543               executor_watcher.address)
    544           executor_watcher.start()
--> 545         executor_output = self._run_executor(execution_info)
    546       except Exception as e:  # pylint: disable=broad-except
    547         execution_output = (

[/usr/local/lib/python3.7/dist-packages/tfx/orchestration/portable/launcher.py](https://localhost:8080/#) in _run_executor(self, execution_info)
    418     outputs_utils.make_output_dirs(execution_info.output_dict)
    419     try:
--> 420       executor_output = self._executor_operator.run_executor(execution_info)
    421       code = executor_output.execution_result.code
    422       if code != 0:

[/usr/local/lib/python3.7/dist-packages/tfx/orchestration/portable/beam_executor_operator.py](https://localhost:8080/#) in run_executor(self, execution_info, make_beam_pipeline_fn)
     96         make_beam_pipeline_fn=make_beam_pipeline_fn)
     97     executor = self._executor_cls(context=context)
---> 98     return python_executor_operator.run_with_executor(execution_info, executor)

[/usr/local/lib/python3.7/dist-packages/tfx/orchestration/portable/python_executor_operator.py](https://localhost:8080/#) in run_with_executor(execution_info, executor)
     57   output_dict = copy.deepcopy(execution_info.output_dict)
     58   result = executor.Do(execution_info.input_dict, output_dict,
---> 59                        execution_info.exec_properties)
     60   if not result:
     61     # If result is not returned from the Do function, then try to

[/usr/local/lib/python3.7/dist-packages/tfx/components/statistics_gen/executor.py](https://localhost:8080/#) in Do(self, input_dict, output_dict, exec_properties)
    138             stats_api.GenerateStatistics(stats_options)
    139             | 'WriteStatsOutput[%s]' % split >>
--> 140             stats_api.WriteStatisticsToBinaryFile(output_path))
    141         logging.info('Statistics for split %s written to %s.', split,
    142                      output_uri)

[/usr/local/lib/python3.7/dist-packages/apache_beam/pvalue.py](https://localhost:8080/#) in __or__(self, ptransform)
    135 
    136   def __or__(self, ptransform):
--> 137     return self.pipeline.apply(ptransform, self)
    138 
    139 

[/usr/local/lib/python3.7/dist-packages/apache_beam/pipeline.py](https://localhost:8080/#) in apply(self, transform, pvalueish, label)
    651     if isinstance(transform, ptransform._NamedPTransform):
    652       return self.apply(
--> 653           transform.transform, pvalueish, label or transform.label)
    654 
    655     if not isinstance(transform, ptransform.PTransform):

[/usr/local/lib/python3.7/dist-packages/apache_beam/pipeline.py](https://localhost:8080/#) in apply(self, transform, pvalueish, label)
    661       old_label, transform.label = transform.label, label
    662       try:
--> 663         return self.apply(transform, pvalueish)
    664       finally:
    665         transform.label = old_label

[/usr/local/lib/python3.7/dist-packages/apache_beam/pipeline.py](https://localhost:8080/#) in apply(self, transform, pvalueish, label)
    710 
    711       if type_options is not None and type_options.pipeline_type_check:
--> 712         transform.type_check_outputs(pvalueish_result)
    713 
    714       for tag, result in ptransform.get_named_nested_pvalues(pvalueish_result):

[/usr/local/lib/python3.7/dist-packages/apache_beam/transforms/ptransform.py](https://localhost:8080/#) in type_check_outputs(self, pvalueish)
    464 
    465   def type_check_outputs(self, pvalueish):
--> 466     self.type_check_inputs_or_outputs(pvalueish, 'output')
    467 
    468   def type_check_inputs_or_outputs(self, pvalueish, input_or_output):

[/usr/local/lib/python3.7/dist-packages/apache_beam/transforms/ptransform.py](https://localhost:8080/#) in type_check_inputs_or_outputs(self, pvalueish, input_or_output)
    495                 hint=hint,
    496                 actual_type=pvalue_.element_type,
--> 497                 debug_str=type_hints.debug_str()))
    498 
    499   def _infer_output_coder(self, input_type=None, input_coder=None):

TypeCheckError: Output type hint violation at WriteStatsOutput[train]: expected <class 'apache_beam.pvalue.PDone'>, got <class 'str'>
Full type hint:
IOTypeHints[inputs=((<class 'tensorflow_metadata.proto.v0.statistics_pb2.DatasetFeatureStatisticsList'>,), {}), outputs=((<class 'apache_beam.pvalue.PDone'>,), {})]
File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/usr/local/lib/python3.7/dist-packages/tensorflow_data_validation/api/stats_api.py", line 113, in <module>
    class WriteStatisticsToBinaryFile(beam.PTransform):
File "/usr/local/lib/python3.7/dist-packages/apache_beam/typehints/decorators.py", line 776, in annotate_input_types
    *converted_positional_hints, **converted_keyword_hints)

based on:
  IOTypeHints[inputs=None, outputs=((<class 'apache_beam.pvalue.PDone'>,), {})]
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/usr/local/lib/python3.7/dist-packages/tensorflow_data_validation/api/stats_api.py", line 113, in <module>
      class WriteStatisticsToBinaryFile(beam.PTransform):
  File "/usr/local/lib/python3.7/dist-packages/apache_beam/typehints/decorators.py", line 863, in annotate_output_types
      f._type_hints = th.with_output_types(return_type_hint)  # pylint: disable=protected-access
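
The key line is the TypeCheckError itself: Beam's pipeline type checker compares the output hint declared on TFDV's WriteStatisticsToBinaryFile (beam.pvalue.PDone) against the element type the transform actually produced (str) and rejects the mismatch. A minimal sketch of that pattern, purely illustrative and not the actual TFDV code, looks like this:

```python
import apache_beam as beam

# Illustrative only, not the TFDV implementation: a transform that declares
# PDone as its output but whose expand() returns a PCollection of strings
# trips Beam's default pipeline type check with the same
# "expected PDone, got str" TypeCheckError shown above.
@beam.typehints.with_output_types(beam.pvalue.PDone)
class DeclaresPDone(beam.PTransform):
    def expand(self, pcoll):
        # Declared output is PDone, but this yields PCollection[str].
        return pcoll | "ToStr" >> beam.Map(str).with_output_types(str)


with beam.Pipeline() as p:
    _ = p | beam.Create([1, 2, 3]) | "Write" >> DeclaresPDone()  # raises TypeCheckError
```

Downgrading Apache Beam, as suggested in the comment below, sidesteps the mismatch.
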
@rafiqhasan (Owner)

Acknowledged! WIP

@rafiqhasan (Owner) commented Sep 17, 2022

The fix is to downgrade Apache Beam to version 2.34.0 after installing TFA. I'll make a fix and release it.

pip install apache-beam==2.34.0
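
For reference, the workaround as a Colab setup cell might look like the sketch below (the auto-tensorflow package name and install order are assumptions, not taken from the notebook):

```python
# Hypothetical Colab setup cell: install auto-tensorflow, then pin Apache Beam
# to 2.34.0 before restarting the runtime with Restart and Run All.
!pip install auto-tensorflow
!pip install apache-beam==2.34.0

# After the restart, confirm the pin took effect before running Step 1.
import apache_beam as beam
print(beam.__version__)  # expect 2.34.0
```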
