
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 209: ordinal not in range(128) #26

Closed
nick-barth opened this issue Jan 24, 2017 · 6 comments


@nick-barth

I'd try to dig into this myself, but it seems like you've got extensive experience with this error. I wasn't sure whether I should open a new issue or comment on an older, similar one.

@ownaginatious
Owner

Can you post the full stack trace?

@stephaniegott

stephaniegott commented Jan 25, 2017

Hello! I am also having this problem. Here is the full stack trace I get when I run fbcap ./messages.htm -f csv > messages.csv:

Traceback (most recent call last):
  File "/usr/local/bin/fbcap", line 11, in <module>
    load_entry_point('fbchat-archive-parser==0.8.post16', 'console_scripts', 'fbcap')()
  File "/usr/local/lib/python2.7/dist-packages/fbchat_archive_parser/main.py", line 154, in main
    app.run()
  File "/usr/local/lib/python2.7/dist-packages/clip.py", line 652, in run
    self.invoke(self.parse(tokens))
  File "/usr/local/lib/python2.7/dist-packages/clip.py", line 634, in invoke
    self._main.invoke(parsed)
  File "/usr/local/lib/python2.7/dist-packages/clip.py", line 519, in invoke
    self._callback(**{k: v for k, v in iteritems(parsed) if k not in self._subcommands})
  File "/usr/local/lib/python2.7/dist-packages/fbchat_archive_parser/main.py", line 98, in fbcap
    write(format, fbch, sys.stdout)
  File "/usr/local/lib/python2.7/dist-packages/fbchat_archive_parser/writers/__init__.py", line 18, in write
    item().write(data, stream)
  File "/usr/local/lib/python2.7/dist-packages/fbchat_archive_parser/writers/writer.py", line 17, in write
    return self.write_history(data, stream)
  File "/usr/local/lib/python2.7/dist-packages/fbchat_archive_parser/writers/csv.py", line 43, in write_history
    self.write_thread(history.threads[k], stream, writer=writer)
  File "/usr/local/lib/python2.7/dist-packages/fbchat_archive_parser/writers/csv.py", line 49, in write_thread
    self.write_message(message, stream, thread, writer=writer)
  File "/usr/local/lib/python2.7/dist-packages/fbchat_archive_parser/writers/csv.py", line 71, in write_message
    writer.writerow(self.encode_row(row))
  File "/usr/lib/python2.7/csv.py", line 152, in writerow
    return self.writer.writerow(self._dict_to_list(rowdict))
  File "/usr/local/lib/python2.7/dist-packages/colorama/ansitowin32.py", line 40, in write
    self.__convertor.write(text)
  File "/usr/local/lib/python2.7/dist-packages/colorama/ansitowin32.py", line 141, in write
    self.write_and_convert(text)
  File "/usr/local/lib/python2.7/dist-packages/colorama/ansitowin32.py", line 169, in write_and_convert
    self.write_plain_text(text, cursor, len(text))
  File "/usr/local/lib/python2.7/dist-packages/colorama/ansitowin32.py", line 174, in write_plain_text
    self.wrapped.write(text[start:end])
  File "/usr/lib/python2.7/codecs.py", line 369, in write
    data, consumed = self.encode(object, self.errors)
UnicodeDecodeError: 'ascii' codec can't decode byte 0xf0 in position 158: ordinal not in range(128)
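
The last frames point at Python 2's codecs StreamWriter being handed an already-encoded byte string. Here is a minimal sketch of that failure mode (an assumed cause based on the traceback, not the parser's actual code):

# -*- coding: utf-8 -*-
# Python 2 sketch: a codecs StreamWriter re-encodes whatever it is handed.
# Given a byte string that already contains UTF-8 (0xc3 leads an accented
# letter, 0xf0 leads an emoji), Python 2 first tries to decode it back to
# unicode with the default 'ascii' codec and raises UnicodeDecodeError.
import codecs
import sys

stream = codecs.getwriter('utf-8')(sys.stdout)

stream.write(u'héllo\n')                  # unicode in: the stream encodes it, works
stream.write(u'héllo\n'.encode('utf-8'))  # UTF-8 bytes in: implicit ascii decode fails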

@ownaginatious
Owner

Thanks for posting a stack trace! Does this also happen when you choose an output format other than csv?

@ownaginatious
Owner

Never mind, I was able to reproduce the issue. I'm working on a solution now :)

@ownaginatious
Owner

Please try again with version 0.8.post21 from PyPI and let me know if it works now.
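
For completeness, upgrading and re-running the original command would look like this (assuming a pip-managed install, as the paths in the stack trace suggest):

pip install --upgrade fbchat-archive-parser
fbcap ./messages.htm -f csv > messages.csv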

@stephaniegott

Works great on my end! :)
