On my system, I have set the LC_ALL environment variable to C, because it speeds up many common Unix commands when dealing with ASCII-only input.
This causes Ruby to use US-ASCII as the default external encoding, which triggers a failure in the Thor specs:
```
1) Thor::Shell::Basic#print_table uses maximum terminal width
   Failure/Error: expect(content).to eq(<<-TABLE)
     invalid byte sequence in US-ASCII
   # ./spec/shell/basic_spec.rb:208:in `block (3 levels) in <top (required)>'
```
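For context, Ruby derives its default external encoding from the process locale, so the difference is easy to confirm from the command line (the script name here is just for illustration):

```ruby
# check_encoding.rb — run once as `LC_ALL=C ruby check_encoding.rb`
# and again as `LC_ALL=en_US.UTF-8 ruby check_encoding.rb`.
puts Encoding.locale_charmap    # the charmap Ruby derived from the locale
puts Encoding.default_external  # US-ASCII under the C locale, UTF-8 under a UTF-8 locale
```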
Changing Encoding.default_external globally in the spec helper is the most straightforward fix, but there are other possible approaches, in particular specifying an explicit encoding for the StringIO object created in the capture helper. Let me know if you'd like me to resubmit with a different solution.
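For illustration, the StringIO alternative could look roughly like this; the helper name and shape are assumptions, not Thor's actual capture helper:

```ruby
require "stringio"

# Hypothetical capture helper: give the StringIO an explicit encoding so
# the captured output no longer depends on Encoding.default_external.
def capture_stdout
  original = $stdout
  $stdout = StringIO.new.tap { |io| io.set_encoding(Encoding::UTF_8) }
  yield
  $stdout.string
ensure
  $stdout = original
end
```

This keeps the encoding change scoped to the captured stream rather than affecting the whole test process.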
Specify an explicit external encoding for tests.
This fixes a failure in the spec at spec/shell/basic_spec.rb:208
(Thor::Shell::Basic#print_table uses maximum terminal width) when running on a
system with a default encoding that is not compatible with Unicode.
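The global approach described in the report would amount to a one-liner along these lines (the exact file and placement are assumptions):

```ruby
# spec/spec_helper.rb
Encoding.default_external = Encoding::UTF_8
```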
Coverage remained the same when pulling ed641ab on TimMoore:spec-encoding into 62593e0 on erikhuda:master.