Specify an explicit external encoding for tests #392

Merged
merged 1 commit into erikhuda:master from TimMoore:spec-encoding on Feb 13, 2017

Conversation

TimMoore (Contributor) commented on Dec 27, 2013

On my system, I have set my LC_ALL environment variable to C, because it speeds up a lot of common Unix commands when you're dealing with ASCII-only input.

This causes Ruby to use US-ASCII as the default external encoding, which triggers a failure in the Thor specs:

Failures:

  1) Thor::Shell::Basic#print_table uses maximum terminal width
     Failure/Error: expect(content).to eq(<<-TABLE)
     ArgumentError:
       invalid byte sequence in US-ASCII
     # ./spec/shell/basic_spec.rb:208:in `block (3 levels) in <top (required)>'
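For reference, the locale dependence is easy to observe; this is a quick illustration, not part of the original report, and the script file name is illustrative:

    # Ruby derives the default external encoding from the locale, so the
    # same one-line script reports a different encoding per environment:
    #
    #   $ LC_ALL=C ruby encoding_check.rb            #=> US-ASCII
    #   $ LC_ALL=en_US.UTF-8 ruby encoding_check.rb  #=> UTF-8
    puts Encoding.default_external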

Changing Encoding.default_external globally in the spec helper is the most straightforward fix, but there are several other possible approaches, most notably specifying an explicit encoding for the StringIO object created in the capture helper. Let me know if you'd like me to resubmit with a different solution.
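A minimal sketch of both options, assuming UTF-8 as the target encoding; the $VERBOSE guard and the helper name are illustrative rather than taken from the PR diff:

    require "stringio"

    # Option 1 (the approach this PR takes): pin the external encoding for
    # the whole test run, regardless of the user's locale. Reassigning
    # Encoding.default_external warns under `ruby -w`, so silence $VERBOSE
    # around the assignment.
    original_verbose, $VERBOSE = $VERBOSE, nil
    Encoding.default_external = Encoding::UTF_8
    $VERBOSE = original_verbose

    # Option 2 (the alternative mentioned above; the helper name is
    # illustrative, not Thor's actual capture helper): give only the
    # captured buffer an explicit encoding instead of changing the
    # process-wide default.
    def capture_stdout
      original, $stdout = $stdout, StringIO.new(String.new(encoding: Encoding::UTF_8))
      yield
      $stdout.string
    ensure
      $stdout = original
    end

The global assignment is the simpler change but affects every example in the suite; the per-buffer encoding is more surgical but has to be applied wherever output is captured.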

Specify an explicit external encoding for tests.
This fixes a failure in the spec at spec/shell/basic_spec.rb:208
(Thor::Shell::Basic#print_table uses maximum terminal width) when running on a
system with a default encoding that is not compatible with Unicode.

Coverage Status: Coverage remained the same when pulling ed641ab on TimMoore:spec-encoding into 62593e0 on erikhuda:master.

@rafaelfranca merged commit e675416 into erikhuda:master on Feb 13, 2017

1 check passed

default: The Travis CI build passed