"Known failing tests" on Windows #232

Closed
nanis opened this Issue Feb 14, 2017 · 12 comments

4 participants
@nanis
Contributor

nanis commented Feb 14, 2017

This is more of an FYI than a bug report.

I built everything from scratch today (yes, these are development snapshot builds). Only two tests fail:

$  perl t\harness5 t\spec\S02-literals\quoting.t --verbosity=3
...
not ok 126 - Testing for q:x operator. (utf8)
# Failed test 'Testing for q:x operator. (utf8)'
# at t\spec\S02-literals\quoting.t line 401
# expected: '一
# '
#      got: '?
# '

and

$ perl t\harness5 t\spec\S19-command-line\dash-e.t --verbosity=3
...
not ok 2 - -e print $something works with non-ASCII string literals

# Failed test '-e print $something works with non-ASCII string literals'
# at t\spec\packages\Test\Util.pm (Test::Util) line 59
#      got out: "?"
# expected out: "ȧ"

The reason for both failures is simple: perl does not handle Unicode arguments on the command line well (same situation as we had with MoarVM prior to my patch, but fixing perl will take some more effort on my part).
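The mangling is easy to reproduce outside the harness. A minimal sketch in Python (illustrative only, not the harness code; cp1252 stands in for whatever legacy code page the command-line layer uses): forcing "一" through a single-byte code page replaces it, which is exactly the "?" in the failures above.

```python
# Illustrative sketch, not the harness code: pushing a string through a
# layer limited to a legacy single-byte code page (cp1252 here) replaces
# any character that code page cannot represent.
s = "一"  # U+4E00, the character used in quoting.t
mangled = s.encode("cp1252", errors="replace").decode("cp1252")
print(mangled)  # → ?
```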

I figured using harness6 might work, but that doesn't seem to be fully functional.

Anyway, if used with a Unicode round-trip-safe harness, neither test would fail. For example, consider this simple script, based on the failing test in quoting.t above:

$ type check-qx.pl6
use v6.c;
{
    # 一 means "One" in Chinese.
    my $x = "一";
    say $x;
    my $y = qqx{echo $x};
    say $y;
}
$ perl6 check-qx.pl6
一
一
$ perl6 check-qx.pl6 | xxd
00000000: e4b8 800d 0ae4 b880 0d0a 0d0a            ............
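Decoding the dump confirms the round-trip; here is a quick check (Python used only as a convenient decoder, assuming the bytes are exactly as shown):

```python
# The dump is "一" (UTF-8: e4 b8 80) followed by CRLF, twice, plus a
# trailing CRLF: the captured qqx output carries its own newline.
data = bytes.fromhex("e4b8800d0ae4b8800d0a0d0a")
text = data.decode("utf-8")
print(text.split("\r\n"))  # → ['一', '一', '', '']
```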

HTH.

@nanis nanis changed the title from "nown failing tests" on Windows to "Known failing tests" on Windows Feb 14, 2017

@zoffixznet

Member

zoffixznet commented Feb 14, 2017

Is that 6.c-errata branch (also shipped with releases) or master?

Wonder why it would matter for the test whether perl can handle Unicode arguments.

There is also Issue #197 with many failures on Windows 10 stresstest

@nanis

Contributor

nanis commented Feb 14, 2017

Wonder why it would matter for the test whether perl can handle Unicode arguments.

Doesn't perl5 construct the command line and pass it to moar for the tests?

There is also Issue #197 with many failures on Windows 10 stresstest

Yeah, I don't know what's going on there ... I never had as many test failures.

@nanis

Contributor

nanis commented Feb 14, 2017

Also, I might be confused about how the tests are run. The main point is that, for me, all but two tests passed, and there is no reason why they should not have.

@zoffixznet

Member

zoffixznet commented Feb 14, 2017

Doesn't the perl5 construct the command line and pass it to moar for tests?

I sincerely hope not :)

Perl 5 just runs the harness that collects the output of the tests... but the tests themselves are run by Perl 6. The t/harness6 uses a Perl 6 harness for that process, and yes, it's still experimental and imperfect.

The S02-literals\quoting.t passes for me if I call chcp 65001 before the test:

perl6 -MTest -e "qx/chcp 65001/; is q:x/echo 一/, qq|一\n|"

The second one might be failing for the same reason.
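The effect of chcp 65001 can be mimicked off-Windows: the console code page decides how the bytes printed by a child process get interpreted. A hedged sketch (assumptions: the child emits UTF-8, and cp437 stands in for a typical legacy console code page):

```python
# The console code page decides how the child's output bytes are read.
out = "一".encode("utf-8")    # bytes echoed by the child process
print(out.decode("utf-8"))   # code page 65001 (UTF-8): round-trips cleanly
print(out.decode("cp437"))   # legacy code page: three mojibake characters
```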

Yeah, I don't know what's going on there ... I never had as many test failures.

There could be three things: (1) you're running the 6.c-errata tests, whereas that Issue ran them for master, which includes many more new tests; (2) maybe you're running make spectest, which runs ~56K top-level tests, while I ran make stresstest, which runs ~132K top-level tests; (3) I ran it before your work to make Unicode args work, so any test that relied on that was busted.

Also, I might be confused about how the tests are run.

To run individual test files on Linux, I run make t/spec/some-file.t. For some reason that doesn't work on Windows with gmake, but you can also run perl t/fudgeandrun t/spec/some-file.t

There's spectest, with ~56K tests that provide good enough coverage; you can run it with make spectest. And there's stresstest, which runs all the tests; that's run with make stresstest.

@MattOates

MattOates commented Feb 14, 2017

@nanis do you still have the link, or a copy, of the build instructions you're following? I'm struggling to join the Windows build effort. I suspect they've now been removed from rakudo.org because they weren't working? I build near enough daily on OSX. I noticed that Ubuntu for Windows 10 is also failing to build, pre your commits to MoarVM though. I'll maybe take a look at that if no one else is interested.

@zoffixznet

Member

zoffixznet commented Feb 14, 2017

pre your commits to MoarVM though

Side note: MoarVM/NQP version bumps haven't been done yet since that commit went into MoarVM. I suspect someone will do them tomorrow, though.

@nanis

Contributor

nanis commented Feb 14, 2017

@zoffixznet

perl6 -MTest -e "qx/chcp 65001/; is q:x/echo 一/, qq|一\n|"

A-ha! So the main point I was trying to make stands: the test failures are not indicative of a failure of the layers involved in perl6. I was just confused about the exact mechanism.

And thanks for pointing out the difference between spec tests and stress tests. I don't think I have ever run the stress tests.

@jnthn (sigh, sorry) @MattOates I am just cloning each repo (MoarVM, NQP, rakudo) and building and installing them with the same --prefix, in that order. I am not sure how to get to Rakudo Star from that state. I used to try to do that with panda install Task::Star, but I think Task::Star is not recommended or something.

Also, I just ran into a problem and I don't know whether this looks like a problem with the binaries or the module:

$ zef install Archive::SimpleZip
===> Searching for: Archive::SimpleZip
===> Searching for missing dependencies: Compress::Zlib::Raw, Compress::Zlib, Compress::Bzip2, CompUnit::Util, IO::Blob, Test::META, File::Temp, File::Which
===> Searching for missing dependencies: Compress::Bzip2::Raw, META6, File::Directory::Tree
===> Searching for missing dependencies: JSON::Class, JSON::Fast
===> Searching for missing dependencies: JSON::Marshal, JSON::Unmarshal
===> Searching for missing dependencies: JSON::Name
===> Testing: Compress::Zlib::Raw:ver('1.0.1'):auth('github:retupmoca')
t\01-basic.t .. ok
All tests successful.
Files=1, Tests=7,  6 wallclock secs
Result: PASS
===> Testing [OK] for Compress::Zlib::Raw:ver('1.0.1'):auth('github:retupmoca')
===> Testing: Compress::Zlib:ver('1.0.0'):auth('github:retupmoca')
t\01-basic.t ... ok
t\02-stream.t .. ok
t\03-wrap.t .... ok
All tests successful.
Files=3, Tests=18,  4 wallclock secs
Result: PASS
===> Testing [OK] for Compress::Zlib:ver('1.0.0'):auth('github:retupmoca')
===> Testing: Compress::Bzip2::Raw:ver('0.2.0'):auth('github:Altai-man')
Cannot locate symbol 'fopen' in native library ''
  in method setup at C:\opt\perl6\share\perl6\sources\24DD121B5B4774C04A7084827BFAD92199756E03 (NativeCall) line 310
  in method CALL-ME at C:\opt\perl6\share\perl6\sources\24DD121B5B4774C04A7084827BFAD92199756E03 (NativeCall) line 322
  in block <unit> at t\01_sanity.t line 14

It can't find fopen ... Does this look like it is related to MoarVM or something else? Just a pointer in the direction you think would be most fruitful might help me avoid some unnecessary work.

@nanis

Contributor

nanis commented Feb 15, 2017

It looks like one needs to specify the name of the library containing fopen. I am not sure why the code in the module is like this:

our sub fopen(Str $filename, Str $mode) returns OpaquePointer is native(Str) is export { * }
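A rough analogue of that declaration in Python's ctypes (an analogy, not the module's code): is native(Str) with no library name asks for a symbol among those already loaded into the process, much like CDLL(None) does on POSIX; Windows has no equivalent "null library" handle, which would explain the failed lookup in the error above.

```python
import ctypes

# Analogy only: like `is native(Str)` with no library name, CDLL(None)
# searches the symbols already loaded into the process (including the C
# runtime) on POSIX. On Windows, CDLL(None) raises instead, mirroring
# the "Cannot locate symbol 'fopen' in native library ''" error.
libc = ctypes.CDLL(None)
print(hasattr(libc, "fopen"))  # → True on POSIX
```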
@nanis

Contributor

nanis commented Feb 15, 2017

@MattOates

I suspect they've now been removed from rakudo.org because they weren't working?

The instructions I used to use were Manual installation:

curl -O http://rakudo.org/downloads/star/rakudo-star-2016.04.tar.gz
tar -xvzf rakudo-star-2016.04.tar.gz
cd rakudo-star-2016.04/

perl Configure.pl --backend=moar --gen-moar
nmake

nmake rakudo-test
nmake rakudo-spectest

nmake install
@jonathanstowe

Contributor

jonathanstowe commented Feb 15, 2017

@nanis

The declaration of fopen is like that because it is expected to be in the already loaded C runtime library.

I think you want to file a report against Compress::Zlib

@nanis

Contributor

nanis commented Feb 15, 2017

This is an FYI, not a bug report. Following @zoffixznet's tip, I decided to try the stress tests. Before building each component, I did an nmake distclean and git pull on master/nom branches for each repo.

I am encouraged by how few failures I observed (and by the fact that I think a lot of the failures are related so fixing one may fix many).

MoarVM: 542baec756304399eb7638feeb0650c292f94bae
NQP: ac9a66abad13a7fd3d52b22ba533a9f8436ad1ac
Rakudo: b51a5505a6b766e6be1861f4f6d08a0986b1a7ac

Operating System: 64-bit Windows 10 Pro

$ ver
Microsoft Windows [Version 10.0.14393]

Built 64-bit binary using MSVS 2015 tools with additional
CFLAGS=-favor:INTEL64 -Qpar -Oi just for kicks ;-)

$ cl
Microsoft (R) C/C++ Optimizing Compiler Version 19.00.24215.1 for x64

$ nmake
Microsoft (R) Program Maintenance Utility Version 14.00.24210.0

After building and installing them all in c:\opt\perl6, I ran nmake stresstest in Rakudo's source directory.

t\spec\S32-io\IO-Socket-INET.t got stuck at test 49 and I had to manually terminate that test.

Here are the failures I got:

Test Summary Report
-------------------
t\spec\S02-literals\quoting.t                                   (Wstat: 256 Tests: 189 Failed: 1)
  Failed test:  126
  Non-zero exit status: 1
t\spec\S10-packages\precompilation.rakudo.moar                  (Wstat: 1024 Tests: 51 Failed: 4)
  Failed tests:  45-48
  Non-zero exit status: 4
t\spec\S16-filehandles\filetest.rakudo.moar                     (Wstat: 65280 Tests: 43 Failed: 0)
  Non-zero exit status: 255
  Parse errors: Bad plan.  You planned 44 tests but ran 43.
t\spec\S17-procasync\basic.rakudo.moar                          (Wstat: 256 Tests: 34 Failed: 1)
  Failed test:  33
  Non-zero exit status: 1
t\spec\S17-procasync\encoding.t                                 (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: Bad plan.  You planned 13 tests but ran 0.
t\spec\S17-procasync\kill.rakudo.moar                           (Wstat: 65280 Tests: 6 Failed: 1)
  Failed test:  2
  Non-zero exit status: 255
  Parse errors: Bad plan.  You planned 9 tests but ran 6.
t\spec\S17-procasync\stress.t                                   (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: Bad plan.  You planned 12 tests but ran 0.
t\spec\S17-supply\interval.t                                    (Wstat: 768 Tests: 8 Failed: 3)
  Failed tests:  5-7
  Non-zero exit status: 3
t\spec\S19-command-line\repl.rakudo.moar                        (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: Bad plan.  You planned 13 tests but ran 0.
t\spec\S29-os\system.rakudo.moar                                (Wstat: 65280 Tests: 32 Failed: 0)
  Non-zero exit status: 255
  Parse errors: Bad plan.  You planned 35 tests but ran 32.
t\spec\S32-io\IO-Socket-INET.t                                  (Wstat: 256 Tests: 50 Failed: 0)
  Non-zero exit status: 1
  Parse errors: Bad plan.  You planned 51 tests but ran 50.
t\spec\S32-io\pipe.t                                            (Wstat: 65280 Tests: 10 Failed: 0)
  Non-zero exit status: 255
  Parse errors: Bad plan.  You planned 14 tests but ran 10.
t\spec\S32-list\pick.t                                          (Wstat: 62464 Tests: 316 Failed: 244)
  Failed tests:  73-316
  Non-zero exit status: 244
Files=1211, Tests=132754, 3289 wallclock secs (20.83 usr +  3.75 sys = 24.58 CPU)
Result: FAIL
NMAKE : fatal error U1077: 'c:\opt\perl\5.24.1\bin\perl.exe' : return code '0x1'
Stop.

THEN I realized I had once again forgotten to issue a chcp 65001 (sigh, yes, again) before running the tests, so I tried the failing tests again:

$ perl t\harness5 t\spec\S02-literals\quoting.t
t\spec\S02-literals\quoting.t .. ok
All tests successful.
Files=1, Tests=189,  3 wallclock secs ( 0.11 usr +  0.03 sys =  0.14 CPU)
Result: PASS

OK, so that was one false alarm.

The rest of the failures remained.

There is the possibility that at least some of these are related to the fact that perl6-m is a batch file.

zoffixznet added a commit that referenced this issue Apr 14, 2017

[io grant] Rewrite .l on broken symlinks test
- de-confuse $target/$name nomenclature
- make work with IO grant changes to symlink
- Catch exceptions on Windows and skip test, as symlink creation
    on Windows cries for admin privs. This addresses the failures
    in this test file reported in #197 and #232

zoffixznet added a commit that referenced this issue Dec 11, 2017

@zoffixznet

Member

zoffixznet commented Dec 12, 2017

Consolidating all the Windows failure issues into one and closing this one in favour of the more recent #320

@zoffixznet zoffixznet closed this Dec 12, 2017
