CurrentCulture in unit tests? #2734

Closed
weltkante opened this issue Jan 17, 2020 · 14 comments
Labels
test-bug: Problem in test source code (most likely); test-enhancement: Improvements of test source code

Comments

@weltkante
Contributor

While updating the PaddingConverter tests in PR #2679, I noticed that the suite contains tests which only succeed in English (or similar) environments that use the comma as the list separator. Some languages use a semicolon instead, because the comma serves as the decimal separator. This leads to spurious test failures on developer machines.

The failing tests are PaddingConverter_ConvertFrom_String_ReturnsExpected (e.g. on this input) as well as PaddingConverter_ConvertTo_String_ReturnsExpected. (I'm only running tests selectively on my machine, so I'd assume there could be more instances of this problem in other tests.)

Do you think tests should run under a fixed CurrentCulture (maybe via some sort of fixture)? It might also be worth checking how corefx solves this problem when testing its converters/parsers.
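
For context, the difference boils down to separator characters. Here is a minimal standalone sketch (not part of the test suite, names are illustrative) showing how the relevant separators differ between en-US and de-DE:

```csharp
// Minimal illustration: the list separator and the decimal separator differ
// between cultures, which is why tests that hard-code strings like "1, 2, 3, 4"
// only pass under English-like cultures.
using System;
using System.Globalization;

internal static class CultureSeparatorDemo
{
    internal static void Main()
    {
        foreach (string name in new[] { "en-US", "de-DE" })
        {
            CultureInfo culture = CultureInfo.GetCultureInfo(name);
            Console.WriteLine(
                $"{name}: list separator '{culture.TextInfo.ListSeparator}', " +
                $"decimal separator '{culture.NumberFormat.NumberDecimalSeparator}'");
        }

        // Typical output:
        //   en-US: list separator ',', decimal separator '.'
        //   de-DE: list separator ';', decimal separator ','
    }
}
```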

@RussKie RussKie added the test-bug and test-enhancement labels Jan 17, 2020
@merriemcgaw merriemcgaw added this to the 5.0 milestone Jan 23, 2020
@RussKie RussKie modified the milestones: 5.0 Previews 1-4, 5.0 Apr 20, 2020
@weltkante
Contributor Author

weltkante commented Apr 22, 2020

Just for the record: there are currently 105 failing tests, and at least 94 of them are due to tests assuming English culture.

This also means you probably have no test coverage against culture-dependent regressions in CI.

If you want an overview of which tests are failing, I have a mostly working branch here. I'm mainly trying to get myself unstuck, because running tests is currently a pain (I can't see whether I'm adding a new test failure when running tests locally). I wouldn't want to PR that branch, though; there are probably better solutions than manually identifying which tests need English culture.

Personally I'd prefer all tests to run under the same culture, and then have two passes over all tests in two representative cultures: one English, and one that is sufficiently different (de-DE and es-ES are candidates that differ in at least three separator tokens commonly used in WinForms). That should catch most generic culture regressions, if you care about that kind of thing. Of course, regressions in a specific culture are only caught by exhaustively testing every language that has its own localized strings, which is probably not worth it for a CI that runs on every PR.

Once you figure out what you actually want, I can do a PR for the code part of the required changes, but someone else has to look at the infrastructure changes, if any are necessary; the arcade CI is a black box to me.

It's probably also advisable to ask the dotnet/runtime folks how they handle culture-dependent tests; they might have a better idea of what makes sense and how to do things.

(PS: I'm in no hurry, just reporting back on what I'm seeing while working on my PRs.)
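
For what it's worth, here is a rough sketch of one way to pin an entire test run to a single culture and switch it per CI pass. The environment variable name and the module-initializer approach are illustrative assumptions, not something the repo does today:

```csharp
using System;
using System.Globalization;
using System.Runtime.CompilerServices;

internal static class TestCultureSetup
{
    // Runs once when the test assembly is loaded, before any test executes.
    [ModuleInitializer]
    internal static void PinTestCulture()
    {
        // Hypothetical switch: a second CI pass could set WINFORMS_TEST_CULTURE=de-DE
        // (or es-ES) to exercise a culture with different separator tokens.
        string cultureName =
            Environment.GetEnvironmentVariable("WINFORMS_TEST_CULTURE") ?? "en-US";
        CultureInfo culture = CultureInfo.GetCultureInfo(cultureName);

        // These defaults apply to threads that haven't been given an explicit
        // culture, which includes the worker threads the test runner creates later.
        CultureInfo.DefaultThreadCurrentCulture = culture;
        CultureInfo.DefaultThreadCurrentUICulture = culture;
    }
}
```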

@RussKie
Member

RussKie commented Apr 30, 2020

I see you're making progress on this issue. A few thoughts on the subject.

I wonder if we could do it declaratively via traits, so we can have attributes similar to NUnit's.
I found this article which looks interesting. If I understood it correctly, we could have a custom attribute, say SetCulture (references: #1, #2), that sets the desired culture on the current thread.

Alternatively, maybe we can add it to ThreadExceptionFixture (we'll need to rename it to have a more generic name)?
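
For reference, the usual xUnit pattern for this is a BeforeAfterTestAttribute subclass along these lines. The name SetCultureAttribute and the exact shape are assumptions based on the article, not code that exists in the repo:

```csharp
using System;
using System.Globalization;
using System.Reflection;
using Xunit.Sdk;

// Sketch of a declarative per-test culture attribute built on xUnit's
// BeforeAfterTestAttribute extension point.
[AttributeUsage(AttributeTargets.Method | AttributeTargets.Class)]
public sealed class SetCultureAttribute : BeforeAfterTestAttribute
{
    private readonly CultureInfo _culture;
    private CultureInfo _originalCulture;
    private CultureInfo _originalUICulture;

    public SetCultureAttribute(string cultureName)
        => _culture = CultureInfo.GetCultureInfo(cultureName);

    public override void Before(MethodInfo methodUnderTest)
    {
        // Remember the ambient culture and swap in the requested one for this test.
        _originalCulture = CultureInfo.CurrentCulture;
        _originalUICulture = CultureInfo.CurrentUICulture;
        CultureInfo.CurrentCulture = _culture;
        CultureInfo.CurrentUICulture = _culture;
    }

    public override void After(MethodInfo methodUnderTest)
    {
        // Restore whatever culture the runner was using before the test.
        CultureInfo.CurrentCulture = _originalCulture;
        CultureInfo.CurrentUICulture = _originalUICulture;
    }
}
```

A test would then opt in with something like [Fact, SetCulture("en-US")].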

@weltkante
Contributor Author

weltkante commented Apr 30, 2020

I see you're making progress on this issue. A few thoughts on the subject.

Yes, it's solved locally for me by annotating every test individually.

maybe we can add it to ThreadExceptionFixture

I tried doing it in a fixture, but it didn't seem to work. I assumed fixtures aren't guaranteed to run on the same thread as the tests and didn't examine this further. It's possible I was mistaken; I didn't spend a lot of time on it.

we'll need to rename it to have a more generic name

You can have multiple fixtures.

@RussKie
Member

RussKie commented Nov 30, 2020

I think this issue has been resolved. I'm no longer getting any locale-specific failures.
Holler if you still do, and we can reopen it.

@RussKie RussKie closed this as completed Nov 30, 2020
@RussKie RussKie removed this from the 6.0 milestone Nov 30, 2020
@weltkante
Contributor Author

weltkante commented Dec 1, 2020

I still get plenty; maybe your locale uses the same decimal and list separators as English? .NET uses the OS settings, yet the tests hardcode English formatting. I've been working around it by identifying those tests and forcing the current culture to English; I'll share the branch later so you have an overview.
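
The workaround is essentially a scoped culture switch around the affected test code; roughly like this, where CultureScope is a hypothetical helper name, not a type from the repo or my branch:

```csharp
using System;
using System.Globalization;

// Hypothetical helper: force a specific culture for a block of test code and
// restore the original culture when the block ends.
public sealed class CultureScope : IDisposable
{
    private readonly CultureInfo _originalCulture;
    private readonly CultureInfo _originalUICulture;

    public CultureScope(string cultureName)
    {
        _originalCulture = CultureInfo.CurrentCulture;
        _originalUICulture = CultureInfo.CurrentUICulture;

        CultureInfo culture = CultureInfo.GetCultureInfo(cultureName);
        CultureInfo.CurrentCulture = culture;
        CultureInfo.CurrentUICulture = culture;
    }

    public void Dispose()
    {
        CultureInfo.CurrentCulture = _originalCulture;
        CultureInfo.CurrentUICulture = _originalUICulture;
    }
}

// Usage inside a culture-sensitive test body:
//
//     using (new CultureScope("en-US"))
//     {
//         // assertions that rely on English separators go here
//     }
```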

@RussKie RussKie reopened this Dec 1, 2020
@RussKie
Member

RussKie commented Dec 1, 2020

That'd be great!

@weltkante
Contributor Author

weltkante commented Dec 1, 2020

I've pushed a rebased branch of the code I'm using as a workaround for running tests locally:

master...weltkante:culture

There are some other failing tests which don't seem related to culture; I'm not including them in this branch.

I think that to close this issue you'll want to decide on a strategy: either only test WinForms in English and adjust the tests not to use the OS culture, or figure out a way to test the codebase in at least one other culture so that CI catches mistakes. I'm referring to my comment above, which I believe is worth re-reading, in particular:

Personally I'd prefer all tests to run under the same culture, and then have two passes over all tests in two representative cultures: one English, and one that is sufficiently different (de-DE and es-ES are candidates that differ in at least three separator tokens commonly used in WinForms). That should catch most generic culture regressions, if you care about that kind of thing. Of course, regressions in a specific culture are only caught by exhaustively testing every language that has its own localized strings, which is probably not worth it for a CI that runs on every PR.

and

It's probably also advisable to ask the dotnet/runtime folks how they handle culture-dependent tests; they might have a better idea of what makes sense and how to do things.

@danmoseley
Member

In dotnet/runtime, one of our CI legs runs on an es-ES machine as some protection against tests that only work on en-US. It's not perfect, and we have still had some community members hit breaks, e.g. one using Windows set to the Korean culture.

Although I would have expected to see es-ES in one of our .yml files - perhaps I'm looking in the wrong place @safern?

@RussKie
Member

RussKie commented Apr 26, 2022

I've pushed a rebased branch of the code I'm using as a workaround for running tests locally:

master...weltkante:culture

@weltkante do you mind trying to apply UseDefaultXunitCultureAttribute to the tests that fail on your machine, and seeing if it fixes them?
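
For illustration, something along these lines is what I mean; the test name, body, and expected string below are made up, and I haven't verified whether the attribute is meant to be applied per method, per class, or assembly-wide:

```csharp
using System.Windows.Forms;
using Xunit;

public class PaddingConverterCultureTests
{
    // Hypothetical example of opting a single culture-sensitive test into the
    // default xUnit culture via the attribute discussed above.
    [Fact]
    [UseDefaultXunitCulture]
    public void ConvertTo_String_UsesExpectedSeparators()
    {
        var converter = new PaddingConverter();
        string text = converter.ConvertToString(new Padding(1, 2, 3, 4));

        // The expected separators assume an English-style list separator,
        // matching the hard-coded strings in the existing tests.
        Assert.Equal("1, 2, 3, 4", text);
    }
}
```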

@weltkante
Contributor Author

I had a look two weeks ago and ran into a lot of problems with accessibility tests that require localized data and don't seem to be influenced by the culture settings. I'll check again with your attribute and let you know how well it goes.

@merriemcgaw merriemcgaw modified the milestones: .NET 7.0, .NET 8.0 Aug 11, 2022
@JeremyKuhne
Member

@weltkante Any update on the current state of things here?

@JeremyKuhne JeremyKuhne added the 📭 waiting-author-feedback The team requires more information from the author label Aug 16, 2023
@JeremyKuhne JeremyKuhne modified the milestones: .NET 8.0, .NET 9.0 Aug 16, 2023
@weltkante
Contributor Author

weltkante commented Aug 17, 2023

Sorry for the lack of feedback. From memory, back then I wasn't able to test UseDefaultXunitCultureAttribute because of technical issues, and then I probably got distracted and lost track of it.

I just tried again with the current main branch, and it seems the attribute is able to fix at least some of the test failures. I'll go through my branch over the weekend and prepare a PR adding the attribute to any remaining failures. I noticed some tests already have the attribute applied, but there are still some culture-related failures remaining in a run on a German machine; the samples I picked were fixed by adding the attribute.

@ghost ghost removed the 📭 waiting-author-feedback The team requires more information from the author label Aug 17, 2023
@weltkante
Contributor Author

Actually it's not as good as I hoped; I was just lucky that my sample covered tests where the attribute worked. There is a bunch of other tests where changing CurrentCulture doesn't help, in particular those interacting with accessibility and common controls localization.

I created a PR for the tests that can be fixed with the attribute. I'm not sure there is any point leaving this issue open beyond that; if you want, I can make a list of tests that fail due to localization, but I have no idea how to fix them if CurrentCulture/CurrentUICulture doesn't affect the values returned by Windows.

@weltkante
Contributor Author

Closing, since the remaining failures (accessibility and localized behavior of native controls) don't seem to be fixable by updating CurrentCulture at runtime and so are out of scope for this issue.

@ghost ghost removed this from the .NET 9.0 milestone Sep 6, 2023
@dotnet dotnet locked as resolved and limited conversation to collaborators Oct 6, 2023