
Support an output stream from the testing adapter script. #6594

Closed
ericsnowcurrently opened this issue Jul 15, 2019 · 34 comments
Labels
area-testing debt Covers everything internal: CI, testing, refactoring of the codebase, etc. needs PR Ready to be worked on

@ericsnowcurrently
Member

Currently the pytest adapter captures the pytest stdout (and hides it unless there's an error) and writes the JSON results to stdout. The extension then parses this output and incorporates the results.

An alternative approach would be to connect a socket to the script and push results out over that socket. This would add complexity, but it provides a number of benefits:
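A minimal sketch of how the socket approach could work, with the extension end and the adapter end shown in one process for illustration. The names (`extension_side`, `adapter_side`) and the newline-delimited JSON framing are assumptions for the sketch, not the extension's actual protocol:

```python
import json
import socket
import threading

def extension_side(server_sock, results):
    """Extension end: accept the adapter's connection and parse each
    JSON result line as soon as it arrives (incremental, not one blob)."""
    conn, _ = server_sock.accept()
    with conn, conn.makefile("r") as stream:
        for line in stream:
            results.append(json.loads(line))

def adapter_side(port):
    """Adapter end: connect back to the extension and push one JSON
    object per test result, flushing immediately so results stream out."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        with sock.makefile("w") as stream:
            for result in [{"id": "test_a", "outcome": "passed"},
                           {"id": "test_b", "outcome": "failed"}]:
                stream.write(json.dumps(result) + "\n")
                stream.flush()

# The extension listens on an OS-assigned local port...
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
received = []
listener = threading.Thread(target=extension_side, args=(server, received))
listener.start()
# ...and passes the port to the adapter script (here: a direct call).
adapter_side(server.getsockname()[1])
listener.join()
server.close()
print(received)
```

Because each result travels on its own channel, as its own line, the extension could show partial results even if the run later crashes, and user code printing to stdout could no longer corrupt the result payload.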

@ericsnowcurrently ericsnowcurrently added debt Covers everything internal: CI, testing, refactoring of the codebase, etc. area-testing needs decision labels Jul 15, 2019
@ericsnowcurrently
Member Author

@bschnurr

@DonJayamanne, we talked about this a while back, but I couldn't find any issue about it...

@DonJayamanne

What you've suggested is exactly what we do with unit tests in Python, and I agree with the benefits. It's a more structured and easier solution.
One change I'd make is to invoke our code as a plugin, i.e. use `python -m pytest` and ensure our code is treated as a plugin.

That way we run tests exactly the same way the users do (instead of us launching pytest through the API, modifying paths, args, etc.).
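A rough sketch of what such a plugin could look like. This is hypothetical, not the extension's actual code; the `SimpleNamespace` call at the bottom merely simulates the `TestReport` object pytest would supply, so the sketch runs without pytest installed:

```python
import io
import json
from types import SimpleNamespace

class ResultStreamPlugin:
    """Hypothetical pytest plugin: instead of driving pytest through its
    API, the extension would run `python -m pytest` exactly as the user
    does and register this plugin to collect results via pytest's hooks."""

    def __init__(self, stream):
        self.stream = stream

    # pytest invokes this hook once per test phase (setup/call/teardown).
    def pytest_runtest_logreport(self, report):
        if report.when == "call":  # report only the test body's outcome
            self.stream.write(json.dumps(
                {"id": report.nodeid, "outcome": report.outcome}) + "\n")

# With pytest installed the plugin would be registered e.g. via
#   pytest.main(args, plugins=[ResultStreamPlugin(result_stream)])
# Here we feed it a simulated report to show the line it would emit.
out = io.StringIO()
plugin = ResultStreamPlugin(out)
plugin.pytest_runtest_logreport(SimpleNamespace(
    when="call", nodeid="tests/test_x.py::test_ok", outcome="passed"))
print(out.getvalue().strip())
```

Registering as a plugin keeps the invocation identical to what users type on the command line, so path and argument handling stay in pytest's hands.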

@domoritz

domoritz commented Aug 4, 2019

Does this mean that pytest in vscode currently captures stdout and does not show it in the test runner?

@DonJayamanne

DonJayamanne commented Aug 5, 2019

No, it is displayed in the test runner.

@bschnurr
Member

bschnurr commented Aug 6, 2019

Thanks. We are seeing similar issues in VS. For now I'm making a change to optionally write results to a file so that we can still get partial results.

@awav

awav commented Nov 10, 2019

I have updated VSCode, but test discovery still fails on all my projects (#8123). I'm quite surprised that the issue hasn't been resolved. Can I work around it somehow temporarily? Do you need more information on the issue? Here are my logs:

Error Python Extension: 2019-11-10 16:06:21: Python Extension: displayDiscoverStatus [Error:
2019-11-10 16:06:15.731180: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-11-10 16:06:15.744855: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fc9be6345c0 executing computations on platform Host. Devices:
2019-11-10 16:06:15.744904: I tensorflow/compiler/xla/service/service.cc:175]   StreamExecutor device (0): Host, Default Version
2019-11-10 16:06:16.534371: W tensorflow/python/util/util.cc:299] Sets are not currently considered sequences, but this may change in the future, so consider avoiding using them.
	at ChildProcess.<anonymous> (/Users/artemav/.vscode/extensions/ms-python.python-2019.10.44104/out/client/extension.js:9:37710)
	at Object.onceWrapper (events.js:288:20)
	at ChildProcess.emit (events.js:200:13)
	at maybeClose (internal/child_process.js:1021:16)
	at Process.ChildProcess._handle.onexit (internal/child_process.js:283:5)]
t.log @ console.ts:137
$logExtensionHostMessage @ mainThreadConsole.ts:39
_doInvokeHandler @ rpcProtocol.ts:398
_invokeHandler @ rpcProtocol.ts:383
_receiveRequest @ rpcProtocol.ts:299
_receiveOneMessage @ rpcProtocol.ts:226
(anonymous) @ rpcProtocol.ts:101
fire @ event.ts:580
fire @ ipc.net.ts:453
_receiveMessage @ ipc.net.ts:733
(anonymous) @ ipc.net.ts:592
fire @ event.ts:580
acceptChunk @ ipc.net.ts:239
(anonymous) @ ipc.net.ts:200
t @ ipc.net.ts:28
emit @ events.js:200
addChunk @ _stream_readable.js:294
readableAddChunk @ _stream_readable.js:275
Readable.push @ _stream_readable.js:210
onStreamRead @ internal/stream_base_commons.js:166

@ericsnowcurrently
Member Author

@luabud, any ideas on timeline?

@luabud
Member

luabud commented Nov 13, 2019

We're hoping to get to it soon but no ETAs yet.

@awav

awav commented Nov 19, 2019

@luabud, is there a workaround? All of the Python extension's testing features in VSCode literally don't work at the moment. Should I just install another extension (if there is any) and forget about vscode-python? :)

@luabud
Member

luabud commented Nov 19, 2019

@awav I'm so sorry for not giving workaround options 😞

According to what you said here: #8123, the error that is being thrown is this:
Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA

If you suppress the warning, test discovery won't fail. So what you can do is create a .env file with the following content:
TF_CPP_MIN_LOG_LEVEL='2'.

Then open the settings.json file under the .vscode folder in your workspace and add the setting below:
"python.envFile": "${workspaceFolder}/.env"

Reload the window and you should be able to see your tests in the test explorer:
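Putting the two files from the workaround together, assuming the `.env` file sits at the workspace root:

```
# <workspace>/.env
TF_CPP_MIN_LOG_LEVEL='2'
```

```jsonc
// <workspace>/.vscode/settings.json
{
  "python.envFile": "${workspaceFolder}/.env"
}
```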

@gramster gramster added this to the FY20Q2 milestone Nov 19, 2019
@awav

awav commented Nov 19, 2019

@luabud Thanks a lot!

  1. This message is not an error, it is a warning, and it has been there since the first release of TensorFlow. Test discovery worked fine with that message before. I do agree, though, that VSCode can have difficulty recognizing the type of the message.
  2. I opened my workspace folder and couldn't find a .vscode folder with a settings.json file in it. But the workspace file has a "settings" section, so I changed the python.envFile path there to ${workspaceFolder}/.env. In this configuration, test discovery didn't work. It also didn't work when I created a .vscode folder with a settings.json. It only works when I put the full path to the .env in the workspace configuration file. The project folder has a .vscode with settings.json as well, which might confuse the overall workspace configuration. Should I create another issue for that case?

@luabud
Member

luabud commented Nov 20, 2019

@awav Oops, sorry. I assumed you didn't have a .code-workspace file. Anyway, yes, please open a new issue for that case 😊 I don't think you should need to add the full path to the .env file in the setting, so it does sound like an issue.

Thank you for giving more info!

@gennaro-tedesco

Is there any update on this issue?

@karrtikr karrtikr added the needs PR Ready to be worked on label Aug 9, 2022
@eleanorjboyd
Member

Hello! I wanted to update everyone since this has been a long-running issue. We are currently working on a rewrite of how pytest works in vscode, moving to the plugin style. The main issue tracking this work is #17242, and progress will be posted there. Please reference that issue for updates, but I will also try to follow up here as necessary. Thank you!

@eleanorjboyd
Member

Hello! We have just finished our testing rewrite and are beginning the rollout to users. If you would like to try it yourself, you need to be on vscode insiders and then add the setting "python.experiments.optInto": ["pythonTestAdapter"] to your user settings.json. We are switching all users to the rewrite incrementally, so if you do not have insiders, watch our release notes to see when it will begin to hit stable. Closing this issue with that in mind, but let me know if it doesn't work for you and we can re-open it. Thanks!
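For reference, the opt-in setting goes into the user-level settings.json like so:

```jsonc
// user settings.json (VS Code Insiders)
{
  "python.experiments.optInto": ["pythonTestAdapter"]
}
```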

@dandiep

dandiep commented Jul 16, 2023

Stoked about this. Trying it out, and every time I run my test I see a list of ALL the tests, e.g.:

plugins: html-3.2.0, metadata-3.0.0, mock-3.8.2, anyio-3.6.2
collected 261 items

<Package tests>
  <Module test_activity.py>
    <Function test_search_activities>
    ...

This makes it hard to find the output, especially because any time I change a file, it reruns and obscures the logs. Is there any setting or anything I can change on my side?

@eleanorjboyd
Member

@dandiep glad it worked! Have you tried adding -q to your pytest settings? This should reduce noisy output, and I'm wondering if that would be the fix. The output that appears in the testing output log all comes from pytest, so we have limited control over it.
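If it helps, the flag can be passed through the extension's pytest args setting (assuming the `python.testing.pytestArgs` setting name):

```jsonc
// .vscode/settings.json
{
  "python.testing.pytestArgs": ["-q"]
}
```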

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Aug 17, 2023