Showing output channel via API steals editor focus, output channel output is delayed #59209
This delay was introduced in Insiders to reduce CPU load on the UI process. Output channels are mainly meant to be used as log views (like Git, TypeScript) and are not required to show their content immediately. At the same time, we also provide a flag to enforce immediate updates, but this comes at the cost of higher CPU usage. Please see https://github.com/Microsoft/vscode/blob/master/src/vs/vscode.d.ts#L6241
This is happening because you are calling
To verify: when you pass the preserveFocus flag as true in
We decided not to provide this flag, so all output channels behave the same.
Some context: the extension host writes appended output-channel data into a log file on a 500 ms interval. The UI process updates the channel UI by listening for changes to the log file, also at a 500 ms interval. So an appended message can take up to roughly one second to appear in the UI. Small improvement: a delay can still be noticed between appended messages because of the flushing interval in the extension host and the polling interval in the UI.
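The batching described above can be sketched as a small buffered writer. This is a hypothetical illustration, not the actual VS Code implementation; the class and `sink` callback are invented names, and 500 ms mirrors the interval mentioned in the comment.

```typescript
// Hypothetical sketch of interval-based buffering: appended lines are
// collected in memory and only flushed to the log sink every
// `flushIntervalMs`, so readers see data with up to one interval of delay.
class BufferedLogWriter {
  private buffer: string[] = [];
  private timer: ReturnType<typeof setInterval> | null = null;

  constructor(
    private readonly sink: (data: string) => void,
    private readonly flushIntervalMs = 500,
  ) {}

  start(): void {
    // Periodically drain the buffer into the sink (the log file in VS Code).
    if (this.timer === null) {
      this.timer = setInterval(() => this.flush(), this.flushIntervalMs);
    }
  }

  append(line: string): void {
    // Appends are cheap: they only touch the in-memory buffer.
    this.buffer.push(line);
  }

  flush(): void {
    if (this.buffer.length > 0) {
      this.sink(this.buffer.join("\n") + "\n");
      this.buffer = [];
    }
  }

  dispose(): void {
    if (this.timer !== null) {
      clearInterval(this.timer);
      this.timer = null;
    }
    this.flush();
  }
}
```

Because the UI polls the file on its own 500 ms interval, the two delays stack, which is why a message can take close to a full second to show up.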
The output channel looks like the following with an extension sample similar to the one in the description:

```js
const outputChannel = vscode.window.createOutputChannel(`My Test`);
outputChannel.appendLine('Test');
outputChannel.show(true);
setTimeout(() => {
  outputChannel.appendLine('Test1');
  outputChannel.appendLine('Test2');
  outputChannel.appendLine('Test3');
}, 5000);
```
Why not? Unfortunately, the new behaviour is a huge drawback for our extensions. The flag was the only way to preserve the existing behaviour, and without it this is a very painful breaking change for us and our users.

To provide some context: our extension is used by many thousands of VS Code users every day (~2.2 million downloads), and it allows users to log values in real time. We are constantly optimising our engine to reduce the time it takes to deliver live feedback to the user; we try to squeeze out every millisecond we can. For many versions of VS Code it had been working like this: and now (without the flag) it looks something like this: So what felt like almost instant feedback now feels very sluggish and significantly affects the UX.

The same applies to our other extension, Wallaby.js, and its test output. We provide real-time feedback from running tests, so every millisecond counts, and now the output is going to be affected by the 500 ms delay.

While we do understand the motivation behind changing the default behaviour, could you please reconsider providing a way to switch back to what had been working for a long time prior to the change?
@ArtemGovorov I see how the delay affects your use case. I am adding another improvement to reduce the time lapse between when data is appended and when it is shown in the UI. If the data is appended to a visible channel, I flush the data into the log (temp) file immediately and signal the UI about the change; the UI then fetches the data from the file and shows it. Here is a GIF of the same example as yours with the above approach. I cannot say it is the same as before, but it is fast enough, and I think it is the best compromise between efficiency and speed.
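The compromise described above can be sketched as a visibility-aware writer: hidden channels keep batching, while appends to a visible channel go straight through. Again, this is an illustrative sketch with invented names, not VS Code's internal code.

```typescript
// Hypothetical sketch of the improvement: when the channel is visible,
// append() bypasses the batching and flushes to the sink immediately;
// while hidden, lines stay buffered (the 500 ms interval would drain them).
class VisibilityAwareWriter {
  private buffer: string[] = [];
  visible = false;

  constructor(private readonly sink: (data: string) => void) {}

  append(line: string): void {
    this.buffer.push(line);
    if (this.visible) {
      this.flush(); // visible channel: signal the UI without waiting
    }
  }

  flush(): void {
    if (this.buffer.length === 0) return;
    this.sink(this.buffer.join("\n") + "\n");
    this.buffer = [];
  }
}
```

This keeps the CPU savings for background channels (Git, TypeScript logs) while restoring near-immediate updates for the channel the user is actually watching.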
@sandy081 Thanks, that sounds like a great solution, and also no need to change the API! We will check it out.
Steps to Reproduce:
Here's the GIF to demonstrate the issue in 1.28.0-insider:
Here's the GIF to demonstrate that it's working as expected in 1.27.2:
Does this issue occur when all extensions are disabled?: Yes