Make Blazor WASM API Calls Under One Second #23869

Closed
allan-mobley-jr opened this issue Jul 11, 2020 · 27 comments
Labels
area-blazor (Includes: Blazor, Razor Components), feature-blazor-wasm (This issue is related to and/or impacts Blazor WebAssembly)

Comments

@allan-mobley-jr

Is your feature request related to a problem?

I would like to make an async API call from Blazor WASM for real-world data (not the sample weather data) and have it not take three or more seconds to complete while simultaneously freezing the UI. 🤔

Describe the solution you'd like

In short, what the title says: async API calls from Blazor WASM should complete in under a second, and they should not freeze the UI while they run. 🤔

Additional context

To get around this, I use JS interop to initialize a React component, which then uses the browser Fetch API to make the same call in less than 200 milliseconds, and the React component then renders the view. 🤔

Yes, I thought seriously about Reacting out... but I have invested a lot of time and hope and 🙏 in Blazor WASM. Plus, the enterprise web app I am working to port over is an ASP.NET Core web app with Razor Pages, and the authentication and authorization map over easily. (Great job on that, btw.) Not to mention that I f*ing love C#!

I know I'm not one of those big component companies (i.e., Telerik) that you guys are snuggling up to, but I bet I'm more representative of the majority who want to use Blazor WASM for their companies or customers.

@mkArtakMSFT added the area-blazor and feature-blazor-wasm labels on Jul 12, 2020
@mrpmorris

mrpmorris commented Jul 12, 2020

I have some questions; your experience is completely different from mine.

How many rows of data are you displaying on screen at one time?

Have you remembered to use @key inside your loop?

In what way does your UI freeze, considering you are using an async call?

Can you paste the code that does the async call, all the way from the entry point (button click, or whatever)?

I don't think MS are snuggling up to Telerik. They are simply making some (very good) speed optimizations based on examples submitted by customers, and these will benefit all of us.

@allan-mobley-jr
Author

@mrpmorris

How many rows of data are you displaying on screen at one time?

Like you and others in #21085, I first suspected that I was simply rendering too many rows (four thousand) at one time. While I never planned to render all the rows at once in production and was leaning toward an infinite-scroll implementation for the table of data, I was nonetheless surprised to see such a prolonged UI freeze.

So I dropped the rendered rows down to 100, but still the prolonged UI freeze persisted.

I dropped some code in to time the render and found a minimum of three seconds for the UI freeze.

Frustrated, I built another page component and wired up a React component to make the call in JS using the browser Fetch API and then render the rows, timing this as well. Outcome: React is blazing fast (200 milliseconds). Pun intended. There was also no accompanying UI freeze.

For kicks, I changed the React component to render the entire data set, and, sure enough, it rendered it blazing fast. Again, no accompanying UI freeze.

For more kicks, I removed all rendering from the straight Blazor implementation and just made the async API call. To my surprise, the call took the same amount of time as before (three or more seconds) with the accompanying UI freeze.

Have you remembered to use @key inside your loop?

Yes.

In what way does your UI freeze, considering you are using an async call?

Now, when I say UI freeze, I mean that I could click the navigation button to open the side drawer and nothing would happen until after the freeze passed.

To further confirm this to myself, I added the stock Counter component to the page, and found that no matter how many times I clicked the button, the count remained the same until the freeze passed. Then, if Blazor remembered, the count would be updated to the number of times I clicked it.

Can you paste the code that does the async call, all the way from the entry point (button click, or whatever)?

@page "/fetchdata"

<h1>Blazor with C# HttpClient API Call</h1>

<table class="table text-light">
    <thead>
        <tr>
            <th>#</th>
            <th>Name</th>
            <th>FullName</th>
            <th>Vendor</th>
        </tr>
    </thead>
    <tbody>
        @if (items is not null)
        {
            int cnt = 1;
            foreach (var item in items)
            {
                <tr @key="item">
                    <td>@cnt</td>
                    <td>@item.Name</td>
                    <td>@item.FullName</td>
                    <td>@item.PrefVendorRef?.FullName</td>
                </tr>

                cnt++;
            }
        }
    </tbody>
</table>

@code {
    private List<MyItem> items;

    protected override async Task OnAfterRenderAsync(bool firstRender)
    {
        if (firstRender)
        {
            var timeStart = DateTime.Now;
            var http = new HttpClient { BaseAddress = new Uri("https://azure-function-api-endpoint") };
            
            items = (await http.GetFromJsonAsync<CosmosQuery<MyItem>>("api/Preview")).Documents;
            StateHasChanged();
            System.Console.WriteLine("Request time: " + (DateTime.Now.Subtract(timeStart).TotalMilliseconds) + " milliseconds.");
        }
    }

    public class MyItem
    {
        public string id { get; set; }
        public string Name { get; set; }

        public string FullName { get; set; }

        public Ref PrefVendorRef { get; set; }
    }

    public class Ref 
    {
        public string ListID { get; set; }
        public string FullName { get; set; }
    }

    public class CosmosQuery<T>
    {
        public List<T> Documents { get; set; }
    }
}

Like I said above, you can comment out the StateHasChanged() call and no data will be rendered, or you can use Take(100) on the foreach loop to limit the data rendered. In either case, the UI freeze occurs.

I don't think MS are snuggling up to Telerik. They are simply making some (very good) speed optimizations based on examples submitted by customers, and these will benefit all of us.

Not trying to ruffle anyone's feathers...er, maybe I am...but I certainly do not mean any disrespect.

What the Blazor team has pulled off is nothing short of amazing. Nothing but mad respect for those guys and gals.

But I watched many a Blazor demo and conference, read thousands of pages of documentation, blogs, articles, etc. Not once was this cat let out of the bag.

As bright as the Blazor team is, I cannot imagine that they were not aware of this UI freeze or the blocking GC issues (#21085) prior to releasing this for production.

"Look for a Future Preview Release"

This is, in essence, what @SteveSandersonMS told a poor chap who posted a comment on a thread about Blazor performance issues.

That's frankly unacceptable. These guys said Blazor was ready for production, and we believed them. Now they are telling us to wait on a .NET 5 preview release that might be the fix? But don't take my word for it, take @SteveSandersonMS's:

I'm afraid you'll need to wait a couple more preview releases to confirm this yourself in your own scenarios.

@mrpmorris

mrpmorris commented Jul 12, 2020

There is no conspiracy to keep this problem secret; there is something unusual going on here.

How long does the HTTP call itself take? Also, how long does it take if you just do a normal GET and grab the response as a string, rather than GetFromJsonAsync?
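
A minimal sketch of that comparison, reusing the CosmosQuery<MyItem> shape from the repro above (the Stopwatch split is purely illustrative, not code from the issue):

var sw = System.Diagnostics.Stopwatch.StartNew();

// Plain GET: download the response body as a string only.
var json = await http.GetStringAsync("api/Preview");
Console.WriteLine($"GET + read string: {sw.ElapsedMilliseconds} ms");

// Deserialize separately so the two costs can be compared.
sw.Restart();
var items = System.Text.Json.JsonSerializer.Deserialize<CosmosQuery<MyItem>>(json).Documents;
Console.WriteLine($"Deserialize only: {sw.ElapsedMilliseconds} ms");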

@SteveSandersonMS
Member

SteveSandersonMS commented Jul 12, 2020

@allan-mobley-jr I know it’s inconvenient, but is it possible for you to provide a runnable repro example?

Otherwise we could be making a series of incorrect guesses about what’s wrong here.

If we were able to see the full code, including the full JSON data returned by the server that's being rendered, we could give a conclusive answer quickly. Ideally, that would be a GitHub repo that, when run, exhibits the UI freeze you describe.

@SteveSandersonMS
Member

SteveSandersonMS commented Jul 12, 2020

For more kicks, I removed all rendering from the straight Blazor implementation and just made the async API call [... and it still froze ...]

This strongly suggests the issue is in JSON deserialisation. It’s possible the server is returning a lot more data than you need.

This is also why I’d like to see the full repro example so we could say for sure, and perhaps also suggest how to fix it.

In particular, is it possible that this line:

items = (await http.GetFromJsonAsync<CosmosQuery<MyItem>>("api/Preview")).Documents;

... fetches absolutely everything from your CosmosDB collection, not just the items you want to render? How many are getting returned?

If I’m misunderstanding then apologies - that’s why it would be helpful to have a full runnable example.

@allan-mobley-jr
Author

@mrpmorris

There is no conspiracy to keep this problem secret; there is something unusual going on here.

I agree. There is something unusual going on here.

How long does the HTTP call itself take? Also, how long does it take if you just do a normal GET and grab the response as a string, rather than GetFromJsonAsync?

As I pointed out above, three to four seconds if I fetch the full 4,000+ items. It drops down to about two seconds if I ask for 1,000, and is almost unnoticeable, just under a second, if I ask for a hundred items. I have not tried the normal GET-and-deserialize-the-string approach.

@SteveSandersonMS

I know it’s inconvenient, but is it possible for you to provide a runnable repro example?

Yes, I will work on this tonight and create a repo for you. Thanks!

This strongly suggests the issue is in JSON deserialisation. It’s possible the server is returning a lot more data than you need.

You know, I wondered if something was going on in this regard, from JavaScript to C#. Unless I am mistaken, doesn't every HttpClient call go through the browser's Fetch API and get marshaled back to C# via JS interop? This would explain why the React component handles it with no problem, as there is no JSON deserialisation occurring.

In particular, is it possible that this line:

items = (await http.GetFromJsonAsync<CosmosQuery<MyItem>>("api/Preview")).Documents;

... fetches absolutely everything from your CosmosDB collection, not just the items you want to render? How many are getting returned?

Well, in my posted example, the API path "api/Preview" as-is would fetch all of the items in the Cosmos collection, which is about 4,000+ items. This can be restricted by adding a route integer value, such as "api/Preview/1000", where the integer sets the maximum number of items returned.

As for the make-up of the data, this is dictated by the endpoint itself, but here is the query made to the Cosmos backend:

select c.id, c.Name, c.FullName, c.PrefVendorRef from c where c.Discriminator = 'ItemInventory' and c.IsActive

The Cosmos .NET SDK takes this SQL string along with any other options, such as partition key and max item count, and returns just the properties in the select clause, which matches the MyItem class in the Blazor code.

The Azure Function endpoint that facilitates this merely acts as an auth gateway between the Blazor WASM client and Cosmos DB. It does no JSON deserialisation from or to Cosmos; rather, it passes everything as streams, using the streams API in the latest Cosmos .NET SDK.
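
A hedged sketch of what such a pass-through Function might look like with the Cosmos .NET SDK v3 stream API (the database, container, and connection-setting names are invented, a real Function would reuse a singleton CosmosClient, and only the first page of the stream response is forwarded here):

// Requires: Microsoft.Azure.Cosmos, Microsoft.Azure.WebJobs, Microsoft.AspNetCore.Http, Microsoft.AspNetCore.Mvc
[FunctionName("Preview")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", Route = "Preview/{maxItems:int?}")] HttpRequest req,
    int? maxItems)
{
    // Illustrative only: a real Function would inject and reuse a singleton client.
    var client = new CosmosClient(Environment.GetEnvironmentVariable("CosmosConnection"));
    var container = client.GetContainer("mydb", "items");

    var query = new QueryDefinition(
        "select c.id, c.Name, c.FullName, c.PrefVendorRef from c " +
        "where c.Discriminator = 'ItemInventory' and c.IsActive");

    var iterator = container.GetItemQueryStreamIterator(
        query,
        requestOptions: new QueryRequestOptions { MaxItemCount = maxItems ?? -1 });

    // Stream API: the raw Cosmos response (the Documents envelope) is copied
    // through to the caller without being deserialized in the Function.
    using var page = await iterator.ReadNextAsync();
    var buffer = new MemoryStream();
    await page.Content.CopyToAsync(buffer);
    buffer.Position = 0;

    return new FileStreamResult(buffer, "application/json");
}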

Cosmos itself packs all matched documents into an object under the _documents property, such as:

{
    "_rid": "resource id string",
    "_count": 1000,
    "_documents": [
        ...
    ]
}

While I can't expose the true client code being used in the enterprise app, I can repro this with just the code I posted above, using the stock Blazor WASM template. As for the Azure Function endpoint and Cosmos backend, I am not allowed to expose those either.

So I am not sure what to do about that side of the equation.

@allan-mobley-jr
Author

P.S. @SteveSandersonMS

I also see some GC logs being created by WASM in relation to the API call.

@SteveSandersonMS
Member

Thanks for confirming. It seems clear then that the issue is the time taken in JSON deserialisation.

Presumably you don’t want to fetch significantly more data than you’re going to display. Once you’re fetching the actual amount you want (e.g. chunks of 100 per page), do you still have a perf issue?
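
For example, something like this against the route-limited endpoint described earlier (a sketch; the field and method names are placeholders, not code from the issue):

// Fetch only one page's worth of items via the "api/Preview/{maxItems}" route
// instead of pulling the whole collection.
private async Task LoadPageAsync(int pageSize)
{
    var result = await http.GetFromJsonAsync<CosmosQuery<MyItem>>($"api/Preview/{pageSize}");
    items = result.Documents;
    StateHasChanged();
}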

@allan-mobley-jr
Author

Request time: 6818 milliseconds.
L: GC_MAJOR_SWEEP: major size: 3680K in use: 10582K
L: GC_MAJOR: (LOS overflow) time 18.40ms, stw 18.41ms los size: 16832K in use: 14756K

@mrpmorris

It would be good to check a plain HTTP GET so you can compare it to the time it takes to GET and deserialize. I suggested this so we can see whether it is the deserialization or just the GET itself. Could you do that?

Also, what size is the response from the server for this data?

@allan-mobley-jr
Author

allan-mobley-jr commented Jul 12, 2020

@SteveSandersonMS

True, the UI freeze is less noticeable when retrieving chunks of 100 items.

I also agree that typically you do not "want to fetch significantly more data than you’re going to display," but in cases where you need to fetch a large list, say a couple thousand for client-side searching or storing in a client-side database, this would still be a problem.

Even if you do so behind the scenes, in chunks of a hundred, you would have to make multiple calls, and each call would block the UI entirely unless you spaced the calls out somehow.
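
If the full list really is needed, one way to space the chunked calls out might look like the sketch below (the skip query parameter is hypothetical and would need server-side support):

private async Task LoadInChunksAsync(int chunkSize, int totalItems)
{
    items = new List<MyItem>();
    for (var skip = 0; skip < totalItems; skip += chunkSize)
    {
        // Hypothetical route: "api/Preview/{maxItems}?skip={n}" is not part of
        // the endpoint described above.
        var chunk = await http.GetFromJsonAsync<CosmosQuery<MyItem>>($"api/Preview/{chunkSize}?skip={skip}");
        items.AddRange(chunk.Documents);

        StateHasChanged();   // render what has arrived so far
        await Task.Delay(1); // yield to the browser so input events can be processed
    }
}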

@allan-mobley-jr
Author

@mrpmorris

I will try as you suggested tonight and get back to you.

@allan-mobley-jr
Author

@SteveSandersonMS and @mrpmorris

Let's not miss the point here, guys.

The same results (fetching data and optionally displaying it) can be achieved with a straight JavaScript approach without a UI Freeze of three to four seconds.

There can be any number of reasons why I or any other developer may need to fetch hundreds, if not thousands, of items from one API call:

  1. for a client-side search list
  2. for offline situations
  3. etc...

Anyone coming from previous web experience into using Blazor will have certain expectations that were fostered from using JavaScript on the client-side.

To learn that you can't make the same API calls you made with JavaScript for fear of locking down the UI... well, hopefully you follow my drift.

@allan-mobley-jr
Author

allan-mobley-jr commented Jul 12, 2020

@mrpmorris

Interesting: the following call took between just under a second and just over a second across multiple navigations:

var json = await http.GetStringAsync("api/Preview");

So there is something to be said for the JSON deserialization.

No GC logs in WASM.

@mrpmorris

mrpmorris commented Jul 13, 2020

@allan-mobley-jr Personally I don't use the new Json library, for various serialisation reasons. Could you try using Newtonsoft.Json to check whether it performs acceptably?

If it does then consider raising a ticket for the System.Text.Json library.

@allan-mobley-jr
Author

HTTP.GetAsync() with Newtonsoft.Json

var response = await http.GetAsync("api/Preview");
var json = await response.Content.ReadAsStringAsync();
items = JsonConvert.DeserializeObject<CosmosQuery<MyItem>>(json).Documents;

Outcome:

  • around 1700 milliseconds for entire data set (4,454 inventory items)
  • occasional GC logs in WASM

Same outcome if following is used: var json = await http.GetStringAsync("api/Preview");

HTTP.GetFromJsonAsync() with System.Text.Json

items = (await http.GetFromJsonAsync<CosmosQuery<MyItem>>("api/Preview")).Documents;

Outcome:

  • around 4500 milliseconds for entire data set (4,454 inventory items)
  • GC logs in WASM every time

@mrpmorris

In that case, I would advise using Newtonsoft and raising a ticket for System.Text.Json. I suppose that means this ticket can be closed?

@allan-mobley-jr
Author

@mrpmorris

While it's still slower than a straight JavaScript approach and still raises the GC events, I will proceed with Newtonsoft.Json and see how it goes.

As for the current example, I will use an infinite scroll approach, fetching and displaying more items as the user scrolls the table.

As for cases where I need a complete list of customers or inventory for employee search purposes, I will have to see how much this affects a user's interaction when the lists are periodically updated. My concern here is triggering the GC too much, which apparently locks the UI.

So, yes, I will close this with prejudice and open a ticket for System.Text.Json.

@SteveSandersonMS
Member

SteveSandersonMS commented Jul 13, 2020

Thanks for providing the extra info here @allan-mobley-jr. Yes, it would be good for us to track down the differences explaining why System.Text.Json is so much slower than Newtonsoft in this case. We are already doing a bunch of optimization work on S.T.J. so this is a helpful extra input to that process.

When reporting this on S.T.J., if you're able to give examples of the particular JSON text that you find triggers this big speed difference, that will help a lot with diagnosis.

As for fetching a large block of data for offline use, you might want to consider how I approached it in the CarChecker demo which does exactly this. Basically I just fetched the data as a string, without deserializing it, and gave that to IndexedDB in the browser to deserialize and store. Then the .NET code could perform queries via the IndexedDB APIs.
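
The C# side of that approach could be as small as the sketch below; "localDataStore.putItems" is a hypothetical JS helper that would parse the string and write it to IndexedDB (it is not part of Blazor or the CarChecker code):

// Fetch the payload as a raw string (no C#-side JSON deserialization) and hand
// it to JavaScript for IndexedDB storage and later querying.
private async Task CacheItemsOfflineAsync(HttpClient http, IJSRuntime js)
{
    var json = await http.GetStringAsync("api/Preview");
    await js.InvokeVoidAsync("localDataStore.putItems", json);
}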

@MariovanZeist
Contributor

Hi @allan-mobley-jr

Could you try out the code below and see what performance you get out of that?
And compare that to your HTTP.GetAsync() with Newtonsoft.Json test?

I suspect the issue is not that System.Text.Json performs worse than Newtonsoft (System.Text.Json is marginally faster in my tests), but that there is some issue in the .NET GetFromJsonAsync extension method. (I suspect an encoding issue in the non-NetCoreApp code path, but I am not 100% sure.)

The code below is around 40% faster than the GetFromJsonAsync extension method.

In my tests I created ~4,600 items comparable to your MyItem class, with some random strings (content size was between 750 and 900 KB). It took roughly 1.5 seconds on my dev machine.

// using System.Text.Json;

var response = await Http.GetByteArrayAsync("api/Preview");
var items = Cnv(response).Documents;

private CosmosQuery<MyItem> Cnv(byte[] data)
{
    // Deserialize from the UTF-8 bytes directly rather than from a stream.
    var reader = new Utf8JsonReader(data.AsSpan());
    return JsonSerializer.Deserialize<CosmosQuery<MyItem>>(ref reader);
}

@allan-mobley-jr
Author

@MariovanZeist

Hi, thanks for the solution.

Unfortunately, while your System.Text.Json solution is faster than the GetFromJsonAsync() extension method, it still runs slower than Newtonsoft.Json:

  • avg 2600 ms vs. avg 1700 ms

Believe me, I am surprised to learn that System.Text.Json is slower than Newtonsoft.Json.

@pranavkm
Contributor

FYI @steveharter \ @layomia

@MariovanZeist
Contributor

MariovanZeist commented Jul 13, 2020

I investigated the issue a bit further.

I think the problem lies in the call to the JsonSerializer overload that reads from a stream (e.g. JsonSerializer.DeserializeAsync), located here:
https://github.com/dotnet/runtime/blob/c21a38796233f4b3e23cc8426a6d2fb3648778bd/src/libraries/System.Net.Http.Json/src/System/Net/Http/Json/HttpContentJsonExtensions.cs#L69

My performance test had the following results:

use HttpClient GetFromJsonAsync extension: 1.5891250 seconds

    var textJson = await Http.GetFromJsonAsync<CosmosQuery<MyItem>>("LargeData");

use JsonSerializer.DeserializeAsync (stream version): 1.6109100 seconds (bypass the extension method, call deserialize directly)

    var stream = await Http.GetStreamAsync("LargeData");
    var data = await System.Text.Json.JsonSerializer.DeserializeAsync<CosmosQuery<MyItem>>(stream, s_defaultSerializerOptions);

use Newtonsoft, just for comparison: 1.2948950 seconds

    var response = await Http.GetStringAsync("LargeData");
    var items = JsonConvert.DeserializeObject<CosmosQuery<MyItem>>(response).Documents;

use JsonSerializer.Deserialize (Utf8JsonReader version): 0.9037051 seconds

    var response = await Http.GetByteArrayAsync("LargeData");
    var items = ParseFromByteArray(response).Documents;

    private CosmosQuery<MyItem> ParseFromByteArray(byte[] data)
    {
        var w = new Utf8JsonReader(data.AsSpan());
        return System.Text.Json.JsonSerializer.Deserialize<CosmosQuery<MyItem>>(ref w);
    }

So the last call, using the Utf8JsonReader, is significantly faster than the default extension method that uses the stream overload.
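
Pending a fix, the byte-array approach could be wrapped in a small helper so call sites stay tidy (a sketch only; this is not part of System.Net.Http.Json):

// Requires: using System.Net.Http; using System.Text.Json; using System.Threading.Tasks;
public static class HttpClientJsonWorkaroundExtensions
{
    public static async Task<T> GetFromJsonViaBytesAsync<T>(
        this HttpClient http, string requestUri, JsonSerializerOptions options = null)
    {
        // Download the full body as UTF-8 bytes, then deserialize synchronously,
        // avoiding the slower stream-based DeserializeAsync path measured above.
        var bytes = await http.GetByteArrayAsync(requestUri);
        return Deserialize<T>(bytes, options);
    }

    // Utf8JsonReader is a ref struct, so it cannot be used inside an async method.
    private static T Deserialize<T>(byte[] bytes, JsonSerializerOptions options)
    {
        var reader = new Utf8JsonReader(bytes);
        return JsonSerializer.Deserialize<T>(ref reader, options);
    }
}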

@allan-mobley-jr
Author

@MariovanZeist

Your results are very interesting.

I am not able to achieve the lower score (or faster time) with the Utf8JsonReader version.

This has me wondering whether the dev machine itself (RAM, CPU, etc.) plays a role in the performance of System.Text.Json. That may be a stupid thing to think, but I don't understand why I am not able to achieve the results you are getting.

@MariovanZeist
Contributor

@allan-mobley-jr

If you want to play around:

https://github.com/MariovanZeist/JsonPerf

This repo is mainly for demonstrating the performance issue in the GetFromJsonAsync extension method. (It's not an endorsement for loading in 4,600 items 😉)

The last function was a little fun; I added 2 specialized JsonConverters just to see what speed I could get.

@allan-mobley-jr
Author

@MariovanZeist

Thanks!

It's not an endorsement for loading in 4,600 items 😉

No, I hear you! I never planned on displaying that many at one time. I just kind of stumbled onto this issue in the course of setting up a stub page for testing the API.

As @SteveSandersonMS pointed out to me, for storing large amounts of data at one time, I think his example would work nicely for what I need to accomplish for client-side searching and offline support:

As for fetching a large block of data for offline use, you might want to consider how I approached it in the CarChecker demo which does exactly this. Basically I just fetched the data as a string, without deserializing it, and gave that to IndexedDB in the browser to deserialize and store. Then the .NET code could perform queries via the IndexedDB APIs.

@mrpmorris

@allan-mobley-jr I'm pleased you have it sorted out!

@ghost ghost locked as resolved and limited conversation to collaborators Aug 12, 2020