Champion "Async Streams" (including async disposable) #43

Open
gafter opened this Issue Feb 9, 2017 · 62 comments

@gafter
Member

gafter commented Feb 9, 2017

@gafter
Member

gafter commented Feb 15, 2017

This would be something like

interface IAsyncEnumerable<T>
{
        IAsyncEnumerator<T> GetEnumerator();
}
interface IAsyncEnumerator<T>: IAsyncDisposable
{
        Task<bool> MoveNextAsync();
        T Current { get; }
}

With the following elements yet to be designed:

  • IAsyncDisposable
  • async foreach loops
  • Declaring async iterator methods
  • Some kind of query expression support
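
Until async foreach and async iterator methods are designed, a consumer would have to drive these interfaces by hand. A minimal sketch, assuming the shapes above and assuming IAsyncDisposable exposes a Task-returning DisposeAsync() (itself still to be designed):

async Task PrintAllAsync(IAsyncEnumerable<string> source)
{
    IAsyncEnumerator<string> enumerator = source.GetEnumerator();
    try
    {
        // MoveNextAsync advances asynchronously; Current reads the item synchronously.
        while (await enumerator.MoveNextAsync())
        {
            Console.WriteLine(enumerator.Current);
        }
    }
    finally
    {
        // Assumed member of the yet-to-be-designed IAsyncDisposable.
        await enumerator.DisposeAsync();
    }
}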

@gafter gafter changed the title from Champion "Async Streams" to Champion "Async Streams" (including async disposable) Feb 15, 2017

@gafter gafter added this to the 8.0 candidate milestone Feb 22, 2017

@onovotny
Member

onovotny commented Feb 23, 2017

@gafter I thought we settled on

interface IAsyncEnumerable<T>
{
        IAsyncEnumerator<T> GetEnumerator(CancellationToken cancellation);
}

And potentially

static class AsyncEnumerableExtensions
{
    static IAsyncEnumerator<T> GetEnumerator(this IAsyncEnumerable<T> source) => source.GetEnumerator(CancellationToken.None);
}
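
A quick usage sketch of those two shapes, driven by hand since the language support is still being designed (source here is a hypothetical IAsyncEnumerable<int>, cancellationToken a CancellationToken in scope):

IAsyncEnumerator<int> withToken = source.GetEnumerator(cancellationToken); // interface method
IAsyncEnumerator<int> withoutToken = source.GetEnumerator();               // extension method, passes CancellationToken.None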

@juepiezhongren

juepiezhongren commented May 11, 2017

After the appearance of "IAsyncEnumerable" in Azure Service Fabric, we want more. We want an async implementation of the entire collection part of the .NET BCL: async LINQ (SelectAsync, ...), async parallel LINQ (AsAsyncParallel, ...), and await yield!

As ECMAScript is proposing async generators, .NET should deliver the equivalent. With these features implemented, .NET (C# and F#) would be the most developer-friendly platform for big data, and Service Fabric could grow a completely new ecosystem on a firm asynchronous programming foundation.

I wish these would come true sooner!

@oliverjanik

oliverjanik commented Jun 2, 2017

Any news on this? It seems this interface has already been defined in several places (IX, EF7).

@benaadams

benaadams commented Aug 22, 2017

As brought up in the Roslyn thread, this shape:

interface IAsyncEnumerator<T>: IAsyncDisposable
{
    Task<bool> MoveNextAsync();
    T Current { get; }
}

has issues with parallelization/concurrency/atomicity, which are desirable properties of an async system (e.g. double gets on Current, or missed items).
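
An illustrative sketch of the race (not from the thread; Process is a hypothetical helper): two consumers sharing one enumerator can interleave the MoveNextAsync/Current pair and observe the same item twice or skip one.

async Task ConsumeSharedAsync(IAsyncEnumerator<int> shared)
{
    while (await shared.MoveNextAsync())
    {
        // Another consumer may call MoveNextAsync between our await completing
        // and this read, so Current may already have advanced past "our" item.
        int item = shared.Current;
        Process(item); // hypothetical helper
    }
}

// Running two consumers over the same enumerator exposes the problem:
// await Task.WhenAll(ConsumeSharedAsync(e), ConsumeSharedAsync(e));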

@IvanKonov

IvanKonov commented Aug 22, 2017

Maybe then:

interface IAsyncEnumerator<T> : IAsyncDisposable
{
    Task<bool> GetNextAsync(out T next);
    Task<T> Current { get; }
}

@HaloFour
Contributor

HaloFour commented Aug 22, 2017

An async method can't have out parameters.

@IvanKonov

IvanKonov commented Aug 22, 2017

Another version:

interface IAsyncEnumerator<T> : IAsyncDisposable
{
    Task<(bool hasValue, T next)> GetNextAsync();
    Task<T> Current { get; }
}

@onovotny
Member

onovotny commented Aug 22, 2017

How about just adding out to async methods? That would make this far cleaner :trollface:

@vladd

vladd commented Aug 22, 2017

@IvanKonov This definition would make IAsyncEnumerator<T> not covariant in T.

@benaadams

benaadams commented Aug 22, 2017

An async method can't have out parameters.

but

Task<bool> GetNextAsync(out T next);

isn't async; it's just Task<bool>-returning, and you can await with out params, so no problem? 😉

async Task DoThing()
{
    var enumerator = new Enumerator<int>();
    while(await enumerator.GetNextAsync(out var next))
    {
        // Do thing with next
    }
}

struct Enumerator<T>
{
    public Task<bool> GetNextAsync(out T next)
    {
        next = default(T);
        return Task.FromResult(false);
    }
}

@yaakov-h
Contributor

yaakov-h commented Aug 22, 2017

Doesn't the out param have to be set before the task is returned? At that point, the task will most likely not be completed, so you won't have a "next" to pass as the out param. Then, by the time you have your next, it's too late to fill the out param.

@benaadams

benaadams commented Aug 22, 2017

Doesn't the out param have to be set before the task is returned?

😢 I'll go with @onovotny's suggestion instead then #43 (comment)

@stephentoub
Member

stephentoub commented Aug 22, 2017

Has issues with parallelazation/concurrency/atomicity which is a desirable property of an async system

While true, if you have a source that supports multiple readers, you can also simply change the model to one where each reader is given its own enumerator. Then the concurrency issues evaporate / are left up to the coordinator that's giving the items out to the individual enumerators.

I'll go with @onovotny's suggestion instead

I don't understand the suggestion. When I do:

Task<bool> t = enumerator.GetMoveNext(out T result);
Use(result);
await t;

how is that intended to work?

covariant

Note that most of this was already discussed in dotnet/roslyn#261.
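
A rough sketch of that model (an illustration of usage, not a proposed API; Use is a hypothetical helper): each consumer gets its own enumerator from the source, which acts as the coordinator handing out items, so no MoveNextAsync/Current pair is shared between consumers.

async Task ConsumeWithOwnEnumeratorsAsync(IAsyncEnumerable<int> source, int consumerCount)
{
    var consumers = new List<Task>();
    for (int i = 0; i < consumerCount; i++)
    {
        IAsyncEnumerator<int> enumerator = source.GetEnumerator(); // one enumerator per consumer
        consumers.Add(Task.Run(async () =>
        {
            while (await enumerator.MoveNextAsync())
            {
                Use(enumerator.Current); // hypothetical helper
            }
        }));
    }
    await Task.WhenAll(consumers);
}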

@onovotny
Member

onovotny commented Aug 22, 2017

I was trolling, but what if using out parameters with Task required use of await? Then the assignment order is guaranteed.

@stephentoub
Member

stephentoub commented Aug 22, 2017

I was trolling, but what if using out parameters with Task required use of await? Then the assignment order is guaranteed.

a) that would be a breaking change, b) there are actually scenarios where you want to access the out value after the synchronous return of the method, and c) anything can be made awaitable, so task isn't special in this regard.

@yaakov-h
Contributor

yaakov-h commented Aug 22, 2017

@stephentoub is there a Cliffs Notes version of that thread?

@stephentoub
Member

stephentoub commented Aug 22, 2017

In the near future I'm planning to take some time to write down where we are, which would include summarizing salient points from that thread.

@scottt732

scottt732 commented Aug 22, 2017

Have you fleshed out how CancellationTokens would work here? Is the intent to explicitly call GetEnumerator... something like foreach (await var item in ae.GetEnumerator(cancellationToken)) { ... }?

Task<bool> MoveNextAsync(CancellationToken cancellationToken) in the interface feels more consistent, but I gather that would require additional language changes to accommodate in foreach/while, LINQ query/method syntax.

Would it be possible to get the IEnumerable/LINQ and IObservable/Rx extension methods wired up to IAsyncEnumerable? Do IAsyncQueryable/IAsyncQbservable factor into the plans at all?

@Igorbek

Igorbek commented Aug 23, 2017

Discriminated unions would be useful here.

@markusschaber

markusschaber commented Sep 26, 2017

I think the best way to support parallel enumeration/iteration is to have one enumerator instance per consumer, with the enumerators internally knowing how to coordinate among themselves.

So we could define an interface IParallelEnumerable that yields an IParallelEnumeratorSource, which in turn produces an arbitrary number of IEnumerator<T> instances that internally coordinate among themselves; likewise, IAsyncParallelEnumerable would create IAsyncParallelEnumeratorSources that produce IAsyncEnumerator<T> instances.

We could possibly simplify the XXXParallelEnumeratorSource interfaces to IEnumerable<IXXXEnumerator<T>>.
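
A minimal sketch of the shapes described above, under the assumption that IAsyncEnumerator<T> keeps the MoveNextAsync/Current form; all of these names are illustrative, not a proposed API:

// Illustrative only: per-consumer enumerators handed out by a source that
// coordinates which items each one observes.
public interface IAsyncParallelEnumerable<T>
{
    IAsyncParallelEnumeratorSource<T> GetEnumeratorSource();
}

public interface IAsyncParallelEnumeratorSource<T>
{
    // Each consumer requests its own enumerator; coordination happens inside the source.
    IAsyncEnumerator<T> GetEnumerator();
}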

@stazz

stazz commented Sep 28, 2017

@mattwar It took me a while to understand the exact point of your message. I had to convert my UtilPack.AsyncEnumeration project to use the WaitForNextAsync / TryGetNext pattern and write some LINQ extension methods for the modified IAsyncEnumerator (just to test, learn, and investigate), and then I started to see it. The synchronous-callback LINQ (IAsyncEnumerable<T> Where<T>( this IAsyncEnumerable<T> enumerable, Func<T, Boolean> syncPredicate )) was easy, and I ended up with pretty much the same enumerator class as in your implementation. However, the asynchronous-callback LINQ (IAsyncEnumerable<T> Where<T>( this IAsyncEnumerable<T> enumerable, Func<T, Task<Boolean>> asyncPredicate )) was indeed more difficult, and that was when I realized what exactly your message was about.

The root cause is that with the WaitForNextAsync / TryGetNext pattern in the proposal, context is lost between invocations:

Task<bool> WaitForNextAsync();
T TryGetNext(out bool success);

However, if the method signatures of that pattern were changed like this (very close to the ones in my original message):

// This example uses Int64, but it could be any struct, really.
// Return null to signal that the enumerator has reached the end.
Task<Int64?> WaitForNextAsync(); // Or maybe ValueTask<Int64?> here?
T TryGetNext( Int64 waitToken, out bool success );

then the context could be preserved via the Int64 value. Obviously, this means that enumerators which intend to support multiple concurrent consumers need some sort of ConcurrentDictionary mapping Int64 tokens to T values. That does introduce some memory and complexity costs, but IMO supporting multiple concurrent consumers should be completely optional. Implementations that don't support concurrent consumers can be much simpler, since they can assume only one wait token is in use at a time and skip the ConcurrentDictionary.

I hope that if the WaitForNextAsync / TryGetNext pattern wins out over the MoveNextAsync / Current pattern in the proposal, it is modified so that TryGetNext knows which exact invocation of WaitForNextAsync it corresponds to (for example, by passing an Int64 wait token as in the sample above). Otherwise, the concurrent consumers mentioned as a minor benefit in the proposal would not be a real benefit, as that approach doesn't scale well when e.g. writing a LINQ library.

@markusschaber

markusschaber commented Sep 28, 2017

@stazz I think the price (in terms of complexity) is too high for those who don't need parallel enumeration. The proposal with multiple enumerators that I described has the advantage that only users who need parallel enumeration pay any price at all.
(I don't claim any ownership of that proposal; I saw it somewhere else but don't remember where.)

@stazz

stazz commented Sep 28, 2017

Ah yes, I forgot to mention in my previous message that another option would indeed be to make IAsyncEnumerator purely for sequential asynchronous enumeration and handle parallel/concurrent enumeration entirely via external libraries. That would be completely OK as well.

@markusschaber But I am curious - what exactly did you mean by the complexity price? From the consumer's point of view, it would just be changing this loop (a direct copy-paste from the proposal):

IAsyncEnumerator<T> enumerator = enumerable.GetAsyncEnumerator();
while (await enumerator.WaitForNextAsync())
{
    while (true)
    {
        int item = enumerator.TryGetNext(out bool success);
        if (!success) break;
        Use(item);
    }
}

into this loop:

IAsyncEnumerator<T> enumerator = enumerable.GetAsyncEnumerator();
Int64? waitToken;
while ((waitToken = await enumerator.WaitForNextAsync()).HasValue)
{
    while (true)
    {
        int item = enumerator.TryGetNext(waitToken.Value, out bool success);
        if (!success) break;
        Use(item);
    }
}

From a sequential producer's point of view, all it needs is one extra Int64 field, which could simply be Interlocked.Increment'ed on each successful WaitForNextAsync call; TryGetNext would then add one extra check that the given waitToken parameter matches the value stored in that field.

Or did you mean something else entirely? :)
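
For illustration, a minimal sketch of such a strictly sequential producer under the modified signatures above, wrapping a synchronous IEnumerator<T> purely to keep the example short; the class and its names are hypothetical:

class SequentialWaitTokenEnumerator<T>
{
    private readonly IEnumerator<T> _inner; // stand-in source, purely for the sketch
    private long _currentToken;
    private bool _hasPending;
    private T _pending;

    public SequentialWaitTokenEnumerator(IEnumerable<T> source) => _inner = source.GetEnumerator();

    public Task<long?> WaitForNextAsync()
    {
        if (!_inner.MoveNext())
            return Task.FromResult<long?>(null); // end of sequence

        _pending = _inner.Current;
        _hasPending = true;
        long token = Interlocked.Increment(ref _currentToken);
        return Task.FromResult<long?>(token);
    }

    public T TryGetNext(long waitToken, out bool success)
    {
        // Only one wait token is live at a time, so a single field and an equality
        // check replace the ConcurrentDictionary a concurrent implementation would need.
        if (_hasPending && waitToken == Interlocked.Read(ref _currentToken))
        {
            _hasPending = false;
            success = true;
            return _pending;
        }

        success = false;
        return default(T);
    }
}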

@markusschaber

markusschaber commented Sep 28, 2017

Exactly that extra Int64 field is what I meant.

It requires that non-parallel consumers also handle a token, which is overhead.

Additionally, it forces implementors to add indirection via the token and an internal lookup. When every consumer has its own IAsyncEnumerator instance, that indirection can be avoided, which is probably more efficient in some cases.

@jnm2
Contributor

jnm2 commented Sep 28, 2017

Additionally, it forces implementors to have some indirection via the token and an internal lookup.

Yeah, no way. This isn't about parallelization; it's about a non-blocking version of the single-consumer IEnumerable idiom.

@stazz

stazz commented Sep 28, 2017

@markusschaber Ahh, I see. Fair enough! I was thinking that one Int64 field is not that bad (especially since asynchrony usually involves I/O, which tends to be more of a bottleneck than CPU/memory), but I agree that it is overhead. 👍 Oh well, let's see how this proposal develops - it might still end up with the MoveNextAsync / Current pattern anyway.

@benaadams

benaadams commented Dec 13, 2017

Async filesystem enumeration would be an example use case, as there are currently no async .NET filesystem APIs.

@NetMage

NetMage commented Jan 12, 2018

Personally I would really prefer flipping TryGetNext around:

Task WaitForNextAsync();
bool TryGetNext(out T nextVal);

@benaadams

benaadams commented Jan 12, 2018

Personally I would really prefer flipping TryGetNext around:

Discarded options considered:

Task<bool> WaitForNextAsync(); bool TryGetNext(out T result);: out parameters can't be covariant. There's also a small impact here (an issue with the try pattern in general) that this likely incurs a runtime write barrier for reference type results.

Probably should be When rather than Wait? When is normally preferred over Wait for async APIs:

Task<bool> WhenNextAsync();

@bbarry
Contributor

bbarry commented Jan 12, 2018

I like the "viable alternative"

namespace System.Collections.Generic
{
    public interface IAsyncEnumerable<out T>
    {
        IAsyncEnumerator<T> GetAsyncEnumerator();
    }

    public interface IAsyncEnumerator<out T> : IAsyncDisposable
    {
        Task<bool> WaitForNextAsync();
        T TryGetNext(out bool success);
    }
}

Is it being considered to implement a pattern-based version instead of explicitly settling on specific interfaces? Something like:

TAsyncEnumerable<out TItem>
{
    TAsyncEnumerator<TItem> GetAsyncEnumerator();
}

TAsyncEnumerator<out TItem>
{
    TTaskLike<bool> WaitForNextAsync();
    TItem TryGetNext(out bool success);
    TTaskLike DisposeAsync();
}

@alrz
Contributor

alrz commented Jan 17, 2018

Currently, using statements swallow the try-block exception if Dispose also throws, and async methods do not aggregate exceptions even though they throw AggregateException. Are we going to change that behavior for using await, or is DisposeAsync supposed to be error-free? I suspect async methods are more likely to fail since there are more moving parts involved, so it could be harder to maintain an implementation safe enough to take advantage of using await rather than falling back to a nested try/finally as a precaution in critical code.

Another question for async streams: what if the iteration itself is not async but I have an await somewhere?

async IAsyncEnumerable<T> IteratorAsync() {
  using (var reader = await ExecuteReader()) {
    while (reader.Read())
      yield return RowParser(reader);
  }
}

Currently I can accomplish that with Task<IEnumerable<T>> and an iterator local function. MoveNextAsync for this example is unnecessary. Do we allow Task<IEnumerable<T>> as the return type in these situations?

@quinmars

quinmars commented Jan 19, 2018

@alrz I guess that would be very tricky. Who will dispose the reader? I think that will only work with buffering of the complete list, or with IAsyncEnumerable<T>. The first MoveNextAsync will take some time because the reader needs to be awaited; the following MoveNextAsync calls will be fast. For those, the state machine can simply return Task.FromResult(true) or Task.FromResult(false), or even better, cached or global instances of them.
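
For reference, a tiny sketch of the caching mentioned above (names are illustrative), so that synchronously completing MoveNextAsync calls don't allocate a new task each time:

static class CachedBoolTasks
{
    // Reused for every MoveNextAsync call that completes synchronously.
    public static readonly Task<bool> True = Task.FromResult(true);
    public static readonly Task<bool> False = Task.FromResult(false);
}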

@alrz
Contributor

alrz commented Jan 19, 2018

RE "Who will dispose the reader" The iterator.

async Task<IEnumerable<T>> IteratorAsync() {
  var reader = await ExecuteReader();
  IEnumerable<T> Iterator() {
    using (reader)
      while (reader.Read())
        yield return RowParser(reader);
  }
  return Iterator();
}

It's not always easy to work around this; it would be nice if the compiler could see that the iterator is not awaiting and permit Task<IEnumerable<T>> as the return type.

@quinmars

quinmars commented Jan 19, 2018

@alrz I see. But it's easy to overlook that you will leak resources if you never enumerate the returned IEnumerable<T>.

@bbarry
Contributor

bbarry commented Jan 21, 2018

If the alternative interfaces are used, I don't think there is any benefit to permitting Task<IEnumerable<T>> (maybe 1 virtual call? If the pattern form was used they could potentially even be inlineable).

@TylerBrinkley

TylerBrinkley commented May 16, 2018

I was watching the Build demo of this feature and was wondering how we can specify .ConfigureAwait(false) for async disposal and async enumeration?

@TylerBrinkley

TylerBrinkley commented May 16, 2018

Never mind, it appears this is being considered in the proposal.

@svick
Contributor

svick commented May 24, 2018

From the May 21, 2018 LDM notes:

foreach await over dynamic

Block it. For synchronous foreach we resort to the nongeneric IEnumerable, but there is no nongeneric IAsyncEnumerable, and there won't be.

Since the IAsyncEnumerable interface is going to be covariant, would it make sense if foreach await over dynamic worked with IAsyncEnumerable<object>? It would work only for async enumerables of reference types, but maybe that's better than nothing?
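
To illustrate the covariance point (assuming the covariant IAsyncEnumerable<out T> shape; GetStrings is a hypothetical source):

IAsyncEnumerable<string> strings = GetStrings(); // hypothetical source of strings
IAsyncEnumerable<object> objects = strings;      // fine under covariance (out T)
// Value types would not convert: IAsyncEnumerable<int> is not IAsyncEnumerable<object>,
// so foreach await over dynamic via IAsyncEnumerable<object> would cover reference types only.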

@Neme12

Neme12 commented Sep 27, 2018

How will async disposables interact with #1174? Will there be an async form of that?
