Async iteration #1
I must admit, after writing so much functional programming:

```js
async function bench() {
  for (const msize of messages)
    for (const csize of clusters)
      for (const lsize of clients) {
        const details = { msize, csize, lsize, records, connections }
            , cluster = await createCluster(details)
            , clients = await createClients(details)
            , results = await test(cluster, clients)
        save(details, cluster, clients, stamps)
        values(clients).map(d => d.kill())
        values(cluster).map(d => d.kill())
      }
  log(str(stamps))
}
```
The initial code I wrote for `stable`:

```js
const stable = peers => new Promise(resolve => {
  const start = process.hrtime()
  resolve = debounce(1000)(resolve)
  peers.map(peer => peer.on('message', ({ checksum }) => {
    if (checksum != checksums[peers.length])
      return log('cluster unstable'.red, checksums[peers.length], '/', checksum)
    peer.stable = process.hrtime(start)
    log('cluster peer stable'.yellow, checksum)
    if (peers.every(by('stable'))) {
      log('cluster stable'.green, '(', peers.length, ')')
      resolve(peers)
      peers.map(peer => peer.removeAllListeners())
    }
  }))
})
```

It waits till all peers have the same checksum (a hash of the peers). This is much more readable in the form below imho. Also notice the return value, which is a stream we manipulate — but since we are only interested in the first and only time it emits (i.e. the cluster converged), it can be awaited:

```js
async function stable(peers) {
  const start = process.hrtime()
      , combined = emitterify()
      , changes = combined.on('change')

  // combine checksum changes from each peer into one stream
  peers
    .map(peer => peer.on('checksum', ({ checksum }) => combined.emit('change', { peer, checksum })))

  // log if a peer has yet to discover all other peers
  changes
    .filter(({ checksum }) => checksum != checksums[peers.length])
    .map(({ checksum }) => log('cluster unstable'.red, checksums[peers.length], '/', checksum))

  // log and record the time it took a peer to discover all other peers
  changes
    .filter(({ checksum }) => checksum === checksums[peers.length])
    .filter(({ peer }) => (peer.stable = process.hrtime(start)))
    .map(({ checksum }) => log('cluster peer stable'.yellow, checksum))

  // resolve when all peers have discovered all other peers
  return changes
    .filter(d => peers.every(by('stable')))
    .map(d => log('cluster stable'.green, '(', peers.length, ')'))
    .map(d => peers)
}
```

This library should also play nice with `for await (const { peer, checksum } of cluster.on('change'))`. Do you think there is no need for this library or similar once
---

My point wasn't so much that your library is useless, but that it's probably useless to try and come up with a unified abstraction for async or sync streams, because that problem is solved by the AsyncIterator and Iterator protocols respectively. There is probably value in having functions that operate on iterables however, like so:

```js
let numbers = numbersFromSomewhere()
let squared = await map(numbers, n => n * n)
let even = await filter(squared, n => n % 2 === 0)
```

Then you can compose, partially apply, thread, or do any of the other nice functional thingies you may want, which you can't really do with a dot-style fluent API.
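A minimal sketch of what such functions could look like — the eager, array-returning semantics and the async-generator source here are assumptions for illustration, not any particular library's API. Since `for await` accepts both sync and async iterables, the same functions cover both cases:

```js
// Eager map/filter over (a)sync iterables, collecting into arrays.
async function map(iterable, fn) {
  const out = []
  for await (const value of iterable) out.push(fn(value))
  return out
}

async function filter(iterable, predicate) {
  const out = []
  for await (const value of iterable)
    if (predicate(value)) out.push(value)
  return out
}

// Stand-in for numbersFromSomewhere():
async function* numbersFromSomewhere() {
  yield* [1, 2, 3, 4]
}

// map(numbersFromSomewhere(), n => n * n) resolves with [1, 4, 9, 16]
```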
---

But I must digress. As a side note, I don't much care for the `async`/`await` design. This is what we have now:

```js
async function downloadThings() {
  let first = await download(firstThing)
  let second = await download(secondThing)
  let third = download(thirdThing)
  return { first, second, third }
}

/* We also don't even have top-level await (at least not yet) so
 * in order to use the above I have to do this nonsense:
 */
downloadThings().then(async ({ first, second, third }) => {
  console.log(first, second, await third)
})
```

But really, it should just be:

```js
function downloadThings() {
  let first = download(firstThing)        // Implicitly awaited
  let second = download(secondThing)      // Implicitly awaited
  let third = async download(thirdThing)  // Returns promise
  return { first, second, third }
}

let { first, second, await third } = downloadThings() // Implicitly awaited
console.log(first, second, third)
```

Sure, it'd probably be difficult for engines to optimize, but at least they are compilers – I'm not. Not to mention the inconsistency in having
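For what it's worth, the concurrent behaviour this hypothetical syntax is after can be approximated with today's syntax by starting all the downloads up front and awaiting selectively. A sketch, with `download` and the things stubbed out as assumptions:

```js
// Stand-in for a real download function: resolves asynchronously.
const download = thing => Promise.resolve(`contents of ${thing}`)

async function downloadThings() {
  // start all three immediately so they run concurrently...
  const first = download('firstThing')
  const second = download('secondThing')
  const third = download('thirdThing')
  // ...then await only the ones needed now, handing third back as a promise
  return { first: await first, second: await second, third }
}

// downloadThings() resolves with first/second settled and third still pending
```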
---

Agreed, I think there's room for improvement in some of the newer APIs too. In contrast to Observables, I see AsyncIterator more like an interface that this library could (and hopefully will!) implement. In terms of the example below, the main problem I see with this API is that the second line will wait till completion until any of the values in the third line are processed:

```js
let numbers = numbersFromSomewhere()
let squared = await map(numbers, n => n * n)
let even = await filter(squared, n => n % 2 === 0)
```

Whereas ideally you want a way to manage concurrency across many streams. A small core with just
---

I suppose you meant the other way around. Anyway, this is the reason why the Clojure folk came up with transducers: decoupling the transformation from the collection.
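The transducer idea can be sketched in a few lines of JavaScript — a toy version for illustration, not Clojure's actual implementation: `map` and `filter` become reducer-wrappers that know nothing about the collection being reduced.

```js
// A transducer is a function that wraps a reducer with a transformation step.
const mapping = fn => reducer => (acc, value) => reducer(acc, fn(value))
const filtering = predicate => reducer => (acc, value) =>
  predicate(value) ? reducer(acc, value) : acc

// compose right-to-left, as with function composition
const compose = (...fns) => x => fns.reduceRight((acc, fn) => fn(acc), x)

// the transformation, defined with no reference to any collection
const xform = compose(
  mapping(n => n * n),
  filtering(n => n % 2 === 0)
)

// apply it to an array by wrapping an ordinary push reducer
const push = (acc, value) => (acc.push(value), acc)
const result = [1, 2, 3, 4].reduce(xform(push), [])
// result is [4, 16]
```

The same `xform` could just as well wrap a reducer that feeds a stream or a channel, which is the decoupling being described.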
---

Yeah, sorry, I wasn't very clear: the third line will wait till the stream on the second line completes (which might be infinite). Re: transducers: you could compose an algorithmic transformation and pass it to
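One way to avoid that wait within async iteration itself is to make `map` and `filter` lazy async generators, so each value flows through the whole pipeline as it arrives — a sketch of that approach, not this library's API:

```js
// Lazy variants: nothing here waits for the source to complete.
async function* map(iterable, fn) {
  for await (const value of iterable) yield fn(value)
}

async function* filter(iterable, predicate) {
  for await (const value of iterable)
    if (predicate(value)) yield value
}

// An infinite source...
async function* naturals() {
  for (let n = 1; ; n++) yield n
}

// ...consumed incrementally: take the first three even squares and stop.
async function firstThreeEvenSquares() {
  const out = []
  for await (const n of filter(map(naturals(), n => n * n), n => n % 2 === 0)) {
    out.push(n)
    if (out.length === 3) break
  }
  return out
}
// firstThreeEvenSquares() resolves with [4, 16, 36]
```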
---

This is now done by implementing the async iterator protocol. So now you can pull from a channel like so:

```js
// blocks until a new value is put on the channel
threads.consumer = async chan => {
  for await (const value of chan)
    if (results.push(value) == 10) break
}
```

and a producer can respond to pull signals, which is another event stream on the channel itself:

```js
// puts a new value on the channel as the consumer pulls
threads.producer = async (chan, i = 0) => {
  for await (const d of chan.on('pull'))
    chan.next(++i)
}
```

which means producers/consumers can now co-operate:

```js
const chan = o.on('foo')
threads.producer(chan)
threads.consumer(chan)

expect(results).to.be.eql([1,2,3,4,5,6,7,8,9,10])
```

@mstade - I'd be interested to hear your thoughts on this :)
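To make the mechanics concrete, here is a minimal, hypothetical channel — not the library's actual implementation. The real channel exposes pull signals as an event stream via `.on('pull')`; this sketch simplifies that to a single `onpull` callback, keeping just the two essentials: the channel is async-iterable for the consumer, and each pull notifies the producer.

```js
// Toy channel: async-iterable, with a pull hook a producer can respond to.
function channel() {
  let pending = null
  return {
    // producer side: deliver the next value to the waiting consumer
    next(value) {
      if (pending) {
        const resolve = pending
        pending = null
        resolve({ value, done: false })
      }
    },
    // consumer side: each iteration waits for a value and signals a pull
    [Symbol.asyncIterator]() {
      const chan = this
      return {
        next() {
          return new Promise(resolve => {
            pending = resolve
            if (chan.onpull) chan.onpull() // tell the producer we pulled
          })
        }
      }
    }
  }
}

const results = []
const chan = channel()

// producer: put the next number on the channel on every pull signal
let i = 0
chan.onpull = () => chan.next(++i)

// consumer: pull ten values then stop
async function consume() {
  for await (const value of chan)
    if (results.push(value) === 10) break
  return results
}
// consume() resolves with [1,2,3,4,5,6,7,8,9,10]
```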
---

For what it's worth, async iteration is coming, and I've basically resigned myself to the idea that this is now how streams in JavaScript will look. Oh well.