getEventStream stops returning events after a long time #72
Comments
Not that this is a fix, but you can explicitly restart it if it gets an `end` event. This is in some of the examples:

```js
function openStream() {
  const req = Spark.getEventStream(false, 'mine', function (...) { ... });
  req.on('end', function () {
    console.warn('Spark event stream ended! re-opening in 3 seconds...');
    setTimeout(openStream, 3 * 1000);
  });
}
```
Bump. Is there a way to explicitly kill the current stream?
This is the key snippet I was using in conjunction with Node Inspector to play around:

```js
const stream = particle.getEventStream(false, 'mine', handler);

// Sometimes the stream ends; we want to be sure to restart it
stream.on('end', () => {
  console.log('Particle event stream ended! Restarting in 3 seconds...');
  // TODO track the stream closing event
  setTimeout(openEventStream, 3000);
});

setTimeout(() => {
  const s = stream; // For some reason `stream` was not in the debugger scope unless I reassigned it here
  debugger;
  // Calling s.end() did nothing to stop the stream; the earlier 'end' listener was not fired
  // Calling s.destroy() did nothing to stop the stream either
}, 10000);
```
Give `req.abort()` a try.
Yeah, that did the trick. What the hell, how is that not documented in the Node Stream API?
Ah right, thank you. Well, not documented in their API either 😕 Here's the open issue asking for it
Great. Thanks @brycekahle
Original issue:
I have a motion detector attached to my Spark Core that outputs events like:
I'm using this script to monitor it:
It stops logging events after a long time (a day or two). I noticed the same issue when using the EventSource JS module instead of the official sparkjs module. I know it's a problem with the client, because if I restart the client script, it continues chugging along.