Pushing Events through HTTP API to subscribed core not working #30

Open

chron0 opened this issue Oct 27, 2014 · 0 comments

chron0 commented Oct 27, 2014

I want my core to subscribe to certain events, say "alerts", and then execute whatever I defined in the associated event handler. This should not be bound to a coreID: I want any core to react to all events named "alerts" once it has subscribed to that event name. However, when I POST to /v1/devices/events using the name "alerts", the event never actually gets pushed to the core. After adding a lot of logger.log calls in different parts of the code, I think I have isolated the problem to node_modules/spark-protocol/clients/SparkCore.js:

    // excerpt from the onCoreSentEvent handler in SparkCore.js
    try {
        if (!global.publisher) {
            logger.error('No global publisher');
            return;
        }

        if (!global.publisher.publish(isPublic, obj.name, obj.userid, obj.data, obj.ttl, obj.published_at, this.getHexCoreID())) {
            // this core is over its limit, and that message was not sent
            this.sendReply("EventSlowdown", msg.getId());
            logger.log('EventSlowdown triggered ' + this.getHexCoreID());
        }
        else {
            this.sendReply("EventAck", msg.getId());
            logger.log("onCoreSentEvent: sent to " + this.getHexCoreID());
        }
    }

It seems to me that global.publisher.publish always runs into this limit. I haven't fully understood how the publisher works, I have some trouble interpreting the code, and I might just be doing something completely wrong. If anyone else has something like this working, any advice or example is welcome; otherwise it feels like a bug to me :)
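
For reference, this is the publish contract I read out of the excerpt above: returning false means the sending core is over its rate limit and the event is dropped with an EventSlowdown reply, returning true means the event was forwarded to subscribers. This is just my hypothetical sketch of that behavior, not the actual spark-protocol publisher:

    // Hypothetical sketch only, NOT the spark-protocol implementation:
    // publish() returns false when the sender is over its limit,
    // true when the event was handed to all matching subscribers.
    function Publisher() {
        this.subscribers = [];
    }

    Publisher.prototype.subscribe = function (name, handler) {
        this.subscribers.push({ name: name, handler: handler });
    };

    Publisher.prototype.publish = function (isPublic, name, userid, data, ttl, publishedAt, coreId) {
        // a real implementation would check a per-core rate limit here
        // and return false when the core is over it
        this.subscribers.forEach(function (sub) {
            if (sub.name === name) {
                sub.handler(name, data, ttl, publishedAt, coreId);
            }
        });
        return true;
    };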

To make it clearer: I don't want the cores to subscribe/react to events from other cores, I just want them to subscribe to a designated channel "alerts" and decide what to do depending on the event data. The trigger should be a simple POST through the spark-server API (as defined in the spark-server docs) so that hubot scripts or whatever else can trigger these events; a sketch of that trigger request follows below.
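
For illustration, this is roughly what the trigger looks like as a plain Node.js request. The host, port, and token are placeholders for my local setup, and I'm assuming the same form parameters as the Spark cloud API (name, data, ttl, access_token):

    // Minimal sketch of the trigger POST, assuming a local spark-server
    // on localhost:8080 and a valid access token (placeholder values)
    var http = require('http');
    var querystring = require('querystring');

    var body = querystring.stringify({
        name: 'alerts',            // the event name the cores subscribed to
        data: 'disk space low',    // arbitrary payload for the event handler
        ttl: 60,                   // seconds the event stays valid
        access_token: 'MY_ACCESS_TOKEN'
    });

    var req = http.request({
        host: 'localhost',
        port: 8080,
        path: '/v1/devices/events',
        method: 'POST',
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded',
            'Content-Length': Buffer.byteLength(body)
        }
    }, function (res) {
        res.pipe(process.stdout);  // print the API response
    });

    req.end(body);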

lbt pushed a commit to lbt/spark-server that referenced this issue Feb 28, 2017