
Is there a renamed implementation of Particle.subscribe("particle/device/name")? #298

Closed
mrferrar opened this issue Feb 4, 2022 · 5 comments

Comments

mrferrar commented Feb 4, 2022

I have noticed a useful feature on Particle's server where a Photon's call to Particle.subscribe("particle/device/name", ...); triggers a particle/device/name event that the device can listen for in order to get its server-assigned name.
Does this feature have a different trigger on this server?
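For reference, the recipe documented for the official Particle cloud pairs that subscribe with a publish of the same event name; the cloud then answers with an event whose data is the device's name. A minimal sketch of that pattern (the handler name and serial logging are my own illustration, and exact behavior may vary by Device OS version):

```cpp
#include "Particle.h"

// Device-name lookup as documented for the official Particle cloud.
// nameHandler and the Serial output are illustrative, not part of this issue.
void nameHandler(const char *topic, const char *data) {
    // `data` carries the cloud-assigned device name
    Serial.printlnf("received %s: %s", topic, data);
}

void setup() {
    Serial.begin(9600);
    Particle.subscribe("particle/device/name", nameHandler);
    // Publishing the same event name asks the cloud to reply with the name
    Particle.publish("particle/device/name");
}

void loop() {
}
```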

jlkalberer commented Feb 4, 2022

It does, but it's using the legacy command for it -- spark/device/name.
I haven't updated any of the commands since Particle started porting them from the "spark" prefix to the "particle" prefix.
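So, assuming the same subscribe-then-publish flow carries over to the legacy prefix, the device-side calls against this server would presumably be (a hedged sketch only, reusing the nameHandler from the sketch above):

```cpp
// Legacy-prefixed variant for this server (pre-fix behavior); assumes the
// request/response flow mirrors the "particle/" example above.
Particle.subscribe("spark/device/name", nameHandler);
Particle.publish("spark/device/name");
```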

mrferrar commented Feb 5, 2022

I did try that, though annoyingly the Photons seem unable to subscribe to any events with the 'spark' prefix. Thanks for confirming.

jlkalberer reopened this Feb 5, 2022

jlkalberer commented Feb 5, 2022

Ok -- I can put out an update to fix this.

jlkalberer added a commit that referenced this issue Feb 5, 2022
jlkalberer commented Feb 5, 2022

OK -- this should be fixed, though I haven't tested it.

Make sure you update your node modules in order to get the latest version of spark-protocol.

mrferrar commented Feb 7, 2022

That worked, thank you!

jlkalberer added a commit that referenced this issue Feb 18, 2024
* Added npm functions for example build and run, changed eventProvider sample

* Changed sample

* changed ignore on npm build:examples

* patched sample to working code

* add prettier, add gitAttributes to hide dist in commit diffs

* Changed info in package.json

* Added Bunyan Logger

* addressed last nit from last PR

* Added Changes from jlkalberer

* Added Changes AntonPuko

* Added WebHookLogger Log Line

* remove gitAdd from npm scripts, it's lint-staged's own command

* update package-lock.json, closes #229

* remove wrong logger.warn in webhookManager

* remove wrong logger.warn in webhookManager

* update package-lock.json, closes #234

* update migrateScript, so it saves dates correctly.

* pass non-JSON event.data as string for webhooks as form-url-encoded

* update build

* stringify device variables

* try/catch around device attributes as JSON deserializer was breaking on existing devices.

* Update to newer version of spark protocol

* build deviceAttributesDatabaseRepository.js

* Starting work on products API.
Still need to add in device endpoint and update the firmware manager so it will auto-update devices.

* fix adminUser creation

* Finished API for managing firmware/devices/products.

Now we need to actually add code for flashing the correct firmware when the device connects.

* Added changes so that when a product firmware is updated, it sends an event to the device server to start updating all connected devices.

* Rebuilt dist

* Fixed DB so it fetches correctly for mongo.

* fix variables loss on update device name.

* remove console.logs; add webhookLoggerStub

* add bunyan to start scripts

* add DeviceManager to ProductsController bindings

* provide webhookData as querystring for GET requests

* update package-lock.json to get last spark-protocol changes.

* remove comments

* add deviceManager assignment in productsController, closes #245

* update package-lock.json for spark-protocol change: Brewskey/spark-protocol@3472b1a

closes #244

* update package-lock.json for spark-protocol change: Brewskey/spark-protocol@b5ede6d

* update package-lock.json for spark-protocol change: Brewskey/spark-protocol@2e1dde3

* update package-lock.json for spark-protocol change: Brewskey/spark-protocol@6e83a46

* product api fixes

* revert casting product_id to string on product creation, use helper function for getting numeric productID from IDOrSlug

* add equals to 0 check for right platform_id filtering.

* fix _formatProduct in ProductDatabaseRepository

* add parseInt for firmware version comes from route params.

* formatDeviceAttributes, add product_id to returned product devices props

* build files

* use isNaN in getByIDOrSlug

* fix flashProductDevices method, fix csv parsing

* save product_id instead product.id in productDevice

* cast productFirmware buffer when save to/ get from db.

* update flow-bin, fix some product related flow errors, there are still some

* fix bugs, add temporary hack for getting right buffer from different dbs

* add devicePing endpoint

* disable userFilter for admin in eventsController

* return keep alive for event stream

* addproductDevices small fixes, rebuild files

* Fixed bindings...?

* Updated products controller to flash when devices are added to an existing product.
Fixed some flow errors.

* Always flash new devices when they are added to a product.

* Not sure why these changes weren't checked in

* bump package-lock.json

* bump package-lock.json again with rebuilt spark-protocol

* bump spark-protocol version in package-lock for that change Brewskey/spark-protocol@566fb0a

* Updated README
FirmwareCompilationManager doesn't depend on external files.
Flow fixes.

* update package-lock.json for that change Brewskey/spark-protocol@3f2bf42, fix #265

* bump spark-protocol version

* set current === false for existing released firmware on update/addnew firmware

* remove missing api features from readme

* fix current is undefined

* Fixed updates on product firmware

Fixes #266

* Rebuild

* bump package-lock.json

* fix some products endpoint payloads under v2 routes

* remove page and pageSize from db.find()

* fix default skip/take value in mongodb

* Updated ec-key
Now using yarn

* Update yarn.lock

* and again..

* ............

* This should be the one

* Update yarn lock

* parse skip and take to integer for mongo

* case insensitive search for getByIDs, should close #279

* add paging for v2 getFirmwares, add countFirmwares, fix find() with take = 0 return all entities

* remove test endpoint

* remove default take value, better skip/take checks

* move firmwares endpoints to its own controllers

* revert case sensitive search for getManyFromIDs for now

* Fixed issue with adding products. If the device had never connected to the cloud, the platformID wasn't set on DeviceAttributes.

* Update spark-protocol version

* Update yarn.lock

* ...

* add getFirmware route

* fix event stream format under eventsControllerV2

* Fixes to productdevices query
Always `toLowerCase` when working with particle device IDs. There are cases where we query/save without doing this so I'll write a migration script later.

* Fix ProductDevices return value so it always returns `id`. If attributes are null, it doesn't return the value.

* remove deviceAttributes check on delete productDevice

* Update firmware queries so device_count queries the database instead of storing the value.

* Fix bindings. The ProductFirmwareController didn't work because it wasn't getting all the parameters injected.

* The DeviceAttributeDatabaseRepository was patching instead of updating. We need to get all the existing data in order to know the device product firmware version.

* Trying to get the firmware version into productdevices...

* doh.. forgot to use `await`

* Return `notes` and other fields from ProductDevices API

* You should be able to toggle quarantined devices even if they haven't connected yet.

* send \n on SSE connection start

* Updated yarn dependencies

* Updated spark-protocol version

* Updated spark-server dependency.

* Updated dependencies

* Fixing logging in WebhookManager

* Updating logging

* Updating logging to be more consistent and include deviceID if possible

* Fix WebhookManager.test.js

* Fixing exception in website manager

* More logging fixes

* More webhook logging for errors.

* Add additional SSE header

* Update protocol version

* Update spark-protocol that supports "@" in binary filenames.

* Update events controller to return the proper format :/

* ...

* Update SSE to return UTF-8 encoding header.

* Removing changes to event stream. We want the default message type to be used so we only have to subscribe to one thing.

* revert keepalive.

* There was an unimplemented function on this repository.. this will fix firmware updates.

* Fixes #290

* Add support for name or device ID on endpoints for #293

* One more stab at #293

* Fixing flow types

* Upgraded spark dependency

* Update README.md

* Update README.md

* Upgraded yarn.lock

* Upgrade spark-protocol

**Make sure to run `yarn upgrade-firmware` in order to get OTA update fixes**

* Upgraded dependencies to get new fix for flashing larger binaries

* Updated ec-key and rebuilt

* Hopefully fixes #295

* Update README.md

* Update README.md

* Upgrade spark-protocol

* Fixed all flow types

Server seems to be running but I'll need to test

* Fixing scripts for npm

* Updated to latest spark-protocol

* Updated dependencies

* Update reference to spark-protocol. Fixes crypto errors.

* Updated CORS to run for all requests

* Adding index generation

* Updated server to use fixed spark-protocol
Added support for a settings.json file to override settings.

* Rebuild server
Updated link in readme

* Fixing #298

* Fixed events controller crashing the whole server and added some logging.

* build server

* Upgrade dependencies

* Remove extra logging

* Upgrade spark-protocol for connection throttling.

* Upgrade spark protocol for OTA update fixes.

* Update spark-protocol

* Update spark-protocol

* Upgrade spark-protocol

* update spark-protocol

* upgrade spark-protocol

* Fix coap functions

* Update spark-protocol

* Update spark protocol

* Update spark-protocol

* update spark-protocol

* Fixed binding for config
Updated spark-protocol

* Update packages

* Remove postinstall step

* Update dependencies

* Update mongodb version

* Fix mongo collection call

* Get db from client

* Fix database

* Updated spark-protocol

* update deps

* update

* bump

* bump

* bump

* bump

* bump

* bump

* bump

* bump

* bump

* bump

* bump

* bump

* bump

* Migrated everything to typescript

* Add improved logging

* Redacted token from authorization header.

* Improve types

* bump

* bump

* Fix mkdirp

* Delete generated files

* Migrate to lerna + workspaces

* Working on publish workflow

* Trying to fix npm ci in GH action

* fix commands

* Remove remote e2e tests

* Make private false

* public access

* only publish dist folder

* Publish spark-protocol too

* Bump version

* fix publish.yml

* fixing exports

* Remove crc check when updating firmware

* Bump packages

* Move particle-collider

* Fix webhooks
Bump modules for publish

---------

Co-authored-by: Andreas <andreas.haeferer@keatec.com>
Co-authored-by: Andreas Häferer <ahr@ttDev.local>
Co-authored-by: Anton puko <antonpukowebdev@gmail.com>
Co-authored-by: Anton Puko <AntonPuko@users.noreply.github.com>
Co-authored-by: AntonPuko <stinger_anton@mail.ru>