Error with handshake process #295

Open
xeanhort opened this issue May 5, 2020 · 32 comments
Comments

xeanhort commented May 5, 2020

Hi,

I've just tried the build steps, but I'm running into trouble during the authentication process.

The error comes when the Particle device tries to connect to the cloud, but if I look at deviceKeys.db, the data for that device is properly stored:

{"deviceID":"XXXXXXXXXXXXXXXXXXXXXXXX","algorithm":"ecc","key":"-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAETNP2mzOQyygIo+2O+WnooEWcEeCH\ncQvvydppV0FlxAc8V4Jhe3g+SokUyjgQkiIHI+BmrUJVumMizhydarG1Ug==\n-----END PUBLIC KEY-----\n","_id":"XvCn29Nb2MMOiDuI"}

The error occurs when comparing publicKey.equals(deviceProvidedPem) at Handshake.js:383 in spark-protocol (I posted the issue here because I don't really know the root cause).
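
For context, DeviceKey.equals in spark-protocol constructs an ec-key ECKey from the PEM before doing the comparison (see DeviceKey.js and ec-key.js in the stack trace below), so the exception fires while parsing the key the device sent rather than in the comparison itself. A minimal sketch of that parse step (the variable names are illustrative, not the library's):

const ECKey = require('ec-key');

// storedPem is the key from deviceKeys.db; deviceProvidedPem is the key the
// device sends during the handshake.
const storedKey = new ECKey(storedPem, 'pem');          // parses fine
const deviceKey = new ECKey(deviceProvidedPem, 'pem');  // throws: Failed to match tag: "objid"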

The error log shows this:

[2020-05-05T18:17:17.694Z] INFO: DeviceServer.js/7443 on CX61-2PC: New Connection
[2020-05-05T18:17:18.723Z] ERROR: Handshake.js/7443 on CX61-2PC: Handshake failed (cache_key=_4, deviceID=XXXXXXXXXXXXXXXXXXXXXXXX, ip=::ffff:XX.XX.XX.XX)
Error: Failed to match tag: "objid" at: ["algorithmIdentifier"]["parameters"]
at DecoderBuffer.error (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/reporter.js:53:11)
at DERNode.decodeTag [as _decodeTag] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/decoders/der.js:67:19)
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:302:23)
at decodeChildren (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:332:15)
at Array.some ()
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:329:33)
at decodeChildren (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:332:15)
at Array.some ()
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:329:33)
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:263:47)
at SpkiKey.decode (/media/user/spark-server/node_modules/asn1.js/lib/asn1/decoders/der.js:25:20)
at Entity.decode (/media/user/spark-server/node_modules/asn1.js/lib/asn1/api.js:47:32)
at parseSpki (/media/user/spark-server/node_modules/ec-key/src/ec-key.js:158:27)
at parsePem (/media/user/spark-server/node_modules/ec-key/src/ec-key.js:184:12)
at new ECKey (/media/user/spark-server/node_modules/ec-key/src/ec-key.js:237:15)
at new DeviceKey (/media/user/spark-server/node_modules/spark-protocol/dist/lib/DeviceKey.js:32:25)
at DeviceKey.equals (/media/user/spark-server/node_modules/spark-protocol/dist/lib/DeviceKey.js:75:20)
at Handshake._callee5$ (/media/user/spark-server/node_modules/spark-protocol/dist/lib/Handshake.js:383:29)
at tryCatch (/media/user/spark-server/node_modules/regenerator-runtime/runtime.js:62:40)
at Generator.invoke [as _invoke] (/media/user/spark-server/node_modules/regenerator-runtime/runtime.js:296:22)
at Generator.prototype.(anonymous function) [as next] (/media/user/spark-server/node_modules/regenerator-runtime/runtime.js:114:21)
at step (/media/user/spark-server/node_modules/babel-runtime/helpers/asyncToGenerator.js:17:30)
at /media/user/spark-server/node_modules/babel-runtime/helpers/asyncToGenerator.js:28:13
at
[2020-05-05T18:17:18.723Z] INFO: Device.js/7443 on CX61-2PC: Device disconnected (cache_key=_4, deviceID="", disconnectCounter=1)
[2020-05-05T18:17:18.723Z] ERROR: DeviceServer.js/7443 on CX61-2PC: Device startup failed (deviceID=null)
Error: Failed to match tag: "objid" at: ["algorithmIdentifier"]["parameters"]
at DecoderBuffer.error (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/reporter.js:53:11)
at DERNode.decodeTag [as _decodeTag] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/decoders/der.js:67:19)
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:302:23)
at decodeChildren (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:332:15)
at Array.some ()
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:329:33)
at decodeChildren (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:332:15)
at Array.some ()
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:329:33)
at DERNode.decode [as _decode] (/media/user/spark-server/node_modules/asn1.js/lib/asn1/base/node.js:263:47)
at SpkiKey.decode (/media/user/spark-server/node_modules/asn1.js/lib/asn1/decoders/der.js:25:20)
at Entity.decode (/media/user/spark-server/node_modules/asn1.js/lib/asn1/api.js:47:32)
at parseSpki (/media/user/spark-server/node_modules/ec-key/src/ec-key.js:158:27)
at parsePem (/media/user/spark-server/node_modules/ec-key/src/ec-key.js:184:12)
at new ECKey (/media/user/spark-server/node_modules/ec-key/src/ec-key.js:237:15)
at new DeviceKey (/media/user/spark-server/node_modules/spark-protocol/dist/lib/DeviceKey.js:32:25)
at DeviceKey.equals (/media/user/spark-server/node_modules/spark-protocol/dist/lib/DeviceKey.js:75:20)
at Handshake._callee5$ (/media/user/spark-server/node_modules/spark-protocol/dist/lib/Handshake.js:383:29)
at tryCatch (/media/user/spark-server/node_modules/regenerator-runtime/runtime.js:62:40)
at Generator.invoke [as _invoke] (/media/user/spark-server/node_modules/regenerator-runtime/runtime.js:296:22)
at Generator.prototype.(anonymous function) [as next] (/media/user/spark-server/node_modules/regenerator-runtime/runtime.js:114:21)
at step (/media/user/spark-server/node_modules/babel-runtime/helpers/asyncToGenerator.js:17:30)
at /media/user/spark-server/node_modules/babel-runtime/helpers/asyncToGenerator.js:28:13

I'm using Node v8.11.1, npm 5.6.0, and Yarn 1.22.4, and have tested the server on both Ubuntu and Windows.

Thank you

@jlkalberer

What Particle device are you using?
Did you configure the device to use TCP? <-- You need to manually add this flag for devices like the Electron.
Did you run particle config myconfig before adding the device to your server? <-- I see a lot of weird errors when this isn't done.


xeanhort commented May 5, 2020

It's an Electron, and yes, I did the step to change the profile (particle config myconfig) and the TCP step.

I had to change the command slightly to get it to work, though: particle keys server default_key.pub.pem --host IP_ADDRESS --port 5683 --protocol tcp, and also particle keys protocol --protocol tcp.


xeanhort commented May 5, 2020

By the way, I'm using the latest Particle CLI version (2.3.0).

@jlkalberer

There are two things I can think of:

  1. The server key was incorrectly set on the Electron. I would retry this and make sure the file extension is correct.
  2. Node v8.11.1 has a crypto bug, or some behavior changed in the libraries.


xeanhort commented May 5, 2020

I did the whole process a few times, but I can check again!
About Node 8.11.1, I used it because it's the one used in this link. If you have a specific version of Node/Particle CLI in mind, I can use it and try again.

@jlkalberer

Ok, I think I just need to use the original version of this - usrz/ec-key@23f4dd3

I'm guessing the dependencies are just out of date now.


xeanhort commented May 5, 2020

I've changed package.json, deleted node_modules, and run yarn install again. yarn.lock was automatically updated with new versions of ec-key, bn.js, and asn1.js, but no luck connecting the device. I get the same Error: Failed to match tag: "objid" at: ["algorithmIdentifier"]["parameters"]

Is there anything I'm missing?

@jlkalberer

I updated the dependency. You'll need to pull the latest version of this repo and yarn install --force to upgrade dependencies.


xeanhort commented May 5, 2020

I've seen that the spark-protocol repo was updated, but not this one. Shouldn't this package.json be updated with the new ec-key ^0.0.4 too?

I can't update properly. It keeps downloading spark-protocol without your latest changes.

@jlkalberer

Doh -- I didn't push. It's updated now.


xeanhort commented May 6, 2020

I tried with the updated version, but it still fails with the same error. I've also moved to Node v14.2.0 without any improvement :(

This is the "updated" error log:

[2020-05-06T07:45:09.161Z] ERROR: Handshake.js/5216 on 8NEO6A5: Handshake failed (cache_key=_2, deviceID=XXXXXXXXXXXXXXXXXXXXXXXX, ip=::ffff:XX.XX.XX.XX)
Error: Failed to match tag: "objid" at: ["algorithmIdentifier"]["parameters"]
at DecoderBuffer.error (D:\spark-server\node_modules\asn1.js\lib\asn1\base\reporter.js:78:11)
at DERNode.decodeTag [as _decodeTag] (D:\spark-server\node_modules\asn1.js\lib\asn1\decoders\der.js:71:19)
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:341:25)
at decodeChildren (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:378:15)
at Array.forEach ()
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:375:22)
at decodeChildren (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:378:15)
at Array.forEach ()
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:375:22)
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:280:47)
at Generated.decode (D:\spark-server\node_modules\asn1.js\lib\asn1\decoders\der.js:28:20)
at Entity.decode (D:\spark-server\node_modules\asn1.js\lib\asn1\api.js:44:32)
at parseSpki (D:\spark-server\node_modules\ec-key\src\ec-key.js:158:27)
at parsePem (D:\spark-server\node_modules\ec-key\src\ec-key.js:184:12)
at new ECKey (D:\spark-server\node_modules\ec-key\src\ec-key.js:237:15)
at new DeviceKey (D:\spark-server\node_modules\spark-protocol\dist\lib\DeviceKey.js:32:25)
at DeviceKey.equals (D:\spark-server\node_modules\spark-protocol\dist\lib\DeviceKey.js:75:20)
at Handshake._callee5$ (D:\spark-server\node_modules\spark-protocol\dist\lib\Handshake.js:383:29)
at tryCatch (D:\spark-server\node_modules\regenerator-runtime\runtime.js:62:40)
at Generator.invoke [as _invoke] (D:\spark-server\node_modules\regenerator-runtime\runtime.js:296:22)
at Generator.prototype. [as next] (D:\spark-server\node_modules\regenerator-runtime\runtime.js:114:21)
at step (D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:17:30)
at D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:28:13
[2020-05-06T07:45:09.161Z] INFO: Device.js/5216 on 8NEO6A5: Device disconnected (cache_key=_2, deviceID="", disconnectCounter=1)
[2020-05-06T07:45:09.163Z] ERROR: DeviceServer.js/5216 on 8NEO6A5: Device startup failed (deviceID=null)
Error: Failed to match tag: "objid" at: ["algorithmIdentifier"]["parameters"]
at DecoderBuffer.error (D:\spark-server\node_modules\asn1.js\lib\asn1\base\reporter.js:78:11)
at DERNode.decodeTag [as _decodeTag] (D:\spark-server\node_modules\asn1.js\lib\asn1\decoders\der.js:71:19)
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:341:25)
at decodeChildren (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:378:15)
at Array.forEach ()
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:375:22)
at decodeChildren (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:378:15)
at Array.forEach ()
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:375:22)
at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:280:47)
at Generated.decode (D:\spark-server\node_modules\asn1.js\lib\asn1\decoders\der.js:28:20)
at Entity.decode (D:\spark-server\node_modules\asn1.js\lib\asn1\api.js:44:32)
at parseSpki (D:\spark-server\node_modules\ec-key\src\ec-key.js:158:27)
at parsePem (D:\spark-server\node_modules\ec-key\src\ec-key.js:184:12)
at new ECKey (D:\spark-server\node_modules\ec-key\src\ec-key.js:237:15)
at new DeviceKey (D:\spark-server\node_modules\spark-protocol\dist\lib\DeviceKey.js:32:25)
at DeviceKey.equals (D:\spark-server\node_modules\spark-protocol\dist\lib\DeviceKey.js:75:20)
at Handshake._callee5$ (D:\spark-server\node_modules\spark-protocol\dist\lib\Handshake.js:383:29)
at tryCatch (D:\spark-server\node_modules\regenerator-runtime\runtime.js:62:40)
at Generator.invoke [as _invoke] (D:\spark-server\node_modules\regenerator-runtime\runtime.js:296:22)
at Generator.prototype. [as next] (D:\spark-server\node_modules\regenerator-runtime\runtime.js:114:21)
at step (D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:17:30)
at D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:28:13
[2020-05-06T07:45:13.036Z] ERROR: Handshake.js/5216 on 8NEO6A5: Handshake failed (cache_key=_1, deviceID=null, ip=::ffff:XX.XX.XX.XX)
Error: Handshake did not complete in 10 seconds
at Timeout._onTimeout (D:\spark-server\node_modules\spark-protocol\dist\lib\Handshake.js:236:23)
at listOnTimeout (internal/timers.js:549:17)
at processTimers (internal/timers.js:492:7)
[2020-05-06T07:45:13.037Z] INFO: Device.js/5216 on 8NEO6A5: Device disconnected (cache_key=_1, deviceID="", disconnectCounter=1)


jlkalberer commented May 6, 2020

The only thing I can think of is that the key is improperly formatted.

Can you run a test script with ec-key which tries to parse the device's key? (I am slammed at work so I can't debug this)
const ECKey = require('ec-key');
const key = new ECKey(pem_key, 'pem');
I'm wondering if the format of the device key has changed in newer versions of Particle firmware. I use P1s for my fleet so I don't ever test this.


xeanhort commented May 6, 2020

Here is the test with the two keys: the one in deviceKeys.db (which also matches the generated XXXXX_rsa_new.pub.pem) and the one received from the device during the handshake.

const ECKey = require('ec-key');

// Key stored in deviceKeys.db (also matches the generated XXXXX_rsa_new.pub.pem):
const device_db_pem = "-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEaUXZhpYnCARKEx1FGFrOgu8Dgyb5\n//Drwxg2oR8LkP37MDj7ESmj78PdBqD6PeNmvBMQg5Z7NQ8saRDxX1h50g==\n-----END PUBLIC KEY-----\n";

// Key received from the device during the handshake:
const device_sent_pem = "-----BEGIN PUBLIC KEY-----\nMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQBThDnRSoUGN20VitFNf4Kj1wWh\nIdiJvFJ4CPj2oAoGCCqGSM49AwEHoUQDQgAEaUXZhpYnCARKEx1FGFrOgu8Dgyb5\n//Drwxg2oR8LkP37MDj7ESmj78PdBqD6PeNmvBMQg5Z7NQ8saRDxX1h50v//////\n/////////////////wIDAQAB\n-----END PUBLIC KEY-----\n";

const key = new ECKey(device_db_pem, 'pem');
console.log(key);

const key2 = new ECKey(device_sent_pem, 'pem');
console.log(key2);
/* The second throws the error:
D:\spark-server>node test.js

D:\spark-server\node_modules\asn1.js\lib\asn1\base\reporter.js:84
    throw err;
    ^
ReporterError: Failed to match tag: "objid" at: ["algorithmIdentifier"]["parameters"]
    at DecoderBuffer.error (D:\spark-server\node_modules\asn1.js\lib\asn1\base\reporter.js:78:11)
    at DERNode.decodeTag [as _decodeTag] (D:\spark-server\node_modules\asn1.js\lib\asn1\decoders\der.js:71:19)
    at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:341:25)
    at decodeChildren (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:378:15)
    at Array.forEach (<anonymous>)
    at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:375:22)
    at decodeChildren (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:378:15)
    at Array.forEach (<anonymous>)
    at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:375:22)
    at DERNode.decode [as _decode] (D:\spark-server\node_modules\asn1.js\lib\asn1\base\node.js:280:47) {
  path: '["algorithmIdentifier"]["parameters"]'
}
*/

@jlkalberer

Ok -- that's super helpful. It's definitely a different format. I'm wondering if it's just an RSA key instead of an EC key.

particle-iot/particle-cli#280

This makes me think that I don't need the ec-key parsing code since I'm not using UDP.
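
One quick way to check what the device is actually sending, as a sketch using Node's built-in crypto module (KeyObject.asymmetricKeyType, available since Node 11.6) instead of ec-key, with device_db_pem and device_sent_pem being the same strings as in the test above:

const crypto = require('crypto');

// createPublicKey parses a PEM (SPKI) string and exposes the key type.
console.log(crypto.createPublicKey(device_db_pem).asymmetricKeyType);   // 'ec'
console.log(crypto.createPublicKey(device_sent_pem).asymmetricKeyType); // 'rsa' would confirm the suspicion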


xeanhort commented May 6, 2020

I'm not completely sure how that issue is related, because I always use --protocol tcp in all Particle CLI commands. Also, that issue is closed, so I'm not sure how they finally implemented it.

@jlkalberer

What I'm saying is that I implemented Electron support before they fixed that bug. In the past the Electron always sent ecc keys for UDP and TCP. Now that it's fixed, I can just use the RSA key code.

I'll try to find my electron and see if I can get this fixed tonight.

jlkalberer reopened this May 7, 2020
@jlkalberer

Let me know if this works -- I couldn't find my electron so I am just guessing at a fix.


xeanhort commented May 7, 2020

It now seems to compare the keys properly, but the keys still don't match:

07:36:51.287Z  INFO DeviceServer.js: New Connection
07:36:52.332Z ERROR Handshake.js: Handshake failed (cache_key=_31, deviceID=XXXXXXXXXXXXXXXXXXXXXXXX, ip=::ffff:XX.XX.XX.XX)
    Error: key passed to device during handshake doesn'tmatch saved public key: XXXXXXXXXXXXXXXXXXXXXXXX
        at Handshake._callee5$ (D:\spark-server\node_modules\spark-protocol\dist\lib\Handshake.js:388:21)
        at tryCatch (D:\spark-server\node_modules\regenerator-runtime\runtime.js:62:40)
        at Generator.invoke [as _invoke] (D:\spark-server\node_modules\regenerator-runtime\runtime.js:296:22)
        at Generator.next (D:\spark-server\node_modules\regenerator-runtime\runtime.js:114:21)
        at step (D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:17:30)
        at D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:28:13
07:36:52.333Z  INFO Device.js: Device disconnected (cache_key=_31, deviceID="", disconnectCounter=1)
07:36:52.334Z ERROR DeviceServer.js: Device startup failed (deviceID=null)
    Error: key passed to device during handshake doesn'tmatch saved public key: XXXXXXXXXXXXXXXXXXXXXXXX
        at Handshake._callee5$ (D:\spark-server\node_modules\spark-protocol\dist\lib\Handshake.js:388:21)
        at tryCatch (D:\spark-server\node_modules\regenerator-runtime\runtime.js:62:40)
        at Generator.invoke [as _invoke] (D:\spark-server\node_modules\regenerator-runtime\runtime.js:296:22)
        at Generator.next (D:\spark-server\node_modules\regenerator-runtime\runtime.js:114:21)
        at step (D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:17:30)
        at D:\spark-server\node_modules\babel-runtime\helpers\asyncToGenerator.js:28:13

I did the complete process with the updated code to regenerate the device/database keys and make sure everything is set properly.

deviceProvidedPem="-----BEGIN PUBLIC KEY-----\nMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQBxq9GblXxQs0tjD3uWXhQsieLx\nKdzewcasniQNoAoGCCqGSM49AwEHoUQDQgAECdp0rE3z+K9Dgi40Og+668Vvkaiz\nacsyDzUZ3kpMPnp+fphkUApsQ0CLGp4E3S/hQQYZKfgS0qUhMcLNcQEjDP//////\n/////////////////wIDAQAB\n-----END PUBLIC KEY-----"

Database: "algorithm":"ecc","key":"-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAECdp0rE3z+K9Dgi40Og+668Vvkaiz\nacsyDzUZ3kpMPnp+fphkUApsQ0CLGp4E3S/hQQYZKfgS0qUhMcLNcQEjDA==\n-----END PUBLIC KEY-----\n"

@jlkalberer

Do you have something other than an Electron to test with? I just want to make sure it's a server issue and not a problem with the setup process.


xeanhort commented May 7, 2020

Yes! I've tried with a Photon and it works!

But it's still not working with the Electron. I also checked a Boron, but it doesn't even connect, so that might be a UDP/TCP issue.

@jlkalberer

So the ecc shouldn't show up in the database anymore. Did you delete the database before retrying?


xeanhort commented May 7, 2020

All devices have an ecc key in the database (except the Photon). I'm pretty sure I deleted it before the last update.
I can check again tomorrow to be sure.


xeanhort commented May 8, 2020

I did it again, cleaning the database, but it still shows ecc.

Database:
{"deviceID":"XXXXXXXXXXXXXXX","algorithm":"ecc","key":"-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEqDnXk6K0TP14HmWmLpePHvn8Avq0\nyI3dQ1/Q4suA5dnkhZvPoMiSErseQ1+vi2b67DiooM9qYx8fYdlzyzSuxg==\n-----END PUBLIC KEY-----\n","_id":"ahWDZpOx5ZlOnPrv"}

Keys generated on the computer: https://we.tl/t-C9NqPlanxq

Commands used:

  • particle config xxxx apiUrl "http://custom_host.com:8080"
  • particle config xxxx
  • DFU
  • particle keys server default_key.pub.pem --host custom_host.com --port 5683 --protocol tcp
  • particle keys protocol --protocol tcp
  • particle keys doctor XXXXXXXXXXXXXXXXXXXX

@jlkalberer

It looks like you're doing everything correctly. I'll need to find my electron to debug this.


jlkalberer commented May 12, 2020

I got my Electron set up this morning but didn't hook it back up to a network, so I just tested and verified that it would send an RSA key instead of an ECC key.

Try this:

particle keys new test_key --protocol tcp
particle keys load test_key.der
particle keys send XXXXXXXXXXXXXXXXXXXXXXXX test_key.pub.pem

I'm going to open an issue with the Particle CLI, but I doubt they will fix it on their end; particle keys doctor should respect the protocol.

particle-iot/particle-cli#582

@xeanhort

I executed your commands and it connected properly! RSA shows in the database and everything seems OK with the keys. But there's a problem during the protocol initialization:

[2020-05-12T17:31:04.926Z] INFO: DeviceServer.js/4608 on 8NEO6A5: Connected Devices (devices=0, sockets=0)
[2020-05-12T17:31:10.530Z] INFO: DeviceServer.js/4608 on 8NEO6A5: New Connection
[2020-05-12T17:31:12.786Z] INFO: Device.js/4608 on 8NEO6A5: Connection attributes (particleProductId=10, platformId=10, productFirmwareVersion=65535, reservedFlags=6)
[2020-05-12T17:31:12.787Z] INFO: DeviceServer.js/4608 on 8NEO6A5: Connection (connectionID=1, deviceID=XXXXXXXXXXXXXXXXXXXXXXXX, remoteIPAddress=::ffff:XX.XX.XX.XX)
[2020-05-12T17:31:14.926Z] INFO: DeviceServer.js/4608 on 8NEO6A5: Connected Devices (devices=1, sockets=1)
[2020-05-12T17:31:24.927Z] INFO: DeviceServer.js/4608 on 8NEO6A5: Connected Devices (devices=1, sockets=1)
[2020-05-12T17:31:27.850Z] ERROR: /4608 on 8NEO6A5: completeProtocolInitialization (appHash=null, deviceID=XXXXXXXXXXXXXXXXXXXXXXXX, functions=null, ip=::ffff:XX.XX.XX.XX, lastHeard=null, ownerID=null, particleProductId=10, platformId=10, productFirmwareVersion=65535, registrar=null, reservedFlags=6, variables=null)
Error: Request timed out - Describe
at Timeout._onTimeout (D:\spark-server\node_modules\spark-protocol\dist\clients\Device.js:1118:26)
at listOnTimeout (internal/timers.js:549:17)
at processTimers (internal/timers.js:492:7)
[2020-05-12T17:31:27.850Z] ERROR: DeviceServer.js/4608 on 8NEO6A5: Connection Error (deviceID=XXXXXXXXXXXXXXXXXXXXXXXX)
Error: Request timed out - Describe
at Timeout._onTimeout (D:\spark-server\node_modules\spark-protocol\dist\clients\Device.js:1118:26)
at listOnTimeout (internal/timers.js:549:17)
at processTimers (internal/timers.js:492:7)
[2020-05-12T17:31:27.850Z] INFO: Device.js/4608 on 8NEO6A5: Device disconnected (cache_key=_1, deviceID=XXXXXXXXXXXXXXXXXXXXXXXX, duration=15.055, disconnectCounter=1)
[2020-05-12T17:31:27.851Z] WARN: DeviceServer.js/4608 on 8NEO6A5: Session ended for Device (connectionKey=_1, deviceID=XXXXXXXXXXXXXXXXXXXXXXXX, ownerID=null)
[2020-05-12T17:31:34.927Z] INFO: DeviceServer.js/4608 on 8NEO6A5: Connected Devices (devices=0, sockets=0)

I've done the process from the beginning, but the same thing always happens. I tried Device OS 0.7.0 and 1.2.1.

@jlkalberer

Well, that's progress!

For some reason it seems like the DESCRIBE message isn't being returned. I can't remember what that does off the top of my head but I'll check it out tonight or tomorrow morning.

@xeanhort

Yes! Thank you for all your support!

BTW: which versions of the Particle CLI / Device OS do you use? It seems weird that all these errors are happening only to me :(

@jlkalberer

I upgraded to the latest.

The reason it's happening to you is that you're using an Electron. Nobody really uses that and hosts their own cloud.

@jlkalberer

So I connected my Electron to my server and I don't see the Describe error. The device does disconnect every 30 seconds, but that's expected because of #291.

@xeanhort

Maybe it's related to the 1.2.1 version? I'm not sure about the Describe message, but tell me if I can help somehow!

@jlkalberer

Could be. My Electron is on the latest.

Can you try flashing your Electron with tinker? Then try it with the latest system firmware.

jlkalberer added a commit that referenced this issue Feb 18, 2024