USG keyboard hardware proxy #2518

Open
andrewdavidwong opened this Issue Dec 13, 2016 · 12 comments

andrewdavidwong commented Dec 13, 2016

This is a tracking issue for a community-developed feature.

Description (#2507 (comment)):

IMHO a much better solution for the USB keyboard problem would be to have a piece of hardware plugged in between the USB keyboard and the PC (based on https://github.com/robertfisk/USG?) to encrypt and integrity-protect the events, and then decrypt them in dom0, check the integrity protection, and only then pass them down to the input-devices stack. This should at least partially guard against a malicious USB VM. It would still be able to perform timing-based attacks to guess what you're typing - not sure how accurate such attacks currently are. Such a device could introduce an artificial delay (e.g., inject queued events every 50 ms) to at least partially mitigate such attacks.
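
As a rough illustration of the queued-injection idea (not actual Qubes code; decrypt_and_verify() and inject_event() are hypothetical placeholders for whatever the dom0 input proxy would provide):

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    #define QUEUE_LEN 64

    struct key_event { uint8_t code; bool pressed; };

    static struct key_event queue[QUEUE_LEN];
    static size_t head, tail;

    /* Placeholder: true iff decryption succeeded and the MAC checked out. */
    bool decrypt_and_verify(const uint8_t *pkt, size_t len, struct key_event *out);
    /* Placeholder: hands one event to the normal input-device stack. */
    void inject_event(const struct key_event *ev);

    void on_packet_from_usbvm(const uint8_t *pkt, size_t len)
    {
        struct key_event ev;
        if (!decrypt_and_verify(pkt, len, &ev))
            return;                          /* reject tampered or replayed packets */
        if ((tail + 1) % QUEUE_LEN == head)
            return;                          /* queue full: drop rather than overflow */
        queue[tail] = ev;
        tail = (tail + 1) % QUEUE_LEN;
    }

    void on_50ms_timer(void)                 /* called from a periodic timer */
    {
        while (head != tail) {               /* flush everything queued this tick */
            inject_event(&queue[head]);
            head = (head + 1) % QUEUE_LEN;
        }
    }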

andrewdavidwong added a commit that referenced this issue Dec 13, 2016

v6ak commented May 24, 2017

I'm not sure where the right place to discuss this is, but maybe here, if I mention @robertfisk.

What's the current status? Do you need help with anything? I was thinking about something similar, perhaps rather as modified keyboard firmware. I'm currently not sure whether the AVR chip on my keyboard is powerful enough, though.

I have a few ideas:

Key exchange

The first level (PoC quality) would be a key compiled into the firmware. This requires an advanced user who generates a random key, then compiles and flashes the firmware. And the user has to flash it securely, i.e., not through a compromised USBVM. Easy to implement, hard to use.

The second level could look like Bluetooth SSP: after a Diffie-Hellman key exchange, the user would authenticate it by typing a few characters on the keyboard and pressing Enter. This can provide mutual authentication.
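
As a rough sketch of how that second level could look (libsodium is only one possible choice here, and every name other than the libsodium calls is hypothetical):

    /* Keyboard-side pairing sketch using libsodium's crypto_kx.  Both sides
     * derive session keys, then a short authentication string is computed
     * over both public keys; dom0 displays its 6 digits, the user types them
     * on the keyboard, and the firmware compares them against its own value,
     * which a MITM in sys-usb cannot make match.  Assumes sodium_init() has
     * already been called.  Hypothetical flow, not existing USG/Qubes code. */
    #include <sodium.h>
    #include <string.h>

    /* dom0 would do the mirror image with crypto_kx_server_session_keys(). */
    int keyboard_pairing(const unsigned char dom0_pk[crypto_kx_PUBLICKEYBYTES],
                         unsigned char rx[crypto_kx_SESSIONKEYBYTES],
                         unsigned char tx[crypto_kx_SESSIONKEYBYTES],
                         unsigned *sas_out)
    {
        unsigned char kb_pk[crypto_kx_PUBLICKEYBYTES], kb_sk[crypto_kx_SECRETKEYBYTES];
        crypto_kx_keypair(kb_pk, kb_sk);
        /* kb_pk is sent to dom0 through sys-usb (untrusted, but public data). */

        if (crypto_kx_client_session_keys(rx, tx, kb_pk, kb_sk, dom0_pk) != 0)
            return -1;                                 /* malformed public key */

        /* Short authentication string over both public keys. */
        unsigned char both[2 * crypto_kx_PUBLICKEYBYTES];
        unsigned char h[crypto_generichash_BYTES];
        memcpy(both, kb_pk, crypto_kx_PUBLICKEYBYTES);
        memcpy(both + crypto_kx_PUBLICKEYBYTES, dom0_pk, crypto_kx_PUBLICKEYBYTES);
        crypto_generichash(h, sizeof h, both, sizeof both, NULL, 0);
        *sas_out = (((unsigned)h[0] << 16) | ((unsigned)h[1] << 8) | h[2]) % 1000000u;

        /* The caller now waits for the user to type the 6 digits shown by
         * dom0 and only enables rx/tx if they match *sas_out. */
        return 0;
    }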

Side channels

Actually, maybe we don't need to introduce any. USB is host-driven and polls at a rate of around 100 Hz, so the keyboard would send the same amount of information every 10 ms. This could hide all the timing side channels that exist by design. It does not guarantee that the implementation introduces no side channels of its own, though.

This design could also provide some real-time guarantees: an attacker cannot delay key presses by much.
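
A minimal sketch of that constant-rate loop; read_keys(), seal_report() and usb_send() are hypothetical placeholders for whatever matrix-scan, cipher and USB routines the firmware ends up using:

    #include <stddef.h>
    #include <stdint.h>

    #define REPORT_LEN 8            /* standard boot-keyboard report size */
    #define MAC_LEN    16

    void read_keys(uint8_t report[REPORT_LEN]);  /* current key state, zeros when idle */
    void seal_report(const uint8_t plain[REPORT_LEN], uint64_t seq,
                     uint8_t sealed[REPORT_LEN + MAC_LEN]);
    void usb_send(const uint8_t *buf, size_t len);

    void on_poll_tick(void)         /* called every 10 ms, unconditionally */
    {
        static uint64_t seq;
        uint8_t plain[REPORT_LEN];
        uint8_t sealed[REPORT_LEN + MAC_LEN];

        read_keys(plain);           /* identical-looking even when nothing is pressed */
        seal_report(plain, seq++, sealed);
        usb_send(sealed, sizeof sealed);   /* same size, same cadence, every tick */
    }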

Encryption vs. authentication

I believe both encryption and authentication can be useful here. If there is not enough time to implement both, I am not 100% sure which to prefer:

  • Encryption can work as a "poor man's authentication" in some cases, but it can be fragile.
  • Authentication without encryption is a fairly clean solution for a reduced threat model -- a malicious USB device, with sys-usb not compromised.

Of course, if possible, I would prefer having both.

Dropped packets

I would not be afraid to assume that no packets are dropped. With a proper design, a dropped packet will simply cause the following packets to be rejected because of a wrong sequence number. In that case, I believe it is OK to reset the connection and perhaps inform the user. (Without informing the user, a malicious USBVM could still drop a few packets of its choosing without being noticed.)
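
A minimal sketch of that policy on the receiving side, again with hypothetical placeholder functions:

    /* Sequence numbers must arrive strictly in order; any gap, replay or
     * reordering resets the session and notifies the user. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    bool open_report(const uint8_t *sealed, size_t len, uint64_t seq, uint8_t *plain);
    void reset_session_and_notify_user(void);
    void deliver_report(const uint8_t *plain);

    static uint64_t expected_seq;

    void on_sealed_report(const uint8_t *sealed, size_t len)
    {
        uint8_t plain[8];

        /* The expected sequence number is bound into the MAC (e.g. used as
         * the nonce), so a packet after a drop simply fails to authenticate. */
        if (!open_report(sealed, len, expected_seq, plain)) {
            reset_session_and_notify_user();   /* drops never pass silently */
            expected_seq = 0;                  /* force a fresh key exchange */
            return;
        }
        expected_seq++;
        deliver_report(plain);
    }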

robertfisk commented May 26, 2017

I don't have much experience writing crypto apps, so I'm going to need some advice on choosing the general strategy, crypto algorithms, etc. But I can make some comments on implementation details...

  • ST supports PolarSSL / mbedTLS on their STM32 micros. Using crypto functions supported by this library would make things much easier.
  • We have about 95 kB of flash (program) space available on the USG v1.0 micros, out of 128 kB total. This should be plenty of space for our crypto functions.
  • We need to decide which microprocessor performs which crypto operation. The "Upstream" micro talks to the PC, so it can be compromised by the sys-usb VM. The "Downstream" micro talks to the keyboard or other USB device, so it can be compromised by a malicious device. So we need to answer the question:

Are we protecting our keystrokes from a malicious sys-usb VM, from a malicious device, or from both?

My guess is "both", which means we have to perform nested crypto on both the Downstream and Upstream micros??? Any advice or comments are welcome!

jpouellet commented May 26, 2017


I don't have much experience writing crypto apps so I'm going to need some advice on choosing the general strategy, crypto algorithms, etc.

The universal advice is never roll your own ;)

You may want something simpler than a full TLS stack, in which case NaCl (or its heavier but friendlier-to-use direct descendant, libsodium) is probably what you want.

NaCl's authors have ported it even to an 8-bit AVR (see https://cryptojedi.org/papers/avrnacl-20130514.pdf), so I'm sure it would be no problem on your big fancy ARM with lots of flash to spare :)
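
To make the libsodium suggestion concrete, a minimal seal/open pair for one 8-byte HID report using crypto_secretbox; the session key and the sequence-number-as-nonce scheme are assumptions carried over from earlier in the thread, not existing USG or Qubes code:

    #include <sodium.h>
    #include <stdint.h>
    #include <string.h>

    #define REPORT_LEN 8

    void seal_report(const unsigned char plain[REPORT_LEN], uint64_t seq,
                     const unsigned char key[crypto_secretbox_KEYBYTES],
                     unsigned char out[crypto_secretbox_MACBYTES + REPORT_LEN])
    {
        unsigned char nonce[crypto_secretbox_NONCEBYTES] = {0};
        memcpy(nonce, &seq, sizeof seq);   /* never reuse a nonce under one key */
        crypto_secretbox_easy(out, plain, REPORT_LEN, nonce, key);
    }

    int open_report(const unsigned char in[crypto_secretbox_MACBYTES + REPORT_LEN],
                    uint64_t seq,
                    const unsigned char key[crypto_secretbox_KEYBYTES],
                    unsigned char plain[REPORT_LEN])
    {
        unsigned char nonce[crypto_secretbox_NONCEBYTES] = {0};
        memcpy(nonce, &seq, sizeof seq);
        /* Returns 0 only if the Poly1305 tag verifies, so sys-usb cannot
         * modify, forge or replay reports without detection. */
        return crypto_secretbox_open_easy(plain, in,
                                          crypto_secretbox_MACBYTES + REPORT_LEN,
                                          nonce, key);
    }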

But I can make some comments on implementation details...

  • ST supports PolarSSL / mbedTLS on their STM32 micros. It would make things much easier to use crypto functions supported by this library.
  • We have about 95kB of flash (program) space available on the USG v1.0 micros, out of 128kB total. This should be plenty of space for our crypto functions.
  • We need to decide which microprocessor performs which crypto operation. The "Upstream" micro talks to the PC, so can be compromised by the sys-usb VM. The "Downstream" micro talks to the keyboard or other USB device, and can be compromised by a malicious device. So we need to answer the question:
  • Are we protecting our keystrokes from a malicious sys-usb VM, or a malicious device? Or both?

My guess is "both", which means we have to perform nested crypto on both Downstream and Upstream??? Any advice or comments are welcome!

It is my understanding that there are two primary motivations:

  1. Securely isolating individual USB devices when you only have a single USB controller to assign via PCI-passthrough to a single sys-usb VM.

  2. Securely interacting with a USB device without needing to trust our USB controller.

Both of these scenarios treat the devices below the USG as trusted, and distrust the things on the other (computer) side.

If our keyboard or flash drive or whatever is itself malicious, then I do not see what is to be gained by authenticity- and integrity-protecting malicious keystrokes, etc.

This suggests to me that the crypto should be done on the downstream side.

jpouellet commented May 26, 2017


If our keyboard or flash drive or whatever itself is malicious, then I do not see what is to be gained by authenticity- and integrity-protecting malicious keystrokes, etc.

Nor by cryptographically protecting the fact that, yes, this USB rocket launcher is suddenly also trying to pretend to be a hub and keyboard and mouse. "Great! I've authenticated that it is indeed trying to attack me, and my USB controller with compromised firmware can't see how!" -- I could be wrong, but I see no benefit.

v6ak commented May 26, 2017

jpouellet commented May 26, 2017


Most importantly, the untrusted input it would process is rather going to be crypto-related, which cannot be moved to a separate untrusted CPU.

Can you elaborate on this point? I don't see why it couldn't be moved to, e.g., a third processor between the two, isolating it from memory-corruption issues in either side's USB stack.

v6ak commented May 27, 2017

robertfisk commented Jun 3, 2017

Warning: epic post below...

OK, so our threat model is a compromised sys-usb that wants to sniff our keys, and we assume the keyboard firmware is not compromised (either it is our own firmware, or it is protected by a USG firewall and clean from the factory (!)). If both sys-usb and the keyboard firmware are infected with cooperating malware, the task becomes more difficult (see the issues below).

Libsodium looks fine to me, and its license is compatible with the (slightly weird) ST middleware license used by the USG.

Issues

Here are a few issues as I see them:

  1. The crypto will have to protect against a hostile sys-usb performing a MITM attack. Diffie-Hellman key exchange requires a public/private keypair on each end, so both Dom0 and the keyboard-encryption firmware will hold a private key that is not known to sys-usb. But the Dom0 code and firmware code are public, so the attacker can find them. Unless both ends contain a uniquely generated keypair signed by another master key (see the sketch after this list)? This sounds like it will cause some logistical difficulties on both ends...

  2. If our encryption device does not use a 3-CPU fully isolated architecture, it is essential that we protect the keyboard's firmware from attack by sys-usb. If both the keyboard and sys-usb are infected with cooperating malware, a 1-CPU or 2-CPU encryption system is vulnerable to compromise via the keyboard firmware, and the encryption can then be disabled.

  3. I need to remind everyone that prototyping and preparing hardware for production costs real money. Think hundreds of dollars for a simple one-chip ATmega board, up to thousands for more complex designs. I have already paid this cost while developing the USG, which can be used with any operating system. If we limit the market to only Qubes users, the volumes will be very small and only the simplest designs will be possible. In particular, a 3-CPU solution would have to sell for over US$100 each. That would reduce the sales volume even more, and reduce the number of people able to use the enhanced security.
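
A hedged sketch of the "keypair signed by a master key" idea from issue 1, using libsodium's Ed25519 signatures; the whole provisioning flow is hypothetical, not current USG or Qubes code:

    /* A master signing key certifies the device's long-term exchange key at
     * provisioning time.  Dom0 only needs the master *public* key baked in,
     * so nothing secret has to appear in the public Dom0 or firmware sources. */
    #include <sodium.h>

    /* Done once, offline, by whoever provisions devices. */
    void provision_device(const unsigned char master_sk[crypto_sign_SECRETKEYBYTES],
                          const unsigned char device_pk[crypto_kx_PUBLICKEYBYTES],
                          unsigned char cert_sig[crypto_sign_BYTES])
    {
        crypto_sign_detached(cert_sig, NULL, device_pk,
                             crypto_kx_PUBLICKEYBYTES, master_sk);
    }

    /* Dom0, before starting the Diffie-Hellman exchange: accept the device's
     * public key only if the master key vouches for it.  A MITM in sys-usb
     * can read the key and signature, but cannot substitute its own key. */
    int dom0_check_device_key(const unsigned char master_pk[crypto_sign_PUBLICKEYBYTES],
                              const unsigned char device_pk[crypto_kx_PUBLICKEYBYTES],
                              const unsigned char cert_sig[crypto_sign_BYTES])
    {
        return crypto_sign_verify_detached(cert_sig, device_pk,
                                           crypto_kx_PUBLICKEYBYTES, master_pk);
    }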

Switching encryption mode

Assuming sys-usb is hostile, our keyboard firmware must send only encrypted keystrokes. Otherwise sys-usb could report to Dom0 that a normal keyboard is connected, report to the keyboard that it is connected to a non-Qubes system, and then proceed to sniff the unencrypted keystrokes. But interacting with the BIOS and the GRUB boot menu requires an unencrypted keyboard, so the user will have to unplug/replug the encryption device, or move a switch, to change between encrypted and unencrypted modes.

The USG can be programmed with encrypted-keyboard firmware while still providing firewall functionality. In this case the keyboard encryption would be always on, and switching to unencrypted mode would require replugging the keyboard into a standard USG, or unprotected into sys-usb (a really bad idea; see issue 2 above). Ideally the encryption could be enabled with a switch, but we will also need a really f'n bright green/red LED to remind the user which mode they are in. I plan to implement the switch/LED modification in the future, as it is also useful for a read-only mass-storage mode.
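
A minimal sketch of that boot-time switch/LED behaviour; read_mode_switch() and set_mode_led() are hypothetical board-support helpers, not existing USG functions:

    /* Sample the physical mode switch once at boot, so nothing running later
     * can silently flip modes (changing mode means replugging), and drive a
     * bright LED so the user always knows whether keystrokes are protected. */
    #include <stdbool.h>

    enum kb_mode { MODE_ENCRYPTED, MODE_PLAIN };

    bool read_mode_switch(void);    /* true = switch set to "encrypted" */
    void set_mode_led(bool green);  /* green = encrypted, red = plain */

    static enum kb_mode mode;

    void select_mode_at_boot(void)
    {
        mode = read_mode_switch() ? MODE_ENCRYPTED : MODE_PLAIN;
        set_mode_led(mode == MODE_ENCRYPTED);
    }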

Hardware

So we come to the question of which hardware architecture to choose. The 3-CPU solution may be the most robust against attacks, but as discussed in issue 3 above it is impractical for cost reasons.

A 2-CPU solution like the USG protects the encryption from attacks by a hostile sys-usb. It does not protect the encryption from a hostile keyboard, or from a hostile sys-usb/keyboard sandwich (issue 2 above). Therefore the Dom0 decryption code should be resistant to exploits in the encrypted payload, and the user should take steps to ensure that sys-usb cannot compromise their keyboard directly. The main advantage is that the hardware is already developed, at no cost to Qubes users.

The 1-CPU solution needs some discussion. It is simple enough that it may be possible to design Qubes-specific hardware, although the low volumes and desired low sale price mean it would have to be done on a volunteer basis, and I cannot say I will have time to do this! Perhaps we could find an off-the-shelf solution like an Arduino with separate device and host ports.

Then we need to ask whether the 1-CPU solution is secure against attacks from a hostile sys-usb. I cannot say this for certain! One would have to analyse the embedded USB device driver stack very closely, and even then you cannot analyse the internal state of the USB device controller hardware (the hardware has its own state machine). The USG was designed on the assumption that any USB connection (device or host) will allow compromise, and I think we should make the same assumption here. This rules out a simple 1-CPU design.

But we could also use an FTDI chip on the computer side, as mentioned by v6ak. I will call this the 1.5-CPU option. You could do this with an off-the-shelf FTDI serial converter and an Arduino with an OTG or host port. Because we are making a non-standard keyboard, we can ask sys-usb to look for an FTDI serial port instead of a standard USB keyboard. This isolates the keyboard encryption from a malicious sys-usb in the same way as the 2-CPU solution. However, there is one disadvantage: we are introducing some untrusted firmware into the system, contained in the FTDI chip. It cannot attack our keyboard encryption, but it can attack sys-usb. In this respect, a 2-CPU solution like the USG is safer overall than a 1.5-CPU solution.

The only reason to choose a 1.5-CPU solution would be if it were significantly cheaper. It may be cheaper at high volumes, but I don't think we have that market. You also need someone to pay the development costs, and it won't be me, because once was enough! You would also face the months-long hardware development process and the task of re-creating the functions of the USG firmware on another embedded processor. For these reasons I suggest using the USG as our hardware platform.

Moving Forward

I can develop USG firmware to support this feature, but I won't be able to do the Dom0 or sys-usb code. If someone wants to do that, we can work on the encryption details and how the keyboard will report itself to sys-usb.


@andrewdavidwong andrewdavidwong added this to the Far in the future milestone Jun 4, 2017

andrewdavidwong added a commit that referenced this issue Jun 4, 2017

v6ak commented Jun 4, 2017

andrewdavidwong added a commit that referenced this issue Jun 4, 2017

robertfisk commented Jun 8, 2017

It sounds like you have some specific hardware in mind. Is this a keyboard or an inline USB device? Off-the-shelf? Development board? Any user assembly required?

v6ak commented Jun 12, 2017

andrewdavidwong added a commit that referenced this issue Jun 15, 2017

jpouellet commented Jul 27, 2017


For the crypto between the trusted side of the device and the VM, it may be worth de-duplicating effort by stealing the design from the CrypTech people, who are designing a secure channel between their HSM and its client. I think the goals seem sufficiently aligned, and I believe the CrypTech folks have enough competent engineers and interested eyeballs to get the details right.

https://wiki.cryptech.is/wiki/SecureChannel was just posted on the CrypTech mailing list. I will cross-reference there in case anyone there is interested in this as well.
