USG keyboard hardware proxy #2518
Comments
andrewdavidwong added the C: other and enhancement labels Dec 13, 2016
added a commit that referenced this issue Dec 13, 2016
v6ak commented May 24, 2017
I'm not sure where the right place to discuss this is, but maybe here if I mention @robertfisk.
What's the current status? Do you need help with anything? I was thinking about something similar, maybe rather as a modified keyboard firmware. I am currently not sure whether the AVR chip on my keyboard is powerful enough, though.
I have a few ideas:
Key exchange
First level (PoC quality) would be a key compiled into the firmware. This requires an advanced user who generates a random key, then compiles and flashes the firmware. And the user has to flash it securely, i.e., not over a compromised USBVM. Easy to implement, hard to use.
Second level could look like Bluetooth SPP: after a Diffie-Hellman key exchange, the user would authenticate it by typing a few characters on the keyboard and pressing Enter. This can provide mutual authentication.
Side channels
Actually, maybe we don't need to introduce any. USB is host-driven and polls at a rate of roughly 100 Hz, so the keyboard would send the same information every 10 ms. This could hide all timing side channels that exist by design. It does not guarantee that the implementation introduces no side channels, though.
This design could also allow some real-time guarantees: an attacker cannot defer key presses by much.
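To illustrate the fixed-rate idea, here is a minimal host-side sketch (read_current_report and send_encrypted_report are made-up placeholders, not functions from any existing firmware): the device transmits a same-sized report every 10 ms whether or not a key changed, so transmission timing carries no information about when keys were pressed.

```c
#include <stdint.h>
#include <string.h>
#include <time.h>

/* Hypothetical placeholders for the real firmware I/O. */
static void read_current_report(uint8_t out[8]) { memset(out, 0, 8); }
static void send_encrypted_report(const uint8_t report[8]) { (void)report; }

int main(void) {
    const struct timespec tick = { .tv_sec = 0, .tv_nsec = 10 * 1000 * 1000 }; /* 10 ms */
    uint8_t report[8];
    for (;;) {
        read_current_report(report);   /* same-sized report every tick, even when idle */
        send_encrypted_report(report); /* sent at a fixed 100 Hz rate */
        nanosleep(&tick, NULL);        /* timing is independent of key activity */
    }
}
```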
Encryption vs. authentication
I believe both encryption and authentication can be useful here. If there is not enough time for both, I am not 100% sure what to prefer:
- Encryption can work as "poor man's authentication" in some cases. But it can be fragile.
- Authentication without encryption is a pretty clear solution for a reduced threat model -- a malicious USB device, but sys-usb not compromised.
Of course, if possible, I would prefer having both.
Dropped packets
I would not fear assuming that we have no dropped packets. With a proper design, a dropped packet will probably cause the following packets to be rejected because of a wrong sequence number. In such a case, I believe it is OK to reset the connection and maybe inform the user. (Without informing the user, a malicious USBVM could still drop a few packets of its choice without being noticed.)
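A rough sketch of the receiver-side check under that assumption (names are illustrative, not from any existing code): any gap in the per-packet sequence number causes a reset rather than a silent resync.

```c
#include <stdbool.h>
#include <stdint.h>

static uint64_t expected_seq = 0;

/* Hypothetical reset hook: re-run the key exchange and notify the user. */
static void reset_connection(void) { expected_seq = 0; }

bool accept_packet(uint64_t seq) {
    if (seq != expected_seq) {
        /* A dropped or reordered packet: reset instead of resynchronizing
         * silently, so a malicious USBVM cannot drop packets unnoticed. */
        reset_connection();
        return false;
    }
    expected_seq++;
    return true;
}
```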
robertfisk commented May 26, 2017
I don't have much experience writing crypto apps so I'm going to need some advice on choosing the general strategy, crypto algorithms, etc. But I can make some comments on implementation details...
- ST supports PolarSSL / mbedTLS on their STM32 micros. It would make things much easier to use crypto functions supported by this library.
- We have about 95kB of flash (program) space available on the USG v1.0 micros, out of 128kB total. This should be plenty of space for our crypto functions.
- We need to decide which microprocessor performs which crypto operation. The "Upstream" micro talks to the PC, so can be compromised by the sys-usb VM. The "Downstream" micro talks to the keyboard or other USB device, and can be compromised by a malicious device. So we need to answer the question:
Are we protecting our keystrokes from a malicious sys-usb VM, or a malicious device? Or both?
My guess is "both", which means we have to perform nested crypto on both Downstream and Upstream??? Any advice or comments are welcome!
jpouellet (Contributor) commented May 26, 2017
I don't have much experience writing crypto apps so I'm going to need some advice on choosing the general strategy, crypto algorithms, etc.
The universal advice is never roll your own ;)
You may want something simpler than a full TLS stack, in which case NaCl (or its heavier but friendlier-to-use direct descendant libsodium) is probably what you want.
NaCl authors have ported it to even an 8-bit AVR (see https://cryptojedi.org/papers/avrnacl-20130514.pdf), so I'm sure it would be no problem on your big fancy ARM with lots of flash to spare :)
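To give a sense of how small the libsodium surface is compared with a full TLS stack, here is a minimal authenticated-encryption round trip; the 8-byte buffer standing in for a keyboard report (and the freshly generated key) are just assumptions of this example.

```c
#include <sodium.h>
#include <stdio.h>

int main(void) {
    if (sodium_init() < 0) return 1;

    unsigned char key[crypto_secretbox_KEYBYTES];
    unsigned char nonce[crypto_secretbox_NONCEBYTES];
    unsigned char report[8] = {0};   /* stand-in for one HID-style keyboard report */
    unsigned char boxed[crypto_secretbox_MACBYTES + sizeof report];

    crypto_secretbox_keygen(key);          /* the shared secret, however it was agreed */
    randombytes_buf(nonce, sizeof nonce);  /* must never repeat under the same key */

    crypto_secretbox_easy(boxed, report, sizeof report, nonce, key);

    unsigned char opened[sizeof report];
    if (crypto_secretbox_open_easy(opened, boxed, sizeof boxed, nonce, key) != 0) {
        puts("forged or corrupted packet"); /* authentication failed */
        return 1;
    }
    puts("authenticated and decrypted");
    return 0;
}
```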
But I can make some comments on implementation details...
- ST supports PolarSSL / mbedTLS on their STM32 micros. It would make things much easier to use crypto functions supported by this library.
- We have about 95kB of flash (program) space available on the USG v1.0 micros, out of 128kB total. This should be plenty of space for our crypto functions.
- We need to decide which microprocessor performs which crypto operation. The "Upstream" micro talks to the PC, so can be compromised by the sys-usb VM. The "Downstream" micro talks to the keyboard or other USB device, and can be compromised by a malicious device. So we need to answer the question:
- Are we protecting our keystrokes from a malicious sys-usb VM, or a malicious device? Or both?
My guess is "both", which means we have to perform nested crypto on both Downstream and Upstream??? Any advice or comments are welcome!
It is my understanding that there are two primary motivations:
- Securely isolating individual USB devices when you only have a single USB controller to assign via PCI-passthrough to a single sys-usb VM.
- Securely interacting with a USB device without needing to trust our USB controller.
Both of these scenarios treat the devices below the USG as trusted, and distrust the things on the other (computer) side.
If our keyboard or flash drive or whatever itself is malicious, then I do not see what is to be gained by authenticity- and integrity-protecting malicious keystrokes, etc.
This suggests to me that the crypto should be done on the downstream side.
jpouellet (Contributor) commented May 26, 2017
If our keyboard or flash drive or whatever itself is malicious, then I do not see what is to be gained by authenticity- and integrity-protecting malicious keystrokes, etc.
Nor cryptographically protecting the fact that yes, this USB rocket launcher is suddenly trying to also pretend to be a hub and keyboard and mouse. "Great! I've authenticated that it is indeed trying to attack me, and my USB controller with compromised firmware can't see how!" -- I could be wrong, but I see no benefit.
v6ak commented May 26, 2017
For the sake of brevity, I'll define “keyboard system” as our endpoint. It can be either a standard keyboard behind a USG proxy or a keyboard with patched firmware. I assume there is some protection against flashing by a malicious sys-usb, as it would be pointless without that.
- ST supports PolarSSL / mbedTLS on their STM32 micros. It would make things much easier to use crypto functions supported by this library.
I admit I was thinking in a much more embedded style, because I'd like to see it on an Ergodox with a Teensy 2.0 (8-bit AVR ATmega32U4).
Well, if you want full-blown TLS between the keyboard system and dom0, I suspect it would rather be rejected as too complex for dom0, even if you strip TLS down (one TLS version, one ciphersuite, …).
OTOH, implementing everything on low-level crypto primitives is not ideal either. I know (and have seen) many potential vulnerabilities that can occur. Maybe libsodium can offer a good tradeoff; I'll look closer.
While I had an elegant, very embedded idea that would require just one AES block encryption (on properly formatted data) for encryption, authentication and replay-attack prevention (remember, the keyboard sends just 8-byte packets, so there is some space left in a 16-byte block), I am not confident enough to suggest it. I can describe it if someone is confident enough to review it.
- We need to decide which microprocessor performs which crypto operation. The "Upstream" micro talks to the PC, so can be compromised by the sys-usb VM. The "Downstream" micro talks to the keyboard or other USB device, and can be compromised by a malicious device. So we need to answer the question:
I'd assume that dom0 and the keyboard system are trusted and the other parts (sys-usb, other USB devices that can interfere) are untrusted. If you had an untrusted keyboard and wanted to type your password on it and control your whole computer with it, you would probably need some crypto in your head in order to make it secure. ☺
Maybe you are trying to design a single USG device that could work both as a firewall (i.e., for untrusted devices) and as a secure input proxy (i.e., for trusted input devices). I am not sure it is a good idea to have both in a single device:
* You already see some practical issues now.
* This would force you to have two separate modes (HID reverse proxy and firewall) with very different threat models. You would also have to provide some switch between them.
* Maybe we could have a single PCB for both, but there should be different firmware and a different case (e.g., a green case for the USG HID and a red case for the USG firewall). This can prevent accidental misuse.
When considering two separate devices (USG HID and USG firewall), I am not sure the USG HID really needs two separate CPUs. It should not need to process much untrusted input. More importantly, the input is not going to be complex, but rather something fixed-length. Most importantly, the untrusted input it would process is mostly going to be crypto-related, which cannot be moved to a separate untrusted CPU.
# A few other notes
## Side channels
We can measure the time of crypto operations. As we are not running on any OS and we are encrypting fixed-length data, we should ideally get constant-time encryption. In the real world, some interrupts might make it vary, but maybe we can create some testing environment that does not have such issues.
Power-related side channels might be a bit trickier to detect. OTOH, one capacitor could make them noisy enough.
## More complex keyboards
Some keyboards do not behave just as one keyboard; they also have multimedia keys, scroll keys, a trackpoint, a touchpad, etc. So they can behave as multiple keyboards and a mouse in one device. I'm not suggesting those have to be supported in the first version; I'm just foreseeing some challenges that could affect hardware selection. For example, an Arduino Due would need the keyboard to behave just as a single device.
jpouellet (Contributor) commented May 26, 2017
Most importantly, the untrusted input it would process is rather going to be crypto-related, which cannot be moved to a separate untrusted CPU.
Can you elaborate on this point? I don't see why it couldn't be moved to e.g. a third processor between the two, isolating from memory corruption issues in either side's USB stack.
v6ak commented May 27, 2017
OK, the USB protocol itself was something I did not have so much in mind. So, the 3-CPU setup could look like this:
1. One CPU (USB host) handles translation between the USB keyboard and some simpler protocol that contains just the keypresses, reported as 8-byte chunks (modifiers + reserved byte + up to six keys), plus initial metadata plus LED control (see the sketch after this list).
2. One CPU handles encryption and authentication. This would be the trusted one. It would handle very little untrusted input (provided that we trust the keyboard), mostly some handshakes (a nonce from dom0 for initialization, the initial key exchange) and LED status*. And even if we consider the keyboard untrusted, it would not perform difficult processing on its input, just encrypting its keypresses and maybe its initial message**.
3. One CPU (USB guest) handles USB enumeration etc. and translates the simple protocol to USB.
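For illustration, the frame carried between the CPUs could look roughly like this; the field names and the sequence-number field are made up for the sketch, and only the modifiers + reserved byte + six key codes layout comes from the standard HID boot-keyboard report.

```c
#include <stdint.h>

/* 8-byte payload, laid out like a HID boot-keyboard report. */
typedef struct {
    uint8_t modifiers;   /* bitmask: left/right Ctrl, Shift, Alt, GUI */
    uint8_t reserved;
    uint8_t keys[6];     /* up to six simultaneously pressed key codes */
} hid_report_t;

_Static_assert(sizeof(hid_report_t) == 8, "report must be exactly 8 bytes");

/* One frame of the hypothetical simple protocol between the CPUs. */
typedef struct {
    uint32_t     seq;    /* sequence number for replay/drop detection */
    hid_report_t report; /* the keypress payload */
} simple_frame_t;
```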
But when looking at it, I see:
* The first and third CPUs could maybe be replaced with some special chip like an FTDI.
* If we consider the keyboard trusted, there is no security reason for splitting the first and the second CPU. We could utilize the fact that some CPUs have built-in USB support.
*) Well, LED status is not something I've thought deeply about. I prefer a stateless keyboard and don't use (or even have) status LEDs on my keyboard. I don't think they usually contain very sensitive information, and obscuring side channels might look a bit paranoid here.
**) I haven't thought much about reporting keyboard capabilities. But maybe the first version can skip it entirely if it poses some challenge.
robertfisk commented Jun 3, 2017
Warning: epic post below...
Ok, so our threat model is a compromised sys-usb that wants to sniff our keys, and we assume the keyboard firmware is not compromised (either our own firmware, or protected by a USG firewall and clean from the factory (!)). If both sys-usb and the keyboard firmware are infected with cooperating malware, the task becomes more difficult (see issues below).
Libsodium looks fine to me, and its license is compatible with the (slightly weird) ST middleware license as used by the USG.
Issues
Here are a few issues as I see them:
1. The crypto will have to protect against a hostile sys-usb performing MITM. A Diffie-Hellman key exchange requires a public/private keypair on each end, so both Dom0 and the keyboard encryption firmware will hold a private key that is not known to sys-usb. But the Dom0 code and firmware code are public, so the attacker can find them. Unless both ends contain a uniquely generated keypair signed by another master key? This sounds like it will cause some logistic difficulties for both ends...
2. If our encryption device does not use a 3-CPU fully isolated architecture, it is essential that we protect the keyboard's firmware from attack by sys-usb. If both the keyboard and sys-usb are infected with cooperating malware, a 1-CPU or 2-CPU encryption system is vulnerable to compromise by the keyboard firmware. The encryption can then be disabled.
3. I need to remind everyone that prototyping and preparing hardware for production costs real money. Think hundreds of dollars for a simple one-chip ATMEGA board, up to thousands for more complex designs. I have already paid this cost while developing the USG, which can be used with any operating system. If we limit the market to only Qubes users, the volumes will be very small and only the simplest designs would be possible. In particular, a 3-CPU solution would have to sell for over US$100 each. This would then reduce the sales volume even more, and reduce the number of people able to use the enhanced security.
Switching encryption mode
Assuming sys-usb is hostile, our keyboard firmware must send only encrypted keystrokes. Otherwise sys-usb could report to Dom0 that a normal keyboard is connected, report to the keyboard that it is connected to a non-Qubes system, and then proceed to sniff the unencrypted keystrokes. But interacting with the BIOS and GRUB boot menu requires an unencrypted keyboard. So the user will have to unplug/replug the encryption device, or move a switch to change between encrypted and unencrypted modes.
The USG can be programmed with encrypted-keyboard firmware while still providing firewall functionality. In this case the keyboard encryption would be always-on, and switching to unencrypted mode would require replugging the keyboard into a standard USG, or unprotected into sys-usb (a really bad idea, see issue 2 above). Ideally the encryption could be enabled with a switch, but we will also need a really f'n bright green/red LED to remind the user what mode they are in. I plan to implement the switch/LED modification in the future, as it is also useful for a read-only mass storage mode.
Hardware
So we come to the question of which hardware architecture to choose. The 3-CPU solution may be the most robust against attacks, but as discussed in issue 3 above is impractical for cost reasons.
A 2-CPU solution like the USG protects the encryption from attacks by a hostile sys-usb. It does not protect the encryption from a hostile keyboard, or hostile sys-usb/keyboard sandwich (issue 2 above). Therefore Dom0 decryption code should be resistant to exploits in the encrypted payload, and the user should take steps to ensure that sys-usb cannot compromise their keyboard directly. The main advantage is that the hardware is already developed at no cost to Qubes users.
The 1-CPU solution needs some discussion. It is simple enough that it may be possible to design Qubes-specific hardware. Although the low volumes and desired low sale price means it will have to be done on a volunteer basis, I cannot say I will have time to do this! Perhaps we could find an off-the-shelf solution like an Arduino with separate device and host ports.
Then we need to ask if the 1-CPU solution is secure against attacks from a hostile sys-usb. I cannot say this for certain! One would have to very closely analyse the USB embedded device driver stack, and even then you cannot analyse the internal state of the USB device controller hardware (the hardware has its own state machine). The USG was designed with the assumption that any USB connection (device or host) will allow compromise, and I think we should make the same assumption here. This rules out a simple 1-CPU design.
But we can also use an FTDI chip on the computer side, as mentioned by v6ak. I will call this the 1.5-CPU option. You could do this with an off-the-shelf FTDI serial converter and an Arduino with an OTG or host port. Because we are making a non-standard keyboard, we can ask sys-usb to look for an FTDI serial port instead of a standard USB keyboard. This will isolate the keyboard encryption from a malicious sys-usb in the same way as the 2-CPU solution. However, there is one disadvantage: we are introducing some untrusted firmware into our system, contained in the FTDI chip. This cannot attack our keyboard encryption, but it can attack sys-usb. In this respect, a 2-CPU solution like the USG is safer overall than a 1.5-CPU solution.
The only reason we would choose a 1.5-CPU solution is if it was significantly cheaper. It may be cheaper at high volumes, but I don't think we have that market. You also need someone to pay the development costs, and it won't be me because once was enough! You would also have the months-long hardware development process, and the task of re-creating the functions of the USG firmware in another embedded processor. For these reasons I suggest using the USG as our hardware platform.
Moving Forward
I can develop USG firmware to support this feature, but I won't be able to do the Dom0 or sys-usb code. If someone wants to do that, we can work on the encryption details and how the keyboard will report itself to sys-usb.
andrewdavidwong added the help wanted label Jun 4, 2017
andrewdavidwong added this to the Far in the future milestone Jun 4, 2017
added a commit that referenced this issue Jun 4, 2017
v6ak commented Jun 4, 2017
In the meantime, I've tried to sniff USB keyboard traffic, and it does not look like it polls the way I thought. The polling mechanism is somewhat lower-level.
Anyway, this doesn't mean we cannot go this way. We could, however, introduce some power-save mode like "if there isn't any new keystroke for 5 s, don't transfer anything".
If both sys-usb and the keyboard firmware are infected with cooperating malware
I would skip this case, as an infected keyboard is already bad enough.
Libsodium looks fine to me
On AVR, it preliminarily looks fine, except for the RNG, which looks a bit like a challenge.
When mentioning AVRs: they use a Harvard architecture, which can theoretically make them safer against buffer-overflow attacks – provided that we lock the flash memory. Thanks to the separate memory for program and data, an attacker can jump only to code, not to arbitrary data, which can limit the impact of a buffer overflow. But without making the device non-updateable, an attacker can still just jump to the bootloader: https://arxiv.org/pdf/0901.3482
On issue 1 (key exchange): This is a pretty standard issue with existing solutions:
a. Just make it configurable. It does not even need DH. This is a PoC-style solution and requires some care (e.g., a separate trusted USBVM).
b. TOFU (Trust On First Use) – just assume that the first connection is secure. Just a more friendly version of the former. Note: in this case, neither dom0 nor the keyboard system should blindly accept any request for a new pairing, as this would allow renegotiation attacks.
c. Full solution inspired by Bluetooth SPP. The DH outputs not only the key, but also some verification code, which matches on both ends. The user would need to type the code from dom0 on the keyboard. The code wouldn't be sent, just verified by the CPU that handles crypto (see the sketch after this list).
Note that there might be some issues on reduced keyboards, like not being able to type the code. But this can easily be solved by using a different keyboard when pairing.
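A rough sketch of option (c) using libsodium's key-exchange API; deriving the displayed code from the two public keys is only meant to illustrate the idea (it is not a vetted pairing protocol), and which side acts as "client" is an arbitrary choice of the example.

```c
#include <sodium.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (sodium_init() < 0) return 1;

    /* Each side generates an ephemeral keypair and sends the public half. */
    unsigned char kb_pk[crypto_kx_PUBLICKEYBYTES], kb_sk[crypto_kx_SECRETKEYBYTES];
    unsigned char d0_pk[crypto_kx_PUBLICKEYBYTES], d0_sk[crypto_kx_SECRETKEYBYTES];
    crypto_kx_keypair(kb_pk, kb_sk);   /* keyboard system */
    crypto_kx_keypair(d0_pk, d0_sk);   /* dom0 */

    /* Session keys each side would use for the encrypted channel afterwards. */
    unsigned char kb_rx[crypto_kx_SESSIONKEYBYTES], kb_tx[crypto_kx_SESSIONKEYBYTES];
    unsigned char d0_rx[crypto_kx_SESSIONKEYBYTES], d0_tx[crypto_kx_SESSIONKEYBYTES];
    crypto_kx_client_session_keys(kb_rx, kb_tx, kb_pk, kb_sk, d0_pk);
    crypto_kx_server_session_keys(d0_rx, d0_tx, d0_pk, d0_sk, kb_pk);

    /* Verification code: hash both public keys (same order on both ends) and
     * reduce to six digits the user can compare or type on the keyboard. */
    unsigned char both[2 * crypto_kx_PUBLICKEYBYTES];
    unsigned char digest[crypto_generichash_BYTES];
    memcpy(both, kb_pk, crypto_kx_PUBLICKEYBYTES);
    memcpy(both + crypto_kx_PUBLICKEYBYTES, d0_pk, crypto_kx_PUBLICKEYBYTES);
    crypto_generichash(digest, sizeof digest, both, sizeof both, NULL, 0);

    uint32_t code = ((uint32_t)digest[0] << 16 | (uint32_t)digest[1] << 8 | digest[2]) % 1000000;
    printf("verification code: %06u\n", (unsigned)code);
    return 0;
}
```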
On issue 2: As I've said, I would just assume the keyboard is not compromised.
But interacting with the bios and Grub boot menu requires an unencrypted keyboard. So the user will have to unplug/replug the encryption device, or move a switch to change between encrypted and unencrypted modes.
I'm OK with such limitations. For laptops, it can also be solved by using the internal keyboard.
The USG can be programmed with encrypted-keyboard firmware while still providing firewall functionality.
I am not sure if I understand this part, but I see it as complex and error-prone from the user's perspective, without gaining much:
* If the user forgets to switch, she essentially connects an untrusted device as a keyboard attached to dom0.
* The CPU that handles the untrusted device will later handle a keyboard. I hope you see the risk here. Maybe a write-protected CPU (ideally of Harvard architecture) with appropriate reboots can mitigate it.
* What would the usage scenario for a universal device look like? It sounds like “So, I need to connect an untrusted USB flash drive, so I'll temporarily disconnect the keyboard, flip a USG switch and connect the flash drive.” Maybe I overlook something, but this sounds crazy.
I can develop USG firmware to support this feature, but I won't be able to do the Dom0 or sys-usb code.
I hope the sys-usb part will not be much harder than a USB hello world (after all, it is just a stupid proxy that is not supposed to understand the data much), and the dom0 part could be inspired by the qubes-input-proxy-receiver. Sure, this will require some experimenting for me, but it looks doable.
Regards,
Vít Šesták 'v6ak'
added a commit that referenced this issue Jun 4, 2017
robertfisk commented Jun 8, 2017
It sounds like you have some specific hardware in mind. Is this a keyboard or an inline USB device? Off-the-shelf? Development board? Any user assembly required?
v6ak commented Jun 12, 2017
What makes it sound like I have specific hardware in mind? Well, there are two places where I thought about some specific hardware:
* Arduino Due, when mentioning its limitations.
* Teensy 2, when thinking how to implement it directly in my external keyboard (Ergodox) without any additional proxy.
When thinking about the protocol, I'd first design some variant that assumes a secret has already been transferred. I'd skip real-time properties (so a malicious sys-usb could delay any keypress, provided it is not reordered) and power saving. I'd go with a fixed polling rate of 100 Hz. I'd address authenticity, confidentiality (including selected side channels) and replay-attack prevention. Among side channels, I'd like to prevent timing attacks but ignore attacks based on power analysis. I'd also like to focus on keypresses and not on LED indicators. This is not to say we cannot widen the scope in the future; I just want to focus on the important things.
So, let's assume we have some shared secret and we want to connect the keyboard. We need a unique nonce to be generated on each side (dom0 and the keyboard system). Without a nonce generated by the keyboard system, a malicious sys-usb could force the keyboard into key/IV reuse. Without a nonce generated by dom0, an attacker could replay some key sequences to dom0.
So, we have a long-lived secret and not-so-secret nonces from both sides, and the parties now have enough information to communicate with each other according to some symmetric encryption scheme. We can either:
a. Use the long-lived secret as the key. In this case, we have to be more careful about IVs in order not to repeat them across sessions.
b. Derive a key from all the data (the long-lived secret and both nonces), so we don't have to care about reusing IVs across sessions, as we use a different key. Note that we can use a simple hash function like SHA-2 (not PBKDF2/bcrypt/scrypt), as we don't derive the key from a low-entropy source. (A sketch follows below.)
The latter approach seems a bit more universal and might have higher performance in some cases.
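A minimal sketch of option (b), assuming a 32-byte long-lived secret and 16-byte nonces (sizes and names are assumptions of the example), with libsodium's keyed BLAKE2b (crypto_generichash) standing in for the simple hash/KDF:

```c
#include <sodium.h>
#include <string.h>

/* Derive a per-session key from the long-lived secret and both nonces.
 * A fresh nonce from either side yields a fresh session key, so IV reuse
 * across sessions stops being a concern. */
void derive_session_key(unsigned char session_key[crypto_generichash_BYTES],
                        const unsigned char long_lived_secret[32],
                        const unsigned char nonce_dom0[16],
                        const unsigned char nonce_keyboard[16])
{
    unsigned char nonces[16 + 16];
    memcpy(nonces, nonce_dom0, 16);
    memcpy(nonces + 16, nonce_keyboard, 16);

    /* Keyed hash: without the long-lived secret the output is unpredictable. */
    crypto_generichash(session_key, crypto_generichash_BYTES,
                       nonces, sizeof nonces,
                       long_lived_secret, 32);
}
```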
We have a few options mentioned in [symmetric authenticated encryption](https://download.libsodium.org/doc/secret-key_cryptography/authenticated_encryption.html) (one option) and [AEAD](https://download.libsodium.org/doc/secret-key_cryptography/aead.html) (four options):
* Performance estimate: accelerated AES > ChaCha20 > unaccelerated AES; AEAD is usually faster than separate encryption and authentication.
* Side-channel resistance: separate encryption and authentication (if it is encrypt-then-MAC) is usually stronger. Unaccelerated AES is questionable.
* How well the ciphers are reviewed, IMHO: AES (OK) > ChaCha20 (probably OK) > XChaCha20 (questionable).
So, the ciphersuite will probably also depend on AES hardware acceleration and its support in libsodium. OTOH, if we pick one today, it should be easy to change it in the future.
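For concreteness, here is a sketch of sealing one 8-byte report with one of the AEAD options above (IETF ChaCha20-Poly1305). Using the packet sequence number as the nonce is an assumption of this example; it is only safe if the counter never repeats under the same session key, and a real protocol would also fix the byte order.

```c
#include <sodium.h>
#include <stdint.h>
#include <string.h>

/* Seal one 8-byte keyboard report; the output carries the 16-byte Poly1305 tag. */
int seal_report(unsigned char out[8 + crypto_aead_chacha20poly1305_ietf_ABYTES],
                const unsigned char report[8],
                const unsigned char key[crypto_aead_chacha20poly1305_ietf_KEYBYTES],
                uint64_t seq)
{
    unsigned char nonce[crypto_aead_chacha20poly1305_ietf_NPUBBYTES] = {0};
    memcpy(nonce, &seq, sizeof seq);   /* 64-bit sequence number as the 96-bit nonce */

    unsigned long long out_len;
    return crypto_aead_chacha20poly1305_ietf_encrypt(
        out, &out_len,
        report, 8,
        NULL, 0,                       /* no additional data in this sketch */
        NULL, nonce, key);
}
```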
added a commit that referenced this issue Jun 15, 2017
jpouellet (Contributor) commented Jul 27, 2017
For the crypto between the trusted side of the device and the VM, it may be worth de-duplicating effort by stealing the design from the CrypTech people, who are designing a secure channel between their HSM and its client. I think the goals seem sufficiently aligned, and I believe the CrypTech folks have enough competent engineers and interested eyeballs to get the details right.
https://wiki.cryptech.is/wiki/SecureChannel was just posted on the CrypTech mailing list. I will cross-reference there in case anyone there is interested in this as well.
andrewdavidwong commented Dec 13, 2016
This is a tracking issue for a community-developed feature.
Description (#2507 (comment)):