Let's reflect on some of my recent work that started with understanding Trisquel GNU/Linux, improving transparency into apt-archives, working on reproducible builds of Trisquel, strengthening verification of apt-archives with Sigstore, and finally thinking about security device threat models. A theme in all this is improving methods to have trust in machines, or generally any external entity. While I believe that everything starts by trusting something, usually something familiar and well-known, we need to deal with misuse of that trust, which leads to failure to deliver what is desired and expected from the trusted entity. How can an entity behave to invite trust? Let's argue for some properties that can be quantitatively measured, with a focus on computer software and hardware:
Deterministic Behavior: given a set of circumstances, it should behave the same.
Verifiability and Transparency: the method (the source code) should be accessible for understanding (compare the scientific method) and its binaries verifiable, i.e., it should be possible to verify that the entity actually follows the intended deterministic method (implying efforts like reproducible builds and bootstrappable builds).
Accountable: the entity should behave the same for everyone, and deviation should be possible to prove in a way that is hard to deny, implying efforts such as Certificate Transparency and more generic checksum logs like Sigstore and Sigsum.
Liberating: the tools and documentation should be available as free software to enable you to replace the trusted entity if so desired. An entity that wants to restrict you from being able to replace it is vulnerable to corruption and may stop acting trustworthy. This point of view reinforces that open source misses the point; it has become too common to use trademark law to restrict re-use of open source software (e.g., Firefox, Chrome, Rust).
Essentially, this boils down to: Trust, Verify and Hold Accountable. To put this dogma in perspective, it helps to understand that this approach may be harmful to human relationships (which could explain the social awkwardness of hackers), but it remains useful as a method to improve the design of computer systems and to evaluate their safety. When a system fails some of the criteria above, we know we have more work to do to improve it.
How far have we come on this journey? Through earlier efforts, we are in a fairly good situation. Richard Stallman through GNU/FSF made us aware of the importance of free software, the Reproducible/Bootstrappable Builds projects made us aware of the importance of verifiability, and Certificate Transparency highlighted the need for accountable signature logs, leading to efforts like Sigstore for software. None of these efforts would have seen the light of day unless people had written free software, packaged it into distributions that we can use, and built hardware that we can run it on. While there certainly exists more work to be done on the software side, with the recent amazing full-source build of Guix based on a 357-byte hand-written seed, I believe that we are closing that loop on the software engineering side.
So what remains? Some inspiration for further work:
Accountable binary software distribution remains unresolved in practice, although we have some software components around (e.g., apt-sigstore and guix git authenticate). What is missing is using them for verification by default, improving the signature process to use trustworthy hardware devices, and committing the signatures to transparency logs.
Trustworthy hardware to run trustworthy software on remains a challenge, and we owe the FSF's Respects Your Freedom program credit for raising awareness of this. Many modern devices require non-free software to work, which fails most of the criteria above and makes them inherently untrustworthy.
Verifying rebuilds of currently published binaries on trustworthy hardware is unresolved.
Completing a full-source rebuild from a small seed on trustworthy hardware remains to be done, preferably on a platform wildly different from x86, such as Raptor's Talos II.
We need improved security hardware devices and improved established practices on how to use them. For example, while Gnuk on the FST-01 enables a trustworthy software and hardware solution, the best process for using it that I can think of involves generating the cryptographic keys on a more complex device. Efforts like Tillitis are inspiring here.
Onwards and upwards, happy hacking!
Update 2023-05-03: Added the Liberating property regarding free software, instead of having it be part of Verifiability and Transparency.
I use GnuPG to compute cryptographic signatures for my emails, git commits/tags, and software release artifacts (tarballs). Part of GnuPG is gpg-agent, which talks to OpenSSH and which I use to log in to remote servers and to clone git repositories. I dislike storing cryptographic keys on general-purpose machines, and have used hardware-backed OpenPGP keys since around 2006 when I got a FSFE Fellowship Card. GnuPG via gpg-agent handles this well, and the private key never leaves the hardware. The ZeitControl cards were (to my knowledge) proprietary hardware running some non-free operating system and OpenPGP implementation. By late 2012 the YubiKey NEO supported OpenPGP, and while its hardware and operating system were not free, at least it ran a free software OpenPGP implementation, and eventually I set up my primary RSA key on it. This worked well for a couple of years, and when I wished to migrate to a new key in 2019, the FST-01G device, with open hardware running free software that supported Ed25519, had become available. I created a key and have been using the FST-01G on my main laptop since then. This little device has been working well; the signature counter on it is around 14501, which means around 10 signatures/day since then!
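For context, the gpg-agent/OpenSSH integration amounts to something like the following (a hedged sketch; the shell startup file and exact invocation vary between setups and distributions):

$ echo "enable-ssh-support" >> ~/.gnupg/gpg-agent.conf
$ echo 'export SSH_AUTH_SOCK=$(gpgconf --list-dirs agent-ssh-socket)' >> ~/.bashrc
$ gpg-connect-agent reloadagent /bye

With that in place, ssh and git pick up the authentication subkey on the card through the agent socket.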
Currently I am in the process of migrating towards a new laptop, and moving the FST-01G device between them is cumbersome, especially if I want to use both laptops in parallel. That's why I need to set up a new hardware device to hold my OpenPGP key, which can go with my new laptop. This is a good time to re-visit alternatives. I quickly decided that I did not want to create a new key, only to import my current one to keep everything working. My requirements on the device to choose haven't changed since 2019, see my summary at the end of the earlier blog post. Unfortunately the FST-01G is out of stock and the newer FST-01SZ is also out of stock. While Tillitis looks promising (and I have one to play with), it does not support OpenPGP (yet). What to do? Fortunately, I found an FST-01SZ device in my drawer, and decided to use it pending a more satisfactory answer. Hopefully once I get around to generating a new OpenPGP key in a year or so, I will do a better survey of the options available on the market then. What are your (freedom-respecting) OpenPGP hardware recommendations?
Similar to setting up the FST-01G, the FST-01SZ needs to be set up before use. I'm doing the following from Trisquel 11 but any GNU/Linux system would work. When the device is inserted for the first time, some kernel messages are shown (see /var/log/syslog or use the dmesg command):
usb 3-3: new full-speed USB device number 39 using xhci_hcd
usb 3-3: New USB device found, idVendor=234b, idProduct=0004, bcdDevice= 2.00
usb 3-3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
usb 3-3: Product: Fraucheky
usb 3-3: Manufacturer: Free Software Initiative of Japan
usb 3-3: SerialNumber: FSIJ-0.0
usb-storage 3-3:1.0: USB Mass Storage device detected
scsi host1: usb-storage 3-3:1.0
scsi 1:0:0:0: Direct-Access FSIJ Fraucheky 1.0 PQ: 0 ANSI: 0
sd 1:0:0:0: Attached scsi generic sg2 type 0
sd 1:0:0:0: [sdc] 128 512-byte logical blocks: (65.5 kB/64.0 KiB)
sd 1:0:0:0: [sdc] Write Protect is off
sd 1:0:0:0: [sdc] Mode Sense: 03 00 00 00
sd 1:0:0:0: [sdc] No Caching mode page found
sd 1:0:0:0: [sdc] Assuming drive cache: write through
sdc:
sd 1:0:0:0: [sdc] Attached SCSI removable disk
Interestingly, the NeuG software installed on the device I got appears to be version 1.0.9:
jas@kaka:~$ head /media/jas/Fraucheky/README
NeuG - a true random number generator implementation
Version 1.0.9
2018-11-20
Niibe Yutaka
Free Software Initiative of Japan
What's NeuG?
============
jas@kaka:~$
I could not find version 1.0.9 published anywhere, but the device came with an SD-card that contains a copy of the source, so I uploaded it until a more canonical place is located. Putting the device in serial mode can be done with a sudo eject /dev/sdc command, which results in the following syslog output.
usb 3-3: reset full-speed USB device number 39 using xhci_hcd
usb 3-3: device firmware changed
usb 3-3: USB disconnect, device number 39
sdc: detected capacity change from 128 to 0
usb 3-3: new full-speed USB device number 40 using xhci_hcd
usb 3-3: New USB device found, idVendor=234b, idProduct=0001, bcdDevice= 2.00
usb 3-3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
usb 3-3: Product: NeuG True RNG
usb 3-3: Manufacturer: Free Software Initiative of Japan
usb 3-3: SerialNumber: FSIJ-1.0.9-42315277
cdc_acm 3-3:1.0: ttyACM0: USB ACM device
Now download Gnuk, verify its integrity and build it. You may need some additional packages installed, try apt-get install gcc-arm-none-eabi openocd python3-usb. As you can see, I'm using the stable 1.2 branch of Gnuk, currently on version 1.2.20. The ./configure parameters deserve some explanation. The kdf_do=required sets up the device to require KDF usage. The --enable-factory-reset allows me to use the command factory-reset (with admin PIN) inside gpg --card-edit to completely wipe the card. Some may consider that too dangerous, but my view is that if someone has your admin PIN it is game over anyway. The --vidpid=234b:0000 specifies the USB VID/PID to use, and --target=FST_01SZ is critical to set the platform (you may brick the device if you pick the wrong --target setting).
jas@kaka:~/src$ rm -rf gnuk neug
jas@kaka:~/src$ git clone https://gitlab.com/jas/neug.git
Cloning into 'neug'...
remote: Enumerating objects: 2034, done.
remote: Counting objects: 100% (2034/2034), done.
remote: Compressing objects: 100% (603/603), done.
remote: Total 2034 (delta 1405), reused 2013 (delta 1405), pack-reused 0
Receiving objects: 100% (2034/2034), 910.34 KiB | 3.50 MiB/s, done.
Resolving deltas: 100% (1405/1405), done.
jas@kaka:~/src$ git clone https://salsa.debian.org/gnuk-team/gnuk/gnuk.git
Cloning into 'gnuk'...
remote: Enumerating objects: 13765, done.
remote: Counting objects: 100% (959/959), done.
remote: Compressing objects: 100% (337/337), done.
remote: Total 13765 (delta 629), reused 907 (delta 599), pack-reused 12806
Receiving objects: 100% (13765/13765), 12.59 MiB | 3.05 MiB/s, done.
Resolving deltas: 100% (10077/10077), done.
jas@kaka:~/src$ cd neug
jas@kaka:~/src/neug$ git describe
release/1.0.9
jas@kaka:~/src/neug$ git tag -v `git describe`
object 5d51022a97a5b7358d0ea62bbbc00628c6cec06a
type commit
tag release/1.0.9
tagger NIIBE Yutaka <gniibe@fsij.org> 1542701768 +0900
Version 1.0.9.
gpg: Signature made Tue Nov 20 09:16:08 2018 CET
gpg: using EDDSA key 249CB3771750745D5CDD323CE267B052364F028D
gpg: issuer "gniibe@fsij.org"
gpg: Good signature from "NIIBE Yutaka <gniibe@fsij.org>" [unknown]
gpg: aka "NIIBE Yutaka <gniibe@debian.org>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: 249C B377 1750 745D 5CDD 323C E267 B052 364F 028D
jas@kaka:~/src/neug$ cd ../gnuk/
jas@kaka:~/src/gnuk$ git checkout STABLE-BRANCH-1-2
Branch 'STABLE-BRANCH-1-2' set up to track remote branch 'STABLE-BRANCH-1-2' from 'origin'.
Switched to a new branch 'STABLE-BRANCH-1-2'
jas@kaka:~/src/gnuk$ git describe
release/1.2.20
jas@kaka:~/src/gnuk$ git tag -v `git describe`
object 9d3c08bd2beb73ce942b016d4328f0a596096c02
type commit
tag release/1.2.20
tagger NIIBE Yutaka <gniibe@fsij.org> 1650594032 +0900
Gnuk: Version 1.2.20
gpg: Signature made Fri Apr 22 04:20:32 2022 CEST
gpg: using EDDSA key 249CB3771750745D5CDD323CE267B052364F028D
gpg: Good signature from "NIIBE Yutaka <gniibe@fsij.org>" [unknown]
gpg: aka "NIIBE Yutaka <gniibe@debian.org>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: 249C B377 1750 745D 5CDD 323C E267 B052 364F 028D
jas@kaka:~/src/gnuk/src$ git submodule update --init
Submodule 'chopstx' (https://salsa.debian.org/gnuk-team/chopstx/chopstx.git) registered for path '../chopstx'
Cloning into '/home/jas/src/gnuk/chopstx'...
Submodule path '../chopstx': checked out 'e12a7e0bb3f004c7bca41cfdb24c8b66daf3db89'
jas@kaka:~/src/gnuk$ cd chopstx
jas@kaka:~/src/gnuk/chopstx$ git describe
release/1.21
jas@kaka:~/src/gnuk/chopstx$ git tag -v `git describe`
object e12a7e0bb3f004c7bca41cfdb24c8b66daf3db89
type commit
tag release/1.21
tagger NIIBE Yutaka <gniibe@fsij.org> 1650593697 +0900
Chopstx: Version 1.21
gpg: Signature made Fri Apr 22 04:14:57 2022 CEST
gpg: using EDDSA key 249CB3771750745D5CDD323CE267B052364F028D
gpg: Good signature from "NIIBE Yutaka <gniibe@fsij.org>" [unknown]
gpg: aka "NIIBE Yutaka <gniibe@debian.org>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: 249C B377 1750 745D 5CDD 323C E267 B052 364F 028D
jas@kaka:~/src/gnuk/chopstx$ cd ../src
jas@kaka:~/src/gnuk/src$ kdf_do=required ./configure --enable-factory-reset --vidpid=234b:0000 --target=FST_01SZ
Header file is: board-fst-01sz.h
Debug option disabled
Configured for bare system (no-DFU)
PIN pad option disabled
CERT.3 Data Object is NOT supported
Card insert/removal by HID device is NOT supported
Life cycle management is supported
Acknowledge button is supported
KDF DO is required before key import/generation
jas@kaka:~/src/gnuk/src$ make | less
jas@kaka:~/src/gnuk/src$ cd ../regnual/
jas@kaka:~/src/gnuk/regnual$ make | less
jas@kaka:~/src/gnuk/regnual$ cd ../../
jas@kaka:~/src$ sudo python3 neug/tool/neug_upgrade.py -f gnuk/regnual/regnual.bin gnuk/src/build/gnuk.bin
gnuk/regnual/regnual.bin: 4608
gnuk/src/build/gnuk.bin: 109568
CRC32: b93ca829
Device:
Configuration: 1
Interface: 1
20000e00:20005000
Downloading flash upgrade program...
start 20000e00
end 20002000
# 20002000: 32 : 4
Run flash upgrade program...
Wait 1 second...
Wait 1 second...
Device:
08001000:08020000
Downloading the program
start 08001000
end 0801ac00
jas@kaka:~/src$
The kernel log will contain the following, and the card is ready to use as an OpenPGP card. You may unplug it and re-insert it as you wish.
usb 3-3: reset full-speed USB device number 41 using xhci_hcd
usb 3-3: device firmware changed
usb 3-3: USB disconnect, device number 41
usb 3-3: new full-speed USB device number 42 using xhci_hcd
usb 3-3: New USB device found, idVendor=234b, idProduct=0000, bcdDevice= 2.00
usb 3-3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
usb 3-3: Product: Gnuk Token
usb 3-3: Manufacturer: Free Software Initiative of Japan
usb 3-3: SerialNumber: FSIJ-1.2.20-42315277
Setting up the card is the next step, and there are many tutorials around for this; eventually I settled on the following sequence. Let's start with setting the admin PIN. First make sure that neither pcscd nor scdaemon is running, which is good hygiene since those processes cache some information and a stale connection easily leads to confusion. Cache invalidation, sigh.
jas@kaka:~$ gpg-connect-agent "SCD KILLSCD" "SCD BYE" /bye
jas@kaka:~$ ps auxww | grep -e pcsc -e scd
jas 30221 0.0 0.0 3468 1692 pts/3 R+ 11:49 0:00 grep --color=auto -e pcsc -e scd
jas@kaka:~$ gpg --card-edit
Reader ...........: 234B:0000:FSIJ-1.2.20-42315277:0
Application ID ...: D276000124010200FFFE423152770000
Application type .: OpenPGP
Version ..........: 2.0
Manufacturer .....: unmanaged S/N range
Serial number ....: 42315277
Name of cardholder: [not set]
Language prefs ...: [not set]
Salutation .......:
URL of public key : [not set]
Login data .......: [not set]
Signature PIN ....: forced
Key attributes ...: rsa2048 rsa2048 rsa2048
Max. PIN lengths .: 127 127 127
PIN retry counter : 3 3 3
Signature counter : 0
KDF setting ......: off
Signature key ....: [none]
Encryption key....: [none]
Authentication key: [none]
General key info..: [none]
gpg/card> admin
Admin commands are allowed
gpg/card> kdf-setup
gpg/card> passwd
gpg: OpenPGP card no. D276000124010200FFFE423152770000 detected
1 - change PIN
2 - unblock PIN
3 - change Admin PIN
4 - set the Reset Code
Q - quit
Your selection? 3
PIN changed.
1 - change PIN
2 - unblock PIN
3 - change Admin PIN
4 - set the Reset Code
Q - quit
Your selection?
Now it would be natural to set up the PIN and reset code. However, the Gnuk software is configured to not allow this until the keys are imported. You would get the following somewhat cryptic error messages if you try. This took me a while to understand, since this is device-specific, and some other OpenPGP implementations allow you to configure a PIN and reset code before key import.
Your selection? 4
Error setting the Reset Code: Card error
1 - change PIN
2 - unblock PIN
3 - change Admin PIN
4 - set the Reset Code
Q - quit
Your selection? 1
Error changing the PIN: Conditions of use not satisfied
1 - change PIN
2 - unblock PIN
3 - change Admin PIN
4 - set the Reset Code
Q - quit
Your selection? q
Continue to configure the card and make it ready for key import. Some settings deserve comments. The lang field may be used to set up the language, but I have rarely seen it used, and I set it to sv (Swedish) mostly to be able to experiment if any software adheres to it. The URL is important: it should point to somewhere your public key is stored, and the fetch command of gpg --card-edit downloads it and sets up GnuPG with it when you are on a clean new laptop. The forcesig command changes the default so that a PIN code is not required for every digital signature operation; remember that I averaged 10 signatures per day for the past 2-3 years? Think of the wasted energy typing those PIN codes every time! Changing the cryptographic key type is required since I am importing 25519-based keys.
gpg/card> name
Cardholder's surname: Josefsson
Cardholder's given name: Simon
gpg/card> lang
Language preferences: sv
gpg/card> sex
Salutation (M = Mr., F = Ms., or space): m
gpg/card> login
Login data (account name): jas
gpg/card> url
URL to retrieve public key: https://josefsson.org/key-20190320.txt
gpg/card> forcesig
gpg/card> key-attr
Changing card key attribute for: Signature key
Please select what kind of key you want:
(1) RSA
(2) ECC
Your selection? 2
Please select which elliptic curve you want:
(1) Curve 25519
(4) NIST P-384
Your selection? 1
The card will now be re-configured to generate a key of type: ed25519
Note: There is no guarantee that the card supports the requested size.
If the key generation does not succeed, please check the
documentation of your card to see what sizes are allowed.
Changing card key attribute for: Encryption key
Please select what kind of key you want:
(1) RSA
(2) ECC
Your selection? 2
Please select which elliptic curve you want:
(1) Curve 25519
(4) NIST P-384
Your selection? 1
The card will now be re-configured to generate a key of type: cv25519
Changing card key attribute for: Authentication key
Please select what kind of key you want:
(1) RSA
(2) ECC
Your selection? 2
Please select which elliptic curve you want:
(1) Curve 25519
(4) NIST P-384
Your selection? 1
The card will now be re-configured to generate a key of type: ed25519
gpg/card>
Reader ...........: 234B:0000:FSIJ-1.2.20-42315277:0
Application ID ...: D276000124010200FFFE423152770000
Application type .: OpenPGP
Version ..........: 2.0
Manufacturer .....: unmanaged S/N range
Serial number ....: 42315277
Name of cardholder: Simon Josefsson
Language prefs ...: sv
Salutation .......: Mr.
URL of public key : https://josefsson.org/key-20190320.txt
Login data .......: jas
Signature PIN ....: not forced
Key attributes ...: ed25519 cv25519 ed25519
Max. PIN lengths .: 127 127 127
PIN retry counter : 3 3 3
Signature counter : 0
KDF setting ......: on
Signature key ....: [none]
Encryption key....: [none]
Authentication key: [none]
General key info..: [none]
gpg/card>
The device is now ready for key import! Bring out your offline laptop, boot it, and use the keytocard command on the subkeys to import them. This assumes you saved a copy of the GnuPG home directory after generating the master and subkeys, which I did in my own previous tutorial when I generated the keys. This may be a bit unusual, and there are simpler ways to do this (e.g., import a copy of the secret keys into a fresh GnuPG home directory).
$ cp -a gnupghome-backup-mastersubkeys gnupghome-import-fst01sz-42315277-2022-12-24
$ ps auxww | grep -e pcsc -e scd
$ gpg --homedir $PWD/gnupghome-import-fst01sz-42315277-2022-12-24 --edit-key B1D2BD1375BECB784CF4F8C4D73CF638C53C06BE
...
Secret key is available.
gpg: checking the trustdb
gpg: marginals needed: 3 completes needed: 1 trust model: pgp
gpg: depth: 0 valid: 1 signed: 0 trust: 0-, 0q, 0n, 0m, 0f, 1u
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> key 1
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb* cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> keytocard
Please select where to store the key:
(2) Encryption key
Your selection? 2
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb* cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> key 1
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> key 2
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb* ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> keytocard
Please select where to store the key:
(3) Authentication key
Your selection? 3
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb* ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> key 2
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> key 3
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb* ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> keytocard
Please select where to store the key:
(1) Signature key
(3) Authentication key
Your selection? 1
sec ed25519/D73CF638C53C06BE
created: 2019-03-20 expired: 2019-10-22 usage: SC
trust: ultimate validity: expired
ssb cv25519/02923D7EE76EBD60
created: 2019-03-20 expired: 2019-10-22 usage: E
ssb ed25519/80260EE8A9B92B2B
created: 2019-03-20 expired: 2019-10-22 usage: A
ssb* ed25519/51722B08FE4745A2
created: 2019-03-20 expired: 2019-10-22 usage: S
[ expired] (1). Simon Josefsson <simon@josefsson.org>
gpg> quit
Save changes? (y/N) y
$
Now insert it into your daily laptop and have GnuPG learn about the new private keys and forget about any earlier locally available card bindings; the latter usually manifests itself by GnuPG asking you to insert an OpenPGP card with another serial number. Earlier I did rm -rf ~/.gnupg/private-keys-v1.d/ but the scd serialno followed by learn --force is nicer. I also set up the trust setting for my own key.
jas@kaka:~$ gpg-connect-agent "scd serialno" "learn --force" /bye
...
jas@kaka:~$ echo "B1D2BD1375BECB784CF4F8C4D73CF638C53C06BE:6:" | gpg --import-ownertrust
jas@kaka:~$ gpg --card-status
Reader ...........: 234B:0000:FSIJ-1.2.20-42315277:0
Application ID ...: D276000124010200FFFE423152770000
Application type .: OpenPGP
Version ..........: 2.0
Manufacturer .....: unmanaged S/N range
Serial number ....: 42315277
Name of cardholder: Simon Josefsson
Language prefs ...: sv
Salutation .......: Mr.
URL of public key : https://josefsson.org/key-20190320.txt
Login data .......: jas
Signature PIN ....: not forced
Key attributes ...: ed25519 cv25519 ed25519
Max. PIN lengths .: 127 127 127
PIN retry counter : 5 5 5
Signature counter : 3
KDF setting ......: on
Signature key ....: A3CC 9C87 0B9D 310A BAD4 CF2F 5172 2B08 FE47 45A2
created ....: 2019-03-20 23:40:49
Encryption key....: A9EC 8F4D 7F1E 50ED 3DEF 49A9 0292 3D7E E76E BD60
created ....: 2019-03-20 23:40:26
Authentication key: CA7E 3716 4342 DF31 33DF 3497 8026 0EE8 A9B9 2B2B
created ....: 2019-03-20 23:40:37
General key info..: sub ed25519/51722B08FE4745A2 2019-03-20 Simon Josefsson <simon@josefsson.org>
sec# ed25519/D73CF638C53C06BE created: 2019-03-20 expires: 2023-09-19
ssb> ed25519/80260EE8A9B92B2B created: 2019-03-20 expires: 2023-09-19
card-no: FFFE 42315277
ssb> ed25519/51722B08FE4745A2 created: 2019-03-20 expires: 2023-09-19
card-no: FFFE 42315277
ssb> cv25519/02923D7EE76EBD60 created: 2019-03-20 expires: 2023-09-19
card-no: FFFE 42315277
jas@kaka:~$
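On a completely fresh machine where the public key is not yet in the local keyring, the URL configured earlier comes into play: the fetch command inside gpg --card-edit downloads and imports it. A minimal sketch (not from my transcript; prompts and output omitted):

$ gpg --card-edit
gpg/card> fetch
gpg/card> quit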
Verify that you can digitally sign and authenticate using the key and you are done!
jas@kaka:~$ echo foo | gpg -a --sign | gpg --verify
gpg: Signature made Sat Dec 24 13:49:59 2022 CET
gpg: using EDDSA key A3CC9C870B9D310ABAD4CF2F51722B08FE4745A2
gpg: Good signature from "Simon Josefsson <simon@josefsson.org>" [ultimate]
jas@kaka:~$ ssh-add -L
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILzCFcHHrKzVSPDDarZPYqn89H5TPaxwcORgRg+4DagE cardno:FFFE42315277
jas@kaka:~$
So time to relax and celebrate Christmas? Hold on, not so fast! Astute readers will have noticed that the output said PIN retry counter: 5 5 5. That's not the default PIN retry counter for Gnuk! How did that happen? Indeed, good catch and great question, my dear reader. I wanted to include how you can modify the Gnuk source code, re-build it and re-flash it as well. This method is different from flashing Gnuk onto a device that is running NeuG, so the commands I used to flash the firmware at the start of this blog post no longer work on a device running Gnuk. Fortunately modern Gnuk supports updating firmware by specifying only the Admin PIN code, and provides a simple script to achieve this as well. The PIN retry counter setting is hard-coded in the openpgp-do.c file, and we run a perl command to modify the file, rebuild Gnuk and upgrade the FST-01SZ. This of course wipes all your settings, so you will have the opportunity to practice all the commands earlier in this post once again!
jas@kaka:~/src/gnuk/src$ perl -pi -e 's/PASSWORD_ERRORS_MAX 3/PASSWORD_ERRORS_MAX 5/' openpgp-do.c
jas@kaka:~/src/gnuk/src$ make | less
jas@kaka:~/src/gnuk/src$ cd ../tool/
jas@kaka:~/src/gnuk/tool$ ./upgrade_by_passwd.py
Admin password:
Device:
Configuration: 1
Interface: 0
../regnual/regnual.bin: 4608
../src/build/gnuk.bin: 110592
CRC32: b93ca829
Device:
Configuration: 1
Interface: 0
20002a00:20005000
Downloading flash upgrade program...
start 20002a00
end 20003c00
Run flash upgrade program...
Waiting for device to appear:
Wait 1 second...
Wait 1 second...
Device:
08001000:08020000
Downloading the program
start 08001000
end 0801b000
Protecting device
Finish flashing
Resetting device
Update procedure finished
jas@kaka:~/src/gnuk/tool$
Now finally, I wish you all a Merry Christmas and Happy Hacking!
An earlier article showed that
private key storage is an important problem to solve in any
cryptographic system and established keycards as a good way to store
private key material offline. But which keycard should we use? This
article examines the form factor, openness, and performance of four
keycards to try to help readers choose the one that will fit their
needs.
I have personally been using a YubiKey NEO, since a 2015
announcement
on GitHub promoting two-factor authentication. I was also able to hook
up my SSH authentication key into the YubiKey's 2048 bit RSA slot. It
seemed natural to move the other subkeys onto the keycard, provided that
performance was sufficient. The mail client that I use (Notmuch)
blocks when decrypting messages, which could be a serious problem on
large email threads from encrypted mailing lists.
So I built a test harness and got access to some more keycards: I bought
a FST-01 from its creator,
Yutaka Niibe, at the last DebConf and Nitrokey donated a Nitrokey
Pro. I also
bought a YubiKey 4
when I got the NEO. There are of course other keycards out there, but
those are the ones I could get my hands on. You'll notice none of those
keycards have a physical keypad to enter passwords, so they are all
vulnerable to keyloggers that could extract the key's PIN. Keep in mind,
however, that even with the PIN, an attacker could only ask the keycard
to decrypt or sign material but not extract the key that is protected by
the card's firmware.
Form factor
The four keycards have similar form factors: they all connect to a
standard USB port, although both YubiKey keycards have a capacitive
button by which the user triggers two-factor authentication and the
YubiKey 4 can also require a button
press
to confirm private key use. The YubiKeys feel sturdier than the other
two. The NEO has withstood two years of punishment in my pockets along
with the rest of my "real" keyring and there is only minimal wear on the
keycard in the picture. It's also thinner so it fits well on the
keyring.
The FST-01 stands out from the other two with its minimal design. Out of
the box, the FST-01 comes without a case, so the circuitry is exposed.
This is deliberate: one of its goals is to be as transparent as
possible, both in terms of software and hardware design and you
definitely get that feeling at the physical level. Unfortunately, that
does mean it feels more brittle than other models: I wouldn't carry it
in my pocket all the time, although there is a
case
that may protect the key a little better, but it does not provide an
easy way to hook it into a keyring. In the group picture above, the
FST-01 is the pink plastic thing, which is a rubbery casing I received
along with the device when I got it.
Notice how the USB connectors of the YubiKeys differ from the other two:
while the FST-01 and the Nitrokey have standard USB connectors, the
YubiKey has only a "half-connector", which is what makes it thinner than
the other two. The "Nano" form factor takes this even further and almost
disappears in the USB port. Unfortunately, this arrangement means the
YubiKey NEO often comes loose and falls out of the USB port, especially
when connected to a laptop. On my workstation, however, it usually stays
put even with my whole keyring hanging off of it. I suspect this adds
more strain to the host's USB port but that's a tradeoff I've lived with
without any noticeable wear so far. Finally, the NEO has this peculiar
feature of supporting NFC for certain operations, as LWN previously
covered, but I haven't used that
feature yet.
The Nitrokey Pro looks like a normal USB key, in contrast with the other
two devices. It does feel a little brittle when compared with the
YubiKey, although only time will tell how much of a beating it can take.
It has a small ring in the case so it is possible to carry it directly
on your keyring, but I would be worried the cap would come off
eventually. Nitrokey devices are also two times thicker than the Yubico
models which makes them less convenient to carry around on keyrings.
Open and closed designs
The FST-01 is as open as hardware comes, down to the PCB design
available as KiCad files in this Git
repository. The
software running on the card is the
Gnuk firmware that implements the
OpenPGP card protocol, but you can
also get it with firmware implementing a true random number generator
(TRNG) called
NeuG
(pronounced "noisy"); the device is
programmable through a
standard Serial Wire
Debug (SWD) port. The
Nitrokey Start model also runs the Gnuk firmware. However, the Nitrokey
website announces only ECC and RSA 2048-bit
support for the Start, while the FST-01 also supports RSA-4096.
Nitrokey's founder Jan Suhr, in a private email, explained that this is
because "Gnuk doesn't support RSA-3072 or larger at a reasonable speed".
Its devices (the Pro, Start, and HSM models) use a similar chip to the
FST-01: the STM32F103
microcontroller.
Nitrokey also publishes its hardware designs, on
GitHub, which shows the Pro is basically a
fork of the FST-01, according to the
ChangeLog.
I opened the case to confirm it was using the STM MCU, something I
should warn you against; I broke one of the pins holding it together
when opening it so now it's even more fragile. But at least, I was able
to confirm it was built using the STM32F103TBU6 MCU, like the FST-01.
But this is where the comparison ends: on the back side, we find a SIM
card reader that holds the OpenPGP
card that, in turn, holds
the private key material and does the cryptographic operations. So, in
effect, the Nitrokey Pro is really an evolution of the original OpenPGP
card readers.
Nitrokey confirmed the OpenPGP card featured in the Pro is the same as
the one shipped by
the Free Software Foundation Europe (FSFE): the
BasicCard built by ZeitControl. Those cards,
however, are covered by NDAs and the firmware is only partially open
source.
This makes the Nitrokey Pro less open than the FST-01, but that's an
inevitable tradeoff when choosing a design based on the OpenPGP cards,
which Suhr described to me as "pretty proprietary". There are other
keycards out there, however, for example the
SLJ52GDL150-150k
smartcard suggested by
Debian developer Yves-Alexis Perez, which he prefers as it is certified
by French and German authorities. In that blog post, he also said he was
experimenting with the GPL-licensed OpenPGP
applet implemented by the French
ANSSI.
But the YubiKey devices are even further away in the closed-design
direction. Both the hardware designs and firmware are proprietary. The
YubiKey NEO, for example, cannot be upgraded at all, even though it is
based on an open firmware. According to Yubico's
FAQ,
this is due to "best security practices": "There is a 'no upgrade'
policy for our devices since nothing, including malware, can write to
the firmware."
I find this decision questionable in a context where security updates
are often more important than trying to create a bulletproof design,
which may simply be impossible. And the YubiKey NEO did suffer from a
critical security issue
that allowed attackers to bypass the PIN protection on the card, which
raises the question of the actual protection of the private key material
on those cards. According to Niibe, "some OpenPGP cards store the
private key unencrypted. It is a common attitude for many smartcard
implementations", which was confirmed by Suhr: "the private key is
protected by hardware mechanisms which prevent its extraction and
misuse". He is referring to the use of tamper
resistance.
After that security issue, there was no other option for YubiKey NEO
users than to get a new keycard (for free, thankfully) from Yubico,
which also meant discarding the private key material on the key. For
OpenPGP keys, this may mean having to bootstrap the web of trust from
scratch if the keycard was responsible for the main certification key.
But at least the NEO is running free software based on the OpenPGP card
applet and the
source is still available on
GitHub. The YubiKey 4, on the
other hand, is now closed
source,
which was controversial when the new model was announced last year. It
led the main Linux Foundation system administrator, Konstantin
Ryabitsev, to withdraw his
endorsement
of Yubico products. In response, Yubico argued that this approach was
essential to the security of its
devices,
which are now based on "a secure chip, which has built-in
countermeasures to mitigate a long list of attacks". In particular, it
claims that:
A commercial-grade AVR or ARM controller is unfit to be used in a
security product. In most cases, these controllers are easy to attack,
from breaking in via a debug/JTAG/TAP port to probing memory contents.
Various forms of fault injection and side-channel analysis are
possible, sometimes allowing for a complete key recovery in a
shockingly short period of time.
While I understand those concerns, they eventually come down to the
trust you have in an organization. Not only do we have to trust Yubico,
but also hardware manufacturers and designs they have chosen. Every step
in the hidden supply chain is then trusted to make correct technical
decisions and not introduce any backdoors.
History, unfortunately, is not on Yubico's side: Snowden revealed the
example of RSA Security
accepting what renowned cryptographer Bruce Schneier described as a
"bribe"
from the NSA to weaken its ECC implementation, by using the presumably
backdoored Dual_EC_DRBG
algorithm. What makes Yubico or its suppliers so different from RSA
Security? Remember that RSA Security used to be an adamant opponent to
the degradation of encryption standards, campaigning against the
Clipper chip in the first
crypto wars.
Even if we trust the Yubico supply chain, how can we trust a closed
design using what basically amounts to security through obscurity?
Publicly auditable designs are an important tradition in cryptography,
and that principle shouldn't stop when software is frozen into silicon.
In fact, a critical vulnerability called
ROCA
disclosed recently affects closed "smartcards" like the
YubiKey 4
and allows full private key recovery from the public key if the key was
generated on a vulnerable keycard. When speaking with Ars
Technica,
the researchers outlined the importance of open designs and questioned
the reliability of certification:
Our work highlights the dangers of keeping the design secret and the
implementation closed-source, even if both are thoroughly analyzed and
certified by experts. The lack of public information causes a delay in
the discovery of flaws (and hinders the process of checking for them),
thereby increasing the number of already deployed and affected devices
at the time of detection.
This issue with open hardware designs seems to be a recurring topic of
conversation on the Gnuk mailing
list. For
example, there was a
discussion
in September 2017 regarding possible hardware vulnerabilities in the STM
MCU that would allow extraction of encrypted key material from the key.
Niibe referred to a
talk
presented at the WOOT 17
workshop, where Johannes Obermaier and Stefan Tatschner, from the
Fraunhofer Institute, demonstrated attacks against the STMF0 family
MCUs. It is still unclear if those attacks also apply to the older STMF1
design used in the FST-01, however. Furthermore, extracted private key
material is still protected by the user passphrase, but Gnuk uses a weak
key-derivation function, so brute-forcing attacks may be possible.
Fortunately, there is work in progress to
make GnuPG hash the passphrase before sending it to the keycard, which
should make such attacks harder if not completely pointless.
When asked about the Yubico claims in a private email, Niibe did
recognize that "it is true that there are more weak points in general
purpose implementations than special implementations". During the last
DebConf in Montreal, Niibe
explained:
If you don't trust me, you should not buy from me. Source code
availability is only a single factor: someone can maliciously replace
the firmware to enable advanced attacks.
Niibe recommends that you "build the firmware yourself", also saying the
design of the FST-01 uses normal hardware that "everyone can replicate".
Those advantages are hard to deny for a cryptographic system: using more
generic components makes it harder for hostile parties to mount targeted
attacks.
A counter-argument here is that it can be difficult for a regular user
to audit such designs, let alone physically build the device from
scratch but, in a mailing list discussion, Debian developer Ian Jackson
explained
that:
You don't need to be able to validate it personally. The thing spooks
most hate is discovery. Backdooring supposedly-free hardware is harder
(more costly) because it comes with greater risk of discovery.
To put it concretely: if they backdoor all of them, someone (not
necessarily you) might notice. (Backdooring only yours involves
messing with the shipping arrangements and so on, and supposes that
you specifically are of interest.)
Since, as far as we know, the STM microcontrollers are not
backdoored, I would tend to favor those devices over proprietary
ones, as such a backdoor would be more easily detectable than in a
closed design. Even though physical attacks may be possible against
those microcontrollers, in the end, if an attacker has physical access
to a keycard, I consider the key compromised, even if it has the best
chip on the market. In our email exchange, Niibe argued that "when a
token is lost, it is better to revoke keys, even if the token is
considered secure enough". So like any other device, physical compromise
of tokens may mean compromise of the key and should trigger
key-revocation procedures.
Algorithms and performance
To establish reliable performance results, I wrote a benchmark program
naively called crypto-bench
that could produce comparable results between the different keys. The
program takes each algorithm/keycard combination and runs 1000
decryptions of a 16-byte file (one AES-128 block) using GnuPG, after
priming it to get the password cached. I assume the overhead of GnuPG
calls to be negligible, as it should be the same across all tokens, so
comparisons are possible. AES encryption is constant across all tests as
it is always performed on the host and fast enough to be irrelevant in
the tests.
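In essence, the measurement boils down to something like the following shell loop (a hedged sketch, not the actual crypto-bench code; KEYID is a placeholder for the key under test):

$ head -c 16 /dev/urandom > block
$ gpg --recipient KEYID --encrypt block
$ gpg --decrypt block.gpg > /dev/null    # prime the passphrase cache
$ time (for i in $(seq 1000); do gpg --decrypt block.gpg > /dev/null; done)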
I used the following:
Intel(R) Core(TM) i3-6100U CPU @ 2.30GHz running Debian 9
("stretch"/stable amd64), using GnuPG 2.1.18-6 (from the stable
Debian package)
Nitrokey Pro 0.8 (latest firmware)
FST-01, running Gnuk version 1.2.5 (latest firmware)
I ran crypto-bench for each keycard, which resulted in the following:
Algorithm         Device         Mean time (s)
ECDH-Curve25519   CPU            0.036
ECDH-Curve25519   FST-01         0.135
RSA-2048          CPU            0.016
RSA-2048          YubiKey-4      0.162
RSA-2048          Nitrokey-Pro   0.610
RSA-2048          YubiKey-NEO    0.736
RSA-2048          FST-01         1.265
RSA-4096          CPU            0.043
RSA-4096          YubiKey-4      0.875
RSA-4096          Nitrokey-Pro   3.150
RSA-4096          FST-01         8.218
There we see the performance of the four keycards I tested, compared
with the same operations done without a keycard: the "CPU" device. That
provides the baseline time of GnuPG decrypting the file. The first
obvious observation is that using a keycard is slower: in the best
scenario (FST-01 + ECC) we see a four-fold slowdown, but in the worst
case (also FST-01, but RSA-4096), we see a catastrophic 200-fold
slowdown. When I
presented
the results on the Gnuk mailing list, GnuPG developer Werner Koch
confirmed those "numbers are as expected":
With a crypto chip RSA is much faster. By design the Gnuk can't be as
fast - it is just a simple MCU. However, using Curve25519 Gnuk is
really fast.
And yes, the FST-01 is really fast at doing ECC, but it's also the only
keycard that handles ECC in my tests; the Nitrokey Start and Nitrokey
HSM should support it as well, but I haven't been able to test those
devices. Also note that the YubiKey NEO doesn't support RSA-4096 at all,
so we can only compare RSA-2048 across keycards. We should note,
however, that ECC is slower than RSA on the CPU, which suggests the
Gnuk ECC implementation used by the FST-01 is exceptionally fast.
In
discussions
about improving the performance of the FST-01, Niibe estimated the user
tolerance threshold to be "2 seconds decryption time". In a new
design
using the STM32L432 microcontroller, Aurelien Jarno was able to bring
the numbers for RSA-2048 decryption from 1.27s down to 0.65s, and for
RSA-4096, from 8.22s down to 3.87s. RSA-4096 is still beyond the
two-second threshold, but at least it brings the FST-01 close to the
YubiKey NEO and Nitrokey Pro performance levels.
We should also underline the superior performance of the YubiKey 4:
whatever that thing is doing, it's doing it faster than anyone else. It
does RSA-4096 faster than the FST-01 does RSA-2048, and almost as fast
as the Nitrokey Pro does RSA-2048. We should also note that the Nitrokey
Pro also fails to cross the two-second threshold for RSA-4096
decryption.
For me, the FST-01's stellar performance with ECC outshines the other
devices. Maybe it says more about the efficiency of the algorithm than
the FST-01 or Gnuk's design, but it's definitely an interesting avenue
for people who want to deploy those modern algorithms. So, in terms of
performance, it is clear that both the YubiKey 4 and the FST-01 take the
prize in their own areas (RSA and ECC, respectively).
Conclusion
In the above presentation, I have evaluated four cryptographic keycards
for use with various OpenPGP operations. What the results show is that
the only efficient way of storing a 4096-bit encryption key on a keycard
would be to use the YubiKey 4. Unfortunately, I do not feel we should
put our trust in such closed designs so I would argue you should either
stick with 2048-bit encryption subkeys or keep the keys on disk.
Considering that losing such a key would be catastrophic, this might be
a good approach anyway. You should also consider switching to ECC
encryption: even though it may not be supported everywhere, GnuPG
supports having multiple encryption subkeys on a keyring; if one
algorithm is unsupported (e.g. GnuPG 1.4 doesn't support ECC), it will
fall back to a supported algorithm (e.g. RSA). Do not forget your
previously encrypted material doesn't magically re-encrypt itself using
your new encryption subkey, however.
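Adding such an extra ECC encryption subkey next to an existing RSA one is a single command in GnuPG 2.1 and later (a sketch; the fingerprint and expiry are placeholders for your own values):

$ gpg --quick-add-key <your-key-fingerprint> cv25519 encr 1y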
For authentication and signing keys, speed is not such an issue, so I
would warmly recommend either the Nitrokey Pro or Start, or the FST-01,
depending on whether you want to start experimenting with ECC
algorithms. Availability also seems to be an issue for the FST-01. While
you can generally get the device when you meet Niibe in person for a few
bucks (I bought mine for around $30 Canadian), the Seeed online
shop says the device is out of
stock
at the time of this writing, even though Jonathan McDowell
said
that may be inaccurate in a debian-project discussion. Nevertheless,
this issue may make the Nitrokey devices more attractive. When deciding
on using the Pro or Start, Suhr offered the following advice:
In practice smart card security has been proven to work well (at least
if you use a decent smart card). Therefore the Nitrokey Pro should be
used for high security cases. If you don't trust the smart card or if
Nitrokey Start is just sufficient for you, you can choose that one.
This is why we offer both models.
So far, I have created a signing subkey and moved that and my
authentication key to the YubiKey NEO, because it's a device I
physically trust to keep itself together in my pockets and I was already
using it. It has served me well so far, especially with its extra
features like U2F and
HOTP
support, which I use frequently. Those features are also available on
the Nitrokey Pro, so that may be an alternative if I lose the YubiKey. I
will probably move my main certification key to the FST-01 and a
LUKS-encrypted USB disk, to keep that certification key offline but
backed up on two different devices. As for the encryption key, I'll wait
for keycard performance to improve, or simply switch my whole keyring to
ECC and use the FST-01 or Nitrokey Start for that purpose.
[The author would like to thank Nitrokey for providing hardware for
testing.]
This article first appeared in the Linux Weekly News.
While the adoption of OpenPGP by the general
population is marginal at best, it is a critical component for the
security community and particularly for Linux distributions. For
example, every package uploaded into Debian is verified by the central
repository using the maintainer's OpenPGP keys and the repository itself
is, in turn, signed using a separate key. If upstream packages also use
such signatures, this creates a complete trust path from the original
upstream developer to users. Beyond that, pull requests for the Linux
kernel are verified using signatures as well. Therefore, the stakes are
high: a compromise of the release key, or even of a single maintainer's
key, could enable devastating attacks against many machines.
That has led the Debian community to develop a good grasp of best
practices for cryptographic signatures (which are typically handled
using GNU Privacy Guard, also known as GnuPG or
GPG). For example, weak (less than 2048 bits) and
vulnerable PGPv3 keys were
removed from
the keyring in 2015, and there is a strong culture of cross-signing keys
between Debian members at in-person meetings. Yet even Debian developers
(DDs) do not seem to have established practices on how to actually store
critical private key material, as we can see in this
discussion
on the debian-project mailing list. That email boiled down to a simple
request: can I have a "key dongles for dummies" tutorial? Key dongles,
or keycards as we'll call them here, are small devices that allow users
to store keys on an offline device and provide one possible solution for
protecting private key material. In this article, I hope to use my
experience in this domain to clarify the issue of how to store those
precious private keys that, if compromised, could enable arbitrary code
execution on millions of machines all over the world.
Why store keys offline?
Before we go into details about storing keys offline, it may be useful
to do a small reminder of how the OpenPGP
standard works. OpenPGP keys are
made of a main public/private key pair, the certification key, used to
sign user identifiers and subkeys. My public key, shown below, has the
usual main certification/signature key (marked SC) but also an
encryption subkey (marked E), a separate signature key (S), and two
authentication keys (marked A) which I use as RSA keys to log into
servers using SSH, thanks to the
Monkeysphere project.
pub rsa4096/792152527B75921E 2009-05-29 [SC] [expires: 2018-04-19]
8DC901CE64146C048AD50FBB792152527B75921E
uid [ultimate] Antoine Beaupré <anarcat@anarc.at>
uid [ultimate] Antoine Beaupré <anarcat@koumbit.org>
uid [ultimate] Antoine Beaupré <anarcat@orangeseeds.org>
uid [ultimate] Antoine Beaupré <anarcat@debian.org>
sub rsa2048/B7F648FED2DF2587 2012-07-18 [A]
sub rsa2048/604E4B3EEE02855A 2012-07-20 [A]
sub rsa4096/A51D5B109C5A5581 2009-05-29 [E]
sub rsa2048/3EA1DDDDB261D97B 2017-08-23 [S]
All the subkeys (sub) and identities (uid) are bound by the main
certification key using cryptographic self-signatures. So while an
attacker stealing a private subkey can spoof signatures in my name or
authenticate to other servers, that key can always be revoked by the
main certification key. But if the certification key gets stolen, all
bets are off: the attacker can create or revoke identities or subkeys as
they wish. In a catastrophic scenario, an attacker could even steal the
key and remove your copies, taking complete control of the key, without
any possibility of recovery. Incidentally, this is why it is so
important to generate a revocation certificate and store it offline.
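Generating such a certificate is a one-liner (a sketch; KEYID is a
placeholder for your own key fingerprint):

$ gpg --output revocation-certificate.asc --gen-revoke KEYID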
So by moving the certification key offline, we reduce the attack surface
on the OpenPGP trust chain: day-to-day keys (e.g. email encryption or
signature) can stay online but if they get stolen, the certification key
can revoke those keys without having to revoke the main certification
key as well. Note that a stolen encryption key is a different problem:
even if we revoke the encryption subkey, this will only affect future
encrypted messages. Previous messages will be readable by the attacker
with the stolen subkey even if that subkey gets revoked, so the benefits
of revoking encryption certificates are more limited.
Common strategies for offline key storage
Considering the security tradeoffs, some propose storing those critical
keys offline to reduce those threats. But where exactly? In an attempt
to answer that question, Jonathan McDowell, a member of the Debian
keyring maintenance team,
said that there are three
options:
use an external LUKS-encrypted volume, an air-gapped system, or a
keycard.
Full-disk encryption like LUKS adds an extra layer of security by hiding
the content of the key from an attacker. Even though private keyrings
are usually protected by a passphrase, they are easily identifiable as a
keyring. But when a volume is fully encrypted, it's not immediately
obvious to an attacker there is private key material on the device.
According
to Sean Whitton, another advantage of LUKS over plain GnuPG keyring
encryption is that you can pass the --iter-time argument when creating
a LUKS partition to increase key-derivation delay, which makes
brute-forcing much harder. Indeed, GnuPG 2.x doesn't
have a run-time option to configure the
key-derivation algorithm, although a
patch was introduced recently to make the
delay configurable at compile time in gpg-agent, which is now
responsible for all secret key operations.
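As a rough sketch of what that looks like (the device name is a
placeholder; the delay is given in milliseconds, here five seconds):

$ cryptsetup luksFormat --iter-time 5000 /dev/sdX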
The downside of external volumes is complexity: GnuPG makes it difficult
to extract secrets out of its keyring, which makes the first setup
tricky and error-prone. This is easier in the 2.x series thanks to the
new storage system and the associated keygrip files, but it still
requires arcane knowledge of GPG internals. It is also inconvenient to
use secret keys stored outside your main keyring when you actually do
need to use them, as GPG doesn't know where to find those keys anymore.
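To give an idea, locating the keygrip files that actually hold the
secret material in GnuPG 2.x looks roughly like this (a sketch):

$ gpg --list-secret-keys --with-keygrip
$ ls ~/.gnupg/private-keys-v1.d/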
Another option is to set up a separate air-gapped system to perform
certification operations. An example is the PGP clean
room project,
which is a live system based on Debian and designed by DD Daniel Pocock
to operate an OpenPGP and X.509 certificate authority using commodity
hardware. The basic principle is to store the secrets on a different
machine that is never connected to the network and, therefore, not
exposed to attacks, at least in theory. I have personally discarded that
approach because I feel air-gapped systems provide a false sense of
security: data eventually does need to come in and out of the system,
somehow, even if only to propagate signatures out of the system, which
exposes the system to attacks.
System updates are similarly problematic: to keep the system secure,
timely security updates need to be deployed to the air-gapped system. A
common use pattern is to share data through USB keys, which introduce a
vulnerability where attacks like
BadUSB can infect the air-gapped
system. From there, there is a multitude of exotic ways of exfiltrating
the data using
LEDs,
infrared
cameras,
or the good old
TEMPEST
attack. I therefore concluded the complexity tradeoffs of an air-gapped
system are not worth it. Furthermore, the workflow for air-gapped
systems is complex: even though PGP clean room has come a long way, it's
still lacking even simple scripts that allow signing or transferring
keys, which is a problem shared by the external LUKS storage approach.
Keycard advantages
The approach I have chosen is to use a cryptographic keycard: an
external device, usually connected through the USB port, that stores the
private key material and performs critical cryptographic operations on
behalf of the host. For example, the FST-01
keycard can perform RSA and
ECC public-key decryption without ever exposing the private key material
to the host. In effect, a keycard is a miniature computer that performs
restricted computations for another host. Keycards usually support
multiple "slots" to store subkeys. The OpenPGP standard specifies there
are three subkeys available by default: for signature, authentication,
and encryption. Finally, keycards can have an actual physical keypad to
enter passwords so a potential keylogger cannot capture them, although
the keycards I have access to do not feature such a keypad.
We could easily draw a parallel between keycards and an air-gapped
system; in effect, a keycard is a miniaturized air-gapped computer and
suffers from similar problems. An attacker can intercept data on the
host system and attack the device in the same way, if not more easily,
because a keycard is actually "online" (i.e. clearly not air-gapped)
when connected. The advantage over a fully-fledged air-gapped computer,
however, is that the keycard implements only a restricted set of
operations. So it is easier to create an open hardware and software
design that is audited and verified, which is much harder to accomplish
for a general-purpose computer.
Like air-gapped systems, keycards address the scenario where an attacker
wants to get the private key material. While an attacker could fool the
keycard into signing or decrypting some data, this is possible only
while the keycard is physically connected, and the keycard software will
prompt the user for a password before doing the operation, though the
keycard can cache the password for some time. In effect, it thwarts
offline attacks: to brute-force the key's password, the attacker needs
to be on the target system and try to guess the keycard's password,
which will lock itself after a limited number of tries. It also provides
for a clean and standard interface to store keys offline: a single GnuPG
command moves private key material to a keycard (the keytocard command
in the --edit-key interface), whereas moving private key material to a
LUKS-encrypted device or air-gapped computer is more complex.
Keycards are also useful if you operate on multiple computers. A common
problem when using GnuPG on multiple machines is how to safely copy and
synchronize private key material among different devices, which
introduces new security problems. Indeed, a "good rule of thumb in a
forensics lab",
according
to Robert J. Hansen on the GnuPG mailing list, is to "store the minimum
personal data possible on your systems". Keycards provide the best of
both worlds here: you can use your private key on multiple computers
without actually storing it in multiple places. In fact, Mike Gerwitz
went as far as
saying:
For users that need their GPG key on multiple boxes, I consider a
smartcard to be essential. Otherwise, the user is just furthering her
risk of compromise.
Keycard tradeoffs
As Gerwitz hinted, there are multiple downsides to using a keycard,
however. Another DD, Wouter Verhelst clearly
expressed
the tradeoffs:
Smartcards are useful. They ensure that the private half of your key
is never on any hard disk or other general storage device, and
therefore that it cannot possibly be stolen (because there's only one
possible copy of it).
Smartcards are a pain in the ass. They ensure that the private half of
your key is never on any hard disk or other general storage device but
instead sits in your wallet, so whenever you need to access it, you
need to grab your wallet to be able to do so, which takes more effort
than just firing up GnuPG. If your laptop doesn't have a builtin
cardreader, you also need to fish the reader from your backpack or
wherever, etc.
"Smartcards" here refer to older OpenPGP
cards that relied on the
ISO/IEC 7816 smartcard
connectors and therefore
needed a specially-built smartcard reader. Newer keycards simply use a
standard USB connector. In any case, it's true that having an external
device introduces new issues: attackers can steal your keycard, you can
simply lose it, or wash it with your dirty laundry. A laptop or a
computer can also be lost, of course, but it is much easier to lose a
small USB keycard than a full laptop and I have yet to hear of someone
shoving a full laptop into a washing machine. When you lose your
keycard, unless a separate revocation certificate is available
somewhere, you lose complete control of the key, which is catastrophic.
But, even if you revoke the lost key, you need to create a new one,
which involves rebuilding the web of trust for the key, a rather
expensive operation as it usually requires meeting other OpenPGP users
in person to exchange fingerprints.
You should therefore think about how to back up the certification key,
which is a problem that already exists for online keys; of course,
everyone has a revocation certificate and backups of their OpenPGP
keys... right? In the keycard scenario, backups may be multiple keycards
distributed geographically.
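As a reminder, generating such a revocation certificate ahead of time is a
single command; the output filename is just an example, $KEYID stands for
your own certification key's fingerprint, and the resulting file should be
stored offline, away from the key itself:
gpg --output revoke-key.asc --gen-revoke $KEYID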
Note that, contrary to an air-gapped system, a key generated on a
keycard cannot be backed up, by design. For subkeys, this is not a
problem as they do not need to be backed up (except encryption keys).
But, for a certification key, this means users need to generate the key
on the host and transfer it to the keycard, which means the host is
expected to have enough entropy to generate cryptographic-strength
random numbers, for example. Also consider the possibility of combining
different approaches: you could, for example, use a keycard for
day-to-day operation, but keep a backup of the certification key on a
LUKS-encrypted offline volume.
Keycards introduce a new element into the trust chain: you need to trust
the keycard manufacturer not to have put any hostile code in the card's
firmware or hardware. In addition, you need to trust that the
implementation is correct. Keycards are harder to update: the firmware
may be deliberately inaccessible to the host for security reasons or may
require special software to manipulate. Keycards may be slower than the
CPU in performing certain operations because they are small embedded
microcontrollers with limited computing power.
Finally, keycards may encourage users to trust multiple machines with
their secrets, which works against the "minimum personal data"
principle. A completely different approach called the trusted physical
console
(TPC) does the opposite: instead of trying to get private key material
onto all of those machines, just keep it on a single machine that is
used for everything. Unlike a keycard, the TPC is an actual computer,
say a laptop, which has the advantage of needing no special procedure to
manage keys. The downside is, of course, that you actually need to carry
that laptop everywhere you go, which may be problematic, especially in
some corporate environments that restrict bringing your own devices.
Quick keycard "howto"
Getting keys onto a keycard is easy enough (a rough sketch of a complete
session follows these steps):
Use the key command to select the first subkey, then copy it to
the keycard (you can also use the addcardkey command to just
generate a new subkey directly on the keycard):
gpg> key 1
gpg> keytocard
If you want to move the subkey, use the save command, which will
remove the local copy of the private key, so the keycard will be the
only copy of the secret key. Otherwise use the quit command to
save the key on the keycard, but keep the secret key in your normal
keyring; answer "n" to "save changes?" and "y" to "quit without
saving?". This way the keycard is a backup of your secret key.
Once you are satisfied with the results, repeat steps 1 through 4
with your normal keyring (unset $GNUPGHOME).
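Putting those steps together, a complete session might look roughly like
the following sketch; it assumes an exported backup of the secret key in
key-backup.gpg (an assumed filename), uses $KEYID as a placeholder for
your key's fingerprint, and works in a throwaway GNUPGHOME so the normal
keyring stays untouched:
export GNUPGHOME=$(mktemp -d)
gpg --import key-backup.gpg
gpg --edit-key $KEYID
# at the gpg> prompt:
#   key 1        select the first subkey
#   keytocard    copy it to the card
#   quit         answer "n" to "save changes?" and "y" to "quit without saving?"
rm -rf "$GNUPGHOME" && unset GNUPGHOME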
When a key is moved to a keycard, --list-secret-keys will show it as
sec> (or ssb> for subkeys) instead of the usual sec keyword. If
the key is completely missing (for example, if you moved it to a LUKS
container), the # sign is used instead. If you need to use a key from
a keycard backup, you simply do gpg --card-edit with the card plugged
in, then type the fetch command at the prompt to fetch the public key
that corresponds to the private key on the keycard (which stays on the
keycard). This is the same procedure as the one to use the secret key
on another
computer.
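Concretely, that recovery is short; this sketch assumes the card's "URL of
public key" field was set to point at your public key (otherwise, import
the public key by hand first):
gpg --card-edit
# at the gpg/card> prompt:
#   fetch    retrieve the public key from the URL stored on the card
#   quit
gpg --list-secret-keys    # the card-backed keys now show up as sec>/ssb> stubs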
Conclusion
There are already informal OpenPGP best-practices
guides
out there and some recommend storing keys offline, but they rarely
explain what exactly that means. Storing your primary secret key offline
is important in dealing with possible compromises, and we examined the
main ways of doing so: an air-gapped system, a LUKS-encrypted
keyring, or a keycard. Each approach has its own tradeoffs, but
I recommend getting familiar with keycards if you use multiple computers
and want a standardized interface with minimal configuration trouble.
And of course, those approaches can be combined. This
tutorial,
for example, uses a keycard on an air-gapped computer, which neatly
resolves the question of how to transmit signatures between the
air-gapped system and the world. It is definitely not for the faint of
heart, however.
Once one has decided to use a keycard, the next order of business is to
choose a specific device. That choice will be addressed in a followup
article, where I will look at performance, physical design, and other
considerations.
Last weekend, as a result of my addiction to buying random microcontrollers to play with, I received some Maple Minis. I bought the Baite clone direct from AliExpress - so just under £3 each including delivery. Not bad for something that's USB capable, is based on an ARM and has plenty of IO pins.
I'm not entirely sure what my plan is for the devices, but as a first step I thought I'd look at getting Gnuk up and running on it. Only to discover that chopstx already has support for the Maple Mini and it was just a matter of doing a ./configure --vidpid=234b:0000 --target=MAPLE_MINI --enable-factory-reset ; make. I'd hoped to install via the DFU bootloader already on the Mini but ended up making it unhappy, so I used SWD by following the same steps with OpenOCD as for the FST-01/BusPirate. (SWCLK is D21 and SWDIO is D22 on the Mini). Reset after flashing and the device is detected just fine:
usb 1-1.1: new full-speed USB device number 73 using xhci_hcd
usb 1-1.1: New USB device found, idVendor=234b, idProduct=0000
usb 1-1.1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
usb 1-1.1: Product: Gnuk Token
usb 1-1.1: Manufacturer: Free Software Initiative of Japan
usb 1-1.1: SerialNumber: FSIJ-1.2.3-87155426
And GPG is happy:
$ gpg --card-status
Reader ...........: 234B:0000:FSIJ-1.2.3-87155426:0
Application ID ...: D276000124010200FFFE871554260000
Version ..........: 2.0
Manufacturer .....: unmanaged S/N range
Serial number ....: 87155426
Name of cardholder: [not set]
Language prefs ...: [not set]
Sex ..............: unspecified
URL of public key : [not set]
Login data .......: [not set]
Signature PIN ....: forced
Key attributes ...: rsa2048 rsa2048 rsa2048
Max. PIN lengths .: 127 127 127
PIN retry counter : 3 3 3
Signature counter : 0
Signature key ....: [none]
Encryption key....: [none]
Authentication key: [none]
General key info..: [none]
While Gnuk isn't the fastest OpenPGP smart card implementation, this certainly seems to be one of the cheapest ways to get it up and running. (Plus the fact that chopstx already runs on the Mini provides me with a useful basis for other experimentation.)
Just before I went to DebConf15 I got around to setting up my gnuk with the latest build (1.1.7), which supports 4K RSA keys. As a result I decided to generate a new certification only primary key, using a live CD on a non-networked host and ensuring the raw key was only ever used in this configuration. The intention is that in general I will use the key via the gnuk, ensuring no danger of leaking the key material.
I took part in various key signings at DebConf and the subsequent UK Debian BBQ, and finally today got round to dealing with the key slips I had accumulated. I'm sure I've missed some people off my signing list, but at least now the key should be embedded into the strong set of keys. Feel free to poke me next time you see me if you didn't get mail from me with fresh signatures and you think you should have.
Key details are:
I have no reason to assume my old key (0x94FA372B2DA8B985) has been compromised and for now I continue to use that key. Also, for the new key I have not generated any subkeys as yet, which caff handles OK but emits a warning about unencrypted mail. Thanks to those of you who sent me signatures despite this.
[Update: I was asked about my setup for the key generation, in particular how I ensured enough entropy, given that it was a fresh boot and, without networking, there were limited entropy sources available to the machine. I made the decision that the machine's TPM and the use of tpm-rng and rng-tools was sufficient (i.e. I didn't worry overly about the TPM being compromised for the purposes of feeding additional information into the random pool). Alternative options would have been flashing the gnuk with the NeuG firmware or using my Entropy Key.]
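For those curious, a rough sketch of that kind of setup on Debian follows;
the module name, configuration path, and variable are from memory and may
differ between releases:
modprobe tpm_rng                                   # expose the TPM's RNG as /dev/hwrng
echo 'HRNGDEVICE=/dev/hwrng' >> /etc/default/rng-tools
service rng-tools restart                          # rngd then feeds /dev/hwrng into the kernel pool
cat /proc/sys/kernel/random/entropy_avail          # sanity-check the kernel's entropy estimate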
Last year at DebConf14 Lucas authorized the purchase of a handful of gnuk devices, one of which I obtained. At the time it only supported 2048 bit RSA keys. I took a look at what might be involved in adding 4096 bit support during DebConf and managed to brick my device several times in doing so. Thankfully gniibe was on hand with his STLinkV2 to help me recover. However subsequently I was loath to experiment further at home until I had a suitable programmer.
As it is, this year has been busy and the 1.1.x release train is supposed to have 4K RSA (as well as ECC) support. DebConf15 is coming up and I felt I should finally sort out playing with the device properly. I still didn't have a suitable programmer. Or did I? Could my trusty Bus Pirate help?
The FST-01 has an STM32F103TB on it. There is an exposed SWD port. I found a few projects that claimed to do SWD with a Bus Pirate - Will Donnelly has a much-cloned Python project, the MC HCK project have a programmer in Ruby and there's LibSWD, though that's targeted to smarter programmers. None of them worked for me; I could get the Python bits as far as correctly doing the ID of the device, but not reading the option bytes or successfully flashing (though I did manage an erase).
Enter the old favourite, OpenOCD. This already has SWD support and there's an outstanding commit request to add Bus Pirate support. NodoNogard has a post on using the ST-Link/V2 with OpenOCD and the FST-01 which provided some useful pointers. I grabbed the patch from Gerrit, applied it to OpenOCD git and built an openocd.cfg that contained:
# use the Bus Pirate on /dev/ttyUSB0 as the debug adapter and enable its
# voltage regulator so it can power the target
source [find interface/buspirate.cfg]
buspirate_port /dev/ttyUSB0
buspirate_vreg 1
buspirate_mode normal
# talk to the STM32F103 over SWD rather than JTAG
transport select swd
source [find target/stm32f1x.cfg]
My BP has the Seeed Studio probe cable, so my hookups look like this:
That's BP MOSI (grey) to SWD IO, BP CLK (purple) to SWD CLK, BP 3.3V (red) to FST-01 PWR and BP GND (brown) to FST-01 GND. Once that was done I fired up OpenOCD in one terminal and did the following in another:
$ telnet localhost 4444
Trying ::1...
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Open On-Chip Debugger
> reset halt
target state: halted
target halted due to debug-request, current mode: Thread
xPSR: 0x01000000 pc: 0xfffffffe msp: 0xfffffffc
Info : device id = 0x20036410
Info : SWD IDCODE 0x1ba01477
Error: Failed to read memory at 0x1ffff7e2
Warn : STM32 flash size failed, probe inaccurate - assuming 128k flash
Info : flash size = 128kbytes
> stm32f1x unlock 0
Device Security Bit Set
stm32x unlocked.
INFO: a reset or power cycle is required for the new settings to take effect.
> reset halt
target state: halted
target halted due to debug-request, current mode: Thread
xPSR: 0x01000000 pc: 0xfffffffe msp: 0xfffffffc
> flash write_image erase /home/noodles/checkouts/gnuk/src/build/gnuk.elf
auto erase enabled
wrote 109568 bytes from file /home/noodles/checkouts/gnuk/src/build/gnuk.elf in 95.055603s (1.126 KiB/s)
> stm32f1x lock 0
stm32x locked
> reset halt
target state: halted
target halted due to debug-request, current mode: Thread
xPSR: 0x01000000 pc: 0x08000280 msp: 0x20005000
Then it was a matter of disconnecting the gnuk from the BP, plugging it into my USB port and seeing it come up successfully:
usb 1-2: new full-speed USB device number 11 using xhci_hcd
usb 1-2: New USB device found, idVendor=234b, idProduct=0000
usb 1-2: New USB device strings: Mfr=1, Product=2, SerialNumber=3
usb 1-2: Product: Gnuk Token
usb 1-2: Manufacturer: Free Software Initiative of Japan
usb 1-2: SerialNumber: FSIJ-1.1.7-87063020
usb 1-2: ep 0x82 - rounding interval to 1024 microframes, ep desc says 2040 microframes
Last weekend, I (knok), Hideki (henrich) and Yutaka (gniibe) met with John Paul Adrian Glaubitz (glaubitz).
In the past, I had met another German developer, Jens Schmalzing (jensen), in Japan. He was a good guy, but unfortunately he passed away in 2005.
I have an old OpenPGP key carrying his signature. It is a record of his activity, but the key is weak nowadays (1024D), so I have stopped using it, though I have not issued a revocation.
Anyway, glaubitz is also a good guy, and he loves old videogame consoles. gniibe gave him five DreamCast consoles. I took him to SUPER POTATO, an old videogame shop. He bought some software for the Virtual Boy.
DebConf 2015 will be held in Germany; I want to go if I can.
Andrew, I bought those wootoff lights as well, and have them connected to a hub on my mythtv system so I can activate them with a remote. I use the hub-ctrl.c utility from this page with this simple wrapper script that searches for the hub:
#!/bin/sh
# locate the TUSB2046-based hub (the one with per-port power switching)
# by parsing lsusb output; lines look like:
#   Bus 001 Device 004: ID xxxx:xxxx Texas Instruments, Inc. TUSB2046 Hub
bus=$(lsusb | grep TUSB2046 | cut -d' ' -f2)
dev=$(lsusb | grep TUSB2046 | cut -d' ' -f4 | sed 's/:$//')
port=4
hubctrl=/home/dannf/hub-ctrl
# check whether the port currently has power, then toggle it
if $hubctrl -b "$bus" -d "$dev" -v | grep "Port ${port}:" | grep -q power; then
  toggle=0
else
  toggle=1
fi
$hubctrl -b "$bus" -d "$dev" -P "$port" -p "$toggle"
Note that not all hubs implement the port power feature - but luckily I had an unused one lying around that does.
Unfortunately, one of my lights won't spin unless the physical power switch on the light is toggled - hopefully that's not true for yours.
I've had a few enquiries from my blog post
about trying to control dumb USB-powered lights.
I thought I'd just write something up to save myself replying to any more
emails.
Yes, it's doable. Finding a USB hub that will do it is another story. From
my own research, I found someone else who was doing something with
USB-powered devices (I can't remember what now), and he had been using
what is now a Linksys
ProConnect USB 4-Port Hub (USB 2.0).
I'd first tried a couple of random cheap hubs from Fry's with no success
(fortunately I was able to return them) before I determined that the Linksys
one would definitely work. The downside of the Linksys hub is it requires
external power. It was also one of the more expensive USB hubs on the
market.
One of my co-workers, who is an Electrical Engineer by education, said that
the USB spec requires the functionality that I wanted, but most chip
manufacturers had cut a corner in the interests of cost saving. The Linksys
hub uses an NEC chipset. Every other hub that I could get my hands on had a
Genesys Logic chipset, and did not work. You can tell if you've got a winner
by the output of lsusb -v. If the hub characteristics include
"Per-port power switching", you're in business.
To do the actual port powering on and off, I'm using a setuid-root hub-ctrl,
wrapped with a small shell script, which has the USB ID of the hub and the
port number the lights are plugged into hard-coded in it.
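The underlying calls in such a wrapper boil down to something like this;
the bus, device, and port numbers are placeholders for whatever lsusb
reports on your system:
hub-ctrl -b 1 -d 4 -P 2 -p 0    # cut power to port 2 (lights off)
hub-ctrl -b 1 -d 4 -P 2 -p 1    # restore power (lights on)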
In my searching, I also found that it may
be possible to do this with Python, but I did not invest the time trying to
find out.
Continuing with my mission to
control the power to some dumb USB-powered lights...
Dann Frazier confirmed my theory
that a USB hub would indeed do the job. I'd already found the hub-ctrl.c
program he mentioned, but couldn't get it to work with my built-in USB
ports of my laptop.
It seems it all depends on whether the hub will support per-port power
switching or not. (lsusb -v will tell you).
So off to the mighty institution that is Fry's Electronics I went.
The first two attempts failed, as did a borrowed hub. It seems most (at
least the cheap ones) use a Genesys Logic chipset, which does not support
per-port power control. Fortunately I chose products that weren't in blister
packs, so I was able to return them to Fry's in as-new condition and try
again.
Now that I knew USB hubs would do the trick, I did some more targeted
searching, and found this thread where someone had been
messing around with hubs to do what sounded like what I wanted.
(Incidentally, this
email in the thread also provided a nice looking Python program. I'm
going to look at refitting it to use the "real" Python USB module.)
I emailed the poster to ask him what brand of hub he was using. The answer
was the Linksys 4-port
hub.
So I managed to track down one of them last night. Yes, it works, but the
downside is the hub itself requires external power, which is a bit
unfortunate. Not only do I need to use a hub to make these lights software
controllable, I have to plug the hub into a power outlet. Bleh.