PSA Initial Attestation SW Architecture
The top-level public API is at the secure/non-secure boundary. It allows callers on the non-
secure side to get attestation tokens generated from the secure side.
The implementation is layered as follows:
- Main implementation that provides the public API. It calls the following:
  - Various system functions to get the data for the various claims
  - QCBOR to encode the claims in CBOR
  - The EAT/CWT library to make a signed token out of the claims. It calls the following:
    - QCBOR to set up for CBOR encoding
    - The COSE Sign 1 implementation. It calls the following:
      - The QCBOR library to format the COSE signature
      - ECDSA signing
      - Hash functions
The rest of this section describes details of the layers from bottom to top.
UsefulBuf
UsefulBuf is a set of functions for copying, comparing and generally manipulating binary blobs
of data. QCBOR is fundamentally built on it and uses it in its public interface. It is also
used by some of the other SW layers.
A UsefulBuf is a simple structure with just a pointer and a length. This structure is passed
as a parameter and returned as a value, as C99 allows.
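The idea can be sketched as follows. This is an illustrative minimal version, not the actual UsefulBuf API; the struct and function names here are made up for the example.

```c
#include <stddef.h>
#include <string.h>

/* Illustrative sketch of the UsefulBuf idea: a pointer plus a length,
 * passed and returned by value as C99 allows. Not the real API. */
struct useful_buf {
    const void *ptr;
    size_t      len;
};

/* Compare two buffers; the pointer handling is concentrated here
 * rather than being repeated at every call site. Returns 0 on match. */
static int useful_buf_compare(struct useful_buf a, struct useful_buf b)
{
    if (a.len != b.len) {
        return 1;
    }
    return memcmp(a.ptr, b.ptr, a.len) == 0 ? 0 : 1;
}

/* Return a sub-buffer by value instead of doing pointer arithmetic
 * at the call site. Caller must ensure offset <= b.len. */
static struct useful_buf useful_buf_tail(struct useful_buf b, size_t offset)
{
    struct useful_buf t = { (const char *)b.ptr + offset, b.len - offset };
    return t;
}
```

Passing the structure by value keeps call sites free of separate pointer and length variables that can fall out of sync.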
The point of the library is to simplify code and make it cleaner by having a way
to carry around a pointer and length for binary values. It also makes the code
safer and easier to review by concentrating all pointer manipulation in only a few places.
There is also a header file called useful_vec.h which provides most of the same
functions in TF-M style naming rather than the camel case used by UsefulBuf.
QCBOR
Only the CBOR encoder is needed; there is no decoding.
Just about all SW layers in the stack call the CBOR encoder for some
reason or other.
ECDSA signing and key
The psa_asymmetric_sign() API is called to perform the ECDSA signature.
TODO: What key slot is used, how the key gets into that key slot and what
the key ID will be? This is an open issue for both the fully-secured key
provisioned in the factory and for the global ECDSA debug and test key.
TODO: Figure out how to use psa_export_public_key() to get the ECDSA
public key out so it can be formatted as a COSE Key and hashed to turn it into
the kid protected header.
Hash functions
The psa_hash_start(), psa_hash_update() and psa_hash_finish() functions are used.
SHA-256 is the only function that will be initially hooked up, but the general architecture
will allow for use of others.
COSE Sign 1
This is one of the main parts of the SW stack. It is purpose-built primarily for this
use case, not part of some general full COSE implementation. It only supports
the COSE Sign 1 format. This is the format for signing something when there is
just one signer.
It calls the above-mentioned layers -- UsefulBuf, QCBOR, ECDSA signing and hashing.
It has a simple interface. First q_cose_sign1_init() is called to get things started.
The above layer will have created a CBOR encoding context that is passed to q_cose_sign1_init().
After that q_cose_sign1_init() call, data is added to the CBOR encoding context. When all
the payload is complete, q_cose_sign1_finish() is called and this is when
all the core work of hashing and signing is done.
It uses COSE integer algorithm IDs in its interface.
It supports some debug and test modes that are runtime selectable.
It allocates no memory, but it does need 300 or so bytes of stack (If this
is too much, it could be allocated differently without too much trouble).
The EAT/CWT Library
This is a thin layer over COSE Sign 1 that mostly manages the QCBOR encoder
context. It can be used to create an EAT token or a CWT token.
Its two main functions are start() and finish(); they must always be called.
In between them are the calls to add to the payload that is to be signed. There are
three ways to build up the payload.
- Call individual functions to add integers and strings. This allows
only very simple payloads, with no nesting.
- Add already-formatted CBOR to the payload. This requires encoding
the CBOR in a separate buffer first and thus needs two copies
of the payload in memory.
- Borrow the CBOR encoding context and use it. This allows
any CBOR structure and doesn't need a separate copy of the encoded
CBOR in memory.
This layer doesn't add any claims or payload.
This layer allocates no memory and uses very little stack over and above that
used by the COSE layer. It writes its output to the buffer passed in to
hold the completed token and doesn't need any internal temporary copies
of the payload.
The Main Service Layer
This implements the public interface that is called from the non-secure side.
This allocates the buffer into which the token is to be written. This is the
one and only big chunk of memory needed in this SW stack.
This starts up the EAT/CWT library.
This fetches the various claims that make up the payload and adds them
in. The CBOR type of each claim is noted. Note that in CBOR integers can be any
size from 8 to 64 bits; similarly, binary blobs can be any length.
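The variable-size integer encoding works because a CBOR encoder always picks the shortest form that holds the value. The following sketch (illustrative only, not QCBOR itself) shows the encoding of an unsigned integer's head per the CBOR specification:

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch (not QCBOR) of why CBOR integers take 1 to 9
 * bytes on the wire: the shortest form that holds the value is chosen.
 * Writes to out (which must hold 9 bytes) and returns the byte count. */
static size_t cbor_encode_uint(uint64_t val, uint8_t out[9])
{
    if (val < 24) {                 /* value fits in the initial byte */
        out[0] = (uint8_t)val;
        return 1;
    } else if (val <= UINT8_MAX) {  /* 0x18: 1-byte argument follows */
        out[0] = 0x18;
        out[1] = (uint8_t)val;
        return 2;
    } else if (val <= UINT16_MAX) { /* 0x19: 2-byte big-endian argument */
        out[0] = 0x19;
        out[1] = (uint8_t)(val >> 8);
        out[2] = (uint8_t)val;
        return 3;
    } else if (val <= UINT32_MAX) { /* 0x1a: 4-byte big-endian argument */
        out[0] = 0x1a;
        for (int i = 0; i < 4; i++)
            out[1 + i] = (uint8_t)(val >> (24 - 8 * i));
        return 5;
    } else {                        /* 0x1b: 8-byte big-endian argument */
        out[0] = 0x1b;
        for (int i = 0; i < 8; i++)
            out[1 + i] = (uint8_t)(val >> (56 - 8 * i));
        return 9;
    }
}
```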
The planned claims are as follows:
- Challenge / nonce: a binary blob, input from the non-secure side. This is a
standard EAT claim.
- Client ID: identifies the caller on the non-secure side that requested the
token. It comes from tfm_core_get_caller_client_id() and is a signed integer.
This is an ARM-proprietary claim.
- Boot/debug state: input from the non-secure side.
TODO: determine what this is, its format and such. The EAT
draft defines a simple enumeration type for boot and debug
state that has no measurements. There are also some claims,
not yet in the EAT draft, for measurements. They are separate.
This claim may be proprietary or not.
- Boot seed: comes from tfm_plat_get_boot_seed(). It is an unsigned integer.
This is a proprietary ARM claim.
- Device ID: comes from tfm_plat_get_device_id() and is a byte array. It is an
EAT standard claim compatible with the UEID in the EAT draft.
Algorithm IDs and Keys
Algorithm IDs
The interfaces to the layers described above use COSE algorithm IDs to
indicate the hash function, ECDSA signing algorithm and so on. These
are simple integers, so they are easy and efficient to handle.
Because they have to go into the encoded CBOR anyway as part
of the COSE specification, this is efficient.
At the lowest layer, when the operation is actually performed, mapping
to the PSA algorithm IDs happens.
Most COSE algorithm IDs are like cipher suites. For example
-7 represents ECDSA with the P256 curve and SHA-256
as the hash.
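A minimal sketch of such a suite mapping, assuming made-up struct and function names (the only facts taken as given are the COSE IDs -7 and -35 and what they stand for):

```c
#include <stdbool.h>

/* Illustrative mapping from a COSE algorithm ID to the suite details an
 * implementation needs. Struct and names are made up for this sketch. */
struct sig_suite {
    const char *curve;   /* ECDSA curve name */
    const char *hash;    /* hash algorithm name */
    int         sig_len; /* raw signature length in bytes */
};

static bool cose_alg_to_suite(int cose_alg_id, struct sig_suite *out)
{
    switch (cose_alg_id) {
    case -7:  /* ES256: ECDSA with P-256 and SHA-256 */
        *out = (struct sig_suite){ "P-256", "SHA-256", 64 };
        return true;
    case -35: /* ES384: ECDSA with P-384 and SHA-384 */
        *out = (struct sig_suite){ "P-384", "SHA-384", 96 };
        return true;
    default:  /* unrecognized or unsupported suite */
        return false;
    }
}
```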
Not every algorithm ID needed for this implementation is registered
with IANA yet, so some are defined in the proprietary range.
The IANA registry is here: https://www.iana.org/assignments/cose/cose.xhtml
Key Select
The first three bits of the option flags for the public interface
are a key select with possible values of 0 through 7. They
select both the key and the signing algorithm at the same time
since keys are typically usable with a particular algorithm.
Value 0 is defined as the standard 256-bit key used with
ECDSA P256 and the SHA-256 hash. It selects
COSE algorithm ID -7.
Value 7 is defined as the global test key with ECDSA
P256 and the SHA-256 hash. It is for debug and test
only. It selects COSE algorithm ID -7.
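The mapping from key select value to COSE algorithm ID can be sketched as below; the function name is made up, and only values 0 and 7 are defined by this document (both selecting ES256, COSE ID -7):

```c
/* Illustrative mapping from the 3-bit key select value to the COSE
 * algorithm ID it implies. Only values 0 (factory-provisioned key) and
 * 7 (global debug/test key) are currently defined; both use ECDSA
 * P-256 with SHA-256, i.e. COSE algorithm ID -7 (ES256). */
static int key_select_to_cose_alg(unsigned key_select)
{
    switch (key_select) {
    case 0: /* standard 256-bit key */
    case 7: /* global debug and test key */
        return -7;
    default:
        return 0; /* unassigned / unsupported */
    }
}
```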
Here are a few examples of possible values that might be
assigned later. Value 1 could be ECDSA P384 with SHA-384, corresponding
to COSE algorithm ID -35. Value 2 could be a privacy-preserving ECDAA
signing scheme.
Key ID
There must be a COSE key ID (kid) header so the verifier of
the signature knows what public key to use to perform the verification.
In COSE the kid is a binary blob that can be almost anything.
This implementation will use the SHA-256 hash of the public
key encoded as a COSE Key per the COSE specification. This is
preferred to RFC 5280 SKI because it is a pure CBOR/COSE
solution without any ASN.1 or other legacy formats and because
it is simple.
Debug and Test
This design is such that a near-complete test can be performed
from the non-secure side on a commercially deployed product.
This adds a little size to the code, but is considered worthwhile.
It is also desirable to be able to test this stack on devices that do
not have the attestation key or other data items (claims) configured.
Last, it is desirable to have a non-secure test mode for commercial developers
of attestation that does not require access to a service or key material. This
mode is unsecured and has no commercial value.
The following modes facilitate the above.
Short Circuit Signing
In this mode the actual signature algorithm doesn't run at all.
Instead, the signature is just a concatenation of one or more copies of the hash of the
data that would be input to the signing algorithm. Multiple copies of the hash
are concatenated so the size is the same as the real signature would be.
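This construction is simple enough to show directly; the function name here is made up for the sketch:

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative short-circuit "signature": repeat the hash of the
 * to-be-signed data until the output is the size of a real signature,
 * e.g. a 32-byte SHA-256 hash repeated twice to fill the 64 bytes of
 * an ECDSA P-256 signature. Provides no security; for test only. */
static void short_circuit_sig(const uint8_t *hash, size_t hash_len,
                              uint8_t *sig, size_t sig_len)
{
    for (size_t i = 0; i < sig_len; i++) {
        sig[i] = hash[i % hash_len]; /* wrap around the hash bytes */
    }
}
```

A verifier in a test harness recomputes the hash and checks each repetition, which exercises the whole hashing path without any key material.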
This allows testing of the full stack without any public key cryptography
or any keys configured, or even without any public key cryptography
wired up at all.
It is also deterministic, unlike some signing algorithms that use
a random number in the computation of the signature so the
actual signature bits vary even for the same key and the same
payload. The determinism can make some testing easier.
This is flag 0x80000000.
Global Debug Key
Select this key by giving 7 as the key select value of the options bits.
This 256-bit ECDSA key is entirely fixed and hard-coded into the implementation.
Both the private and public parts of the key are globally published
so they can be used for debug and test.
It is highly useful for debug and test, both of the TF-M attestation
implementation and for users of TF-M attestation to test their
end-end system all the way back to their server.
Use of this key provides no security. An attacker has access to
the private key so they can make any token they want with it.
It is important to always note this key provides no security.
Claim exclusion flag
This debug mode flag excludes all claims from the token except
the nonce / challenge. This allows for binary comparison of a token to
its expected value for testing and will result in the same exact
token every time on every device for every configuration.
This is flag 0x40000000.
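The option word layout described in this section can be sketched as follows; the flag values and the 3-bit key select come from this document, while the macro and function names are made up:

```c
#include <stdbool.h>
#include <stdint.h>

/* Illustrative decoding of the public-interface option word: bit 31
 * requests short-circuit signing, bit 30 excludes all claims except
 * the challenge, and the low three bits select the key. The names are
 * made up; the bit values are those given in this document. */
#define TOKEN_OPT_SHORT_CIRCUIT_SIGN 0x80000000u
#define TOKEN_OPT_OMIT_CLAIMS        0x40000000u
#define TOKEN_OPT_KEY_SELECT_MASK    0x00000007u

static bool opt_short_circuit(uint32_t options)
{
    return (options & TOKEN_OPT_SHORT_CIRCUIT_SIGN) != 0;
}

static bool opt_omit_claims(uint32_t options)
{
    return (options & TOKEN_OPT_OMIT_CLAIMS) != 0;
}

static uint32_t opt_key_select(uint32_t options)
{
    return options & TOKEN_OPT_KEY_SELECT_MASK;
}
```

A fully constant test token would be requested by combining both mode flags in one option word, e.g. 0x80000000 | 0x40000000.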
The most basic test will use the claim exclusion flag and short-circuit
signing to produce a token that is constant. It will be compared
byte-for-byte against the expected result. No CBOR decoder is needed
to implement this.
The next test will use the CBOR decoder to take apart the token. It will
include all the claims. It will check for their presence, but not their
values, since these will vary from device to device. It will use short-circuit
signing and will compute all the hashes to verify the short-circuit
signature. This test will be fairly complicated.
The next test is a bit TBD, but it is planned to use the Global Debug
Key and actually call ECDSA signature verification on the device. It
will not verify the claims values.
If possible, some tests may be added that will verify the claims values
in the encoded CBOR.
Some adversarial tests:
- Select a key type that is not supported
- Pass in output buffer that is too small
- Pass in a challenge/nonce that is too large