This would be visible in the firmware source. [ ... ] With deterministic builds, everybody can check the firmware. That does not mean that everybody HAS to. If 3 of 5 decided to sign something malicious, then the rest of the guys would be whistle-blowing and everybody would know. [ ... ] I was talking about proving that there is a backdoor. As I argued above, if there is one, you should be able to find it in the open-source code. It should be easy to prove.
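To be concrete, "check the firmware" means: build the published source yourself and compare the result byte for byte against the signed binary distributed for the device. A minimal sketch; the file names and the fixed-size signature header are assumptions for illustration, not the actual Trezor image layout:

```c
/* Minimal sketch: compare a locally built firmware image against the
 * distributed one.  File names and the idea of a fixed-size signature
 * header are illustrative assumptions, not the real Trezor layout. */
#include <stdio.h>
#include <stdlib.h>

#define HEADER_SKIP 256L  /* assumed size of the vendor signature header */

static long read_all(const char *path, unsigned char **buf) {
    FILE *f = fopen(path, "rb");
    if (!f) return -1;
    fseek(f, 0, SEEK_END);
    long len = ftell(f);
    rewind(f);
    *buf = malloc((size_t)len);
    if (!*buf || fread(*buf, 1, (size_t)len, f) != (size_t)len) { fclose(f); return -1; }
    fclose(f);
    return len;
}

int main(void) {
    unsigned char *mine, *theirs;
    long a = read_all("firmware-local.bin", &mine);    /* built from the github source */
    long b = read_all("firmware-signed.bin", &theirs); /* downloaded signed binary */
    if (a < 0 || b < 0 || a != b - HEADER_SKIP) {
        puts("size mismatch -- build is not reproducible");
        return 1;
    }
    for (long i = 0; i < a; i++) {
        if (mine[i] != theirs[i + HEADER_SKIP]) {
            printf("byte %ld differs -- binaries do not match\n", i);
            return 1;
        }
    }
    puts("binaries match: the signed firmware corresponds to the published source");
    return 0;
}
```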
There is a firmware source posted on github. There is a firmware binary in each client's Trezor. Note the indefinite articles. Can you see the problem now?
Come on guys, this vulnerability is not my entry for the Nobel Prize, it is an absolutely trivial and well-known observation. If someone can get a malicious version of the firmware signed, he can easily trick many clients into installing it.
Hackers can even trick many users into installing an unsigned malicious version of the firmware and re-entering the recovery seeds. Do I have to spell out the details?
Trezor now has 16500 lines of code in *.c files and another 7000 in *.h files. That is the total for the bootloader and firmware, and I may have included some testing and GUI code as well that does not run on the device, so the real figure is even lower. And this includes many features discussed here that are not yet released. I don't see it getting to 100000 any time soon. Given that some of the code is imported from other open-source libraries, the Trezor-specific code is even smaller.
We will see in a couple of years. Judging by the mood of this thread, the Trezor will soon be storing your gaming site passwords, your calorie counts, your dog's gym workout schedule, ...
(The Brazilian voting machine software was very small at the beginning, too.)
Meanwhile, how long do you think it would take for one person to review 20,000 lines of code and make sure that it has no weaknesses (like a broken random number generator, or a line somewhere that sticks the private key into the signed transaction that is sent out to the infected computer)?
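To make that concrete, here is the kind of one-liner a reviewer would have to catch. All names (tx_t, sign_input, k_nonce, ...) are made up for illustration; the point is only that a single innocuous-looking memcpy can leak the key:

```c
/* Hypothetical illustration of a one-line backdoor; not real Trezor code. */
#include <stdint.h>
#include <string.h>

typedef struct {
    uint8_t change_script[32];   /* hypothetical spare field in the serialized tx */
} tx_t;

void sign_input(tx_t *tx, const uint8_t priv_key[32], uint8_t k_nonce[32]) {
    /* ... normal ECDSA signing would happen around here ... */

    /* The "bug": instead of a random nonce, use the private key itself.
     * The signature still verifies, but anyone who knows the trick can
     * recover the key from the publicly broadcast transaction. */
    memcpy(k_nonce, priv_key, 32);

    /* Or, even cruder: tuck the key bytes into a field nobody inspects. */
    memcpy(tx->change_script, priv_key, 32);
}
```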
I asked earlier whether the hardware has some sort of memory protection that would prevent one function from accessing data areas of an unrelated function, but got no answer. If it doesn't, the dog workout code will have access to the bitcoin private keys; therefore that code, and every modification to it, must be verified with the same care that is spent on the bitcoin code proper. Worse still if the firmware can modify itself.
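For what it is worth, the STM32F2 that the Trezor reportedly uses is a Cortex-M3, which as far as I know includes the optional MPU, so such protection is at least possible in principle; whether the firmware actually configures it is exactly my question. A CMSIS-style sketch of what that could look like, with a hypothetical address for the SRAM page holding the seed (this is not the actual firmware):

```c
/* Sketch only: lock the page holding the secrets so that unprivileged code
 * (everything except the crypto core, under this assumption) cannot touch it.
 * The device header and SEED_REGION_ADDR are assumptions for illustration. */
#include "stm32f2xx.h"                  /* assumed device header providing MPU, CMSIS */

#define SEED_REGION_ADDR  0x2001F000u   /* hypothetical 4 KiB SRAM page */

void protect_seed_region(void) {
    __DMB();
    MPU->CTRL = 0;                               /* disable MPU while configuring */

    MPU->RNR  = 0;                               /* configure region 0 */
    MPU->RBAR = SEED_REGION_ADDR;                /* region base, 4 KiB aligned */
    MPU->RASR = (0x1u << MPU_RASR_AP_Pos)        /* AP=001: privileged only, no unprivileged access */
              | (11u  << MPU_RASR_SIZE_Pos)      /* SIZE=11 -> 2^(11+1) = 4 KiB */
              | MPU_RASR_C_Msk | MPU_RASR_S_Msk  /* normal memory attributes for internal SRAM */
              | MPU_RASR_XN_Msk                  /* never execute from this page */
              | MPU_RASR_ENABLE_Msk;

    /* PRIVDEFENA keeps the default memory map for privileged code, so only
     * the explicitly protected region is locked down for everything else. */
    MPU->CTRL = MPU_CTRL_PRIVDEFENA_Msk | MPU_CTRL_ENABLE_Msk;
    __DSB();
    __ISB();
}
```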
I'm not saying malicious firmware cannot be signed. I'm saying it cannot be signed without people knowing. And installing the unsigned one is of course possible as well, but that cannot be done without the user knowing it. If the user is warned and decides to install it anyway, then it is his problem. I did not say it is impossible, though.
The Trezor may store your game passwords and other passwords, provided they are derived from the same seed. In fact it can do that already with its 20000 lines of code. You are exaggerating with the other "use cases". It's not going to happen.
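Roughly, "derived from the same seed" could work like this: one HMAC per site label, so nothing new has to be stored on the device. Sketch only; the hmac_sha256 prototype is assumed (modeled loosely on the helper in trezor-crypto), and this is not the actual Trezor password scheme:

```c
/* Sketch: deterministic per-site secrets from one master seed. */
#include <stdint.h>
#include <string.h>

/* Assumed to exist in the crypto library; prototype is an assumption. */
void hmac_sha256(const uint8_t *key, uint32_t keylen,
                 const uint8_t *msg, uint32_t msglen,
                 uint8_t *out32);

/* Same seed + same label ("example.com", "game-login", ...) => same secret,
 * so the device never needs to store the individual passwords. */
void derive_site_secret(const uint8_t seed[64], const char *label,
                        uint8_t out32[32]) {
    hmac_sha256(seed, 64, (const uint8_t *)label, (uint32_t)strlen(label), out32);
}
```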
20000 lines of code can be verified for backdoors in a month or two. To fully understand all of it takes more time. The point is, it's possible for a single person, and people have done it.