Of course they're not going to sell processors that only run enclaved code signed by them. That would indeed be silly. I'm saying that code that runs within an enclave will be impossible to reverse engineer without the private keys.
Ok, well I think that in principle it's not such a bad thing. It's exactly as I described earlier: this is a very powerful mechanism, open to both use and abuse. If you have good evidence to trust your hardware manufacturer and your OS (...and your other software), then it's actually highly resistant to malevolent state actors. This category of innovation therefore has the potential to safeguard your digital privacy in a way that's as close to absolute as there has ever been (given what we now know about the past). Perhaps SGX itself will turn out to be conniving in the extreme; we'll find out in time. But Intel would do themselves commercial harm by doing this too overtly, and I strongly suspect the barriers to entry in the processor design/manufacture market will get lower and lower as we go through the 2020s. Imagine 3D printing your own processor design; it will happen at some point in our lifetime.
The thing is, I DON'T trust my OS, or my other software. I trust that either I or others have reverse engineered it to check that it doesn't do anything malicious. Just like I don't actually trust that a bank will return my deposit; I trust that I can take them to court if they don't.