A lot of these are microtransactions (e.g., under a dollar's worth of coins) used for wagering
Actually, a lot of them are not really "used for wagering"; they're (ab)used as a way to send messages to the client the wagerers are using. Part of the point of the payment protocol work is to give sites like SatoshiDice a messaging system that doesn't require assigning meaning to magic quantities of money.
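As a rough illustration of the "magic quantities" pattern (the specific values and meanings below are hypothetical, not SatoshiDice's actual scheme): the service encodes an outcome in the satoshi value of a tiny output, and the wallet decodes it by inspecting the amount.

```python
# Hypothetical sketch of "magic quantity" messaging: an outcome is encoded in
# the satoshi value of a tiny output and decoded by the receiving wallet.
# The values and meanings here are made up for illustration.
MAGIC_AMOUNTS = {
    1: "bet lost",       # e.g. 1 satoshi returned could mean "you lost"
    2: "bet refunded",   # invented example code
}

def decode_magic_output(value_satoshis: int):
    """Return the message implied by a tiny output value, if any."""
    return MAGIC_AMOUNTS.get(value_satoshis)

print(decode_magic_output(1))  # -> "bet lost"
```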
By the way, SatoshiDice is (afaik) not using compressed pubkeys. That's because bitcoinj doesn't use them. SatoshiDice is open source; it uses a fork of the library. We can probably halve the amount of data it generates just by fixing that.
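For reference, the standard SEC encodings (a general rule, not SatoshiDice-specific code): an uncompressed public key is 0x04 followed by the 32-byte X and Y coordinates (65 bytes), while a compressed key keeps only X plus a one-byte parity prefix (33 bytes), so the key data roughly halves.

```python
# Sketch of SEC public key encodings.
# Uncompressed: 0x04 || X (32 bytes) || Y (32 bytes)  -> 65 bytes
# Compressed:   0x02 or 0x03 || X (32 bytes)          -> 33 bytes (prefix encodes Y's parity)

def compress_pubkey(uncompressed: bytes) -> bytes:
    assert len(uncompressed) == 65 and uncompressed[0] == 0x04
    x, y = uncompressed[1:33], uncompressed[33:65]
    prefix = b"\x02" if y[-1] % 2 == 0 else b"\x03"
    return prefix + x

# Dummy bytes (not a real curve point) just to show the size difference:
dummy = b"\x04" + b"\x11" * 32 + b"\x22" * 32
print(len(dummy), len(compress_pubkey(dummy)))  # -> 65 33
```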
But twelve months ago SatoshiDICE didn't exist, and nobody knows what 2013's breakaway Bitcoin success story will be. What if there are four of them, each as popular as SatoshiDICE is today? Then the block size limit (under the current restrictions) will be reached.
I fully agree we will reach the block size limit at some point, if only because people will come up with uses for micropayments beyond wagering. I think expecting a quadrupling of tx volume is optimistic, but hey, we're on the same side here. This is why I want to see the block size limit float.
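Some back-of-the-envelope numbers on what the 1MB cap allows (the average transaction size below is an assumption for illustration):

```python
# Rough capacity arithmetic under the 1 MB block limit.
MAX_BLOCK_BYTES = 1_000_000
AVG_TX_BYTES = 250        # assumed average transaction size
BLOCKS_PER_DAY = 144      # one block roughly every 10 minutes

txs_per_block = MAX_BLOCK_BYTES // AVG_TX_BYTES
txs_per_day = txs_per_block * BLOCKS_PER_DAY
txs_per_second = txs_per_day / 86_400

print(txs_per_block, txs_per_day, round(txs_per_second, 1))
# -> 4000 576000 6.7  (a ceiling of only a handful of tx/s under the current limit)
```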
At some point in the future, should Bitcoin become a mainstream thing, it's doubtful that anyone will be able to hold the entire blockchain.
Your calculations are useful if by "mainstream thing" you mean "one world currency used for everything", which could only conceivably occur in some Star Trek future. If that does occur, then yottabyte storage - for a few parties at least - is hardly inconceivable. I mean, there was a time when "hold the web in RAM" sounded absurd, yet multiple companies do it today. You don't need many parties to store the entire block chain, because all other nodes can prune and only have to handle the working set size.
The current blockchain size is about 4.5 GB. If we prune all spent outputs, what would the size be?
It's on the order of a couple hundred megs; I forget the exact size. Perfectly feasible to hold entirely in RAM for now.
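A back-of-the-envelope version of that estimate (both numbers below are assumptions to show the order of magnitude, not measurements):

```python
# Rough size of the pruned (unspent-outputs-only) set.
UNSPENT_OUTPUTS = 4_000_000   # assumed count of unspent outputs
BYTES_PER_OUTPUT = 50         # assumed average: outpoint + value + scriptPubKey

approx_mb = UNSPENT_OUTPUTS * BYTES_PER_OUTPUT / 1e6
print(approx_mb, "MB")  # -> 200.0 MB, i.e. "a couple hundred megs"
```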
That's a terrible way of deciding an important issue such as this. Miners are not a very important part of the Bitcoin economy, and they don't have much more understanding of Bitcoin than anyone else. Their "votes" shouldn't matter more than anyone else's. (There shouldn't be general voting at all, in fact -- democracy is a poor way of making decisions.)
You're right, I guess everyone should just do what I say instead. That's far more efficient. Good to hear you'll be on my side when the time comes.
Obviously, for a hard-forking change it's the economic majority that matters, not the miner majority. That's why I said maybe transaction version numbers should be adjusted instead/as well. The majority of participants, who have no opinion one way or the other, will just upgrade to whatever their client developers deem best.
Every node right now, to my knowledge, will reject any block coming in at over 1MB. Miners included.
This will shortly cease to be the case, because SPV clients are going to start receiving filtered blocks and so cannot check their size. Sooner or later the majority of all users will be on SPV clients. Whether this equates to the majority of economic activity is still uncertain - I'd hope most merchants run their own full node, but I honestly don't know what the makeup of Bitcoin-using merchants will look like over time.
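To see why a filtered-block client can't measure block size: it receives only the transactions matching its filter plus a Merkle branch, and checks inclusion against the Merkle root in the 80-byte header, roughly as in the sketch below (simplified - real clients must also handle Bitcoin's byte ordering and the duplicate-last-node rule for odd-sized levels):

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_merkle_branch(tx_hash, branch, is_right_node, merkle_root) -> bool:
    """Check that tx_hash is committed to by merkle_root.

    branch:        sibling hashes from the leaf level up to just below the root
    is_right_node: for each level, True if our running hash is the right-hand child
    """
    h = tx_hash
    for sibling, we_are_right in zip(branch, is_right_node):
        pair = (sibling + h) if we_are_right else (h + sibling)
        h = dsha256(pair)
    return h == merkle_root

# Toy usage: a two-transaction "block" whose root commits to tx_a.
tx_a, tx_b = dsha256(b"tx a"), dsha256(b"tx b")
root = dsha256(tx_a + tx_b)
print(verify_merkle_branch(tx_a, [tx_b], [False], root))  # -> True
```

Nothing in that check depends on seeing the rest of the block, which is exactly why an SPV node can't enforce a size limit on it.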
But in the case of removing the 1MB limit, I don't think the agents involved here will agree. Specifically, the miners. SatoshiDice would certainly be on board for 1GB blocks, as would BitPay, Gox, and I'd imagine most end users. But I think miners have an incentive to maintain the 1MB limit.
If all economic actors except some miners want the change, then it's just equivalent to a 51% "attack" - if you can get a majority of hash power to agree, then the rest of the miners have to go along with it or see no income at all.
However, I think that in the presence of a working alternative funding model, most miners would accept larger blocks. I don't think there's any advantage to artificially throttling Bitcoin's scalability if funding is not obtained via user-levied fees. I know gmaxwell argues strongly that allowing Bitcoin to scale would destroy its decentralization, but again, I disagree.
Anyway, these are old, tired debates. The answers will become clear nearer the time.