Bitcoin has a fair number of probabilistic variables built into it, and the wallet's job is to account for that fact.
This has been the stock answer of the small-blockian Core devs to the issue of usability under saturated conditions: "the wallet can be programmed to compute the right fee, and adjust it (with RBF) if needed."
Their faith in computers is moving, but the wallet cannot compute an answer if it does not have the necessary data. What is the proper fee to get my transaction confirmed in less than 1 hour? Well, it depends on the fees paid by the transactions that are already in the queue, and by the transactions that will be issued by other clients in the next hour. Since the latter are running "smart wallets" too -- perhaps the exact same wallet that I am running -- the problem that the wallet has to solve is basically "choose a number that is very likely to be greater than the number that you are going to choose".
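That self-referential character can be seen in a toy simulation (entirely my own illustration, with made-up numbers): suppose every wallet runs the same estimator, "bid slightly above the current 90th-percentile fee in the queue". When all wallets apply it at once, the estimate chases itself upward:

```python
# Toy model: 100 pending transactions, all paying 10 units/byte.
# Every wallet bids 10% above the 90th-percentile fee it observes;
# since all wallets do this simultaneously, fees escalate geometrically.
fees = [10.0] * 100                       # current queue fees (made-up units)
for round_no in range(5):
    q90 = sorted(fees)[89]                # observed 90th-percentile fee
    fees = [q90 * 1.1 for _ in fees]      # everyone outbids the estimate
    print(round_no, round(fees[0], 2))
```

After five rounds every wallet is paying 10 × 1.1^5 of the starting fee, and no wallet is any closer to the front of the queue than when it started. The numbers and the 10%-markup rule are arbitrary; the point is only that an estimator whose output feeds back into its own input has no stable answer.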
If the network is well below saturation, computing the fee is trivial. If you pay just the minimum fee, your transaction will get confirmed in the next few blocks. A larger fee will have an effect only if the backlog of unconfirmed transactions becomes greater than 1 MB. This condition may be caused by an extra-long delay since the previous block, or by the miners solving one or more empty blocks due to "SPV mining" and network delays. As long as the network is not saturated, these backlogs are short-lived (lasting a couple of hours at most) and have a predictable distribution of sizes, frequencies, and durations; and many users will not mind the delays that they cause, so they will use minimum fees anyway. Therefore, if I need to have my transaction confirmed as soon as possible, I need to pay only a few times the minimum fee. Those who believe in "smart wallets" seem to be thinking about this situation only.
However, if the network becomes saturated, the situation will be very different. If the average demand T (transactions issued per second) is greater than the effective capacity C of the network (2.4 tx/s), there will be a long-lasting and constantly increasing backlog. If the daily average T is close to C but still less than C, such persistent "traffic jams" will occur during the part of the day when the traffic is well above average. In that case the backlog may last for half a day or more. If the daily average T itself is greater than C, the traffic jam will last forever -- until enough users give up on bitcoin and the daily averaged T drops below C again.
In both cases, while the current traffic T is greater than C, the backlog will continue growing at the rate T - C. If and when T drops below C again, the backlog will still persist for a while, and will be cleared at the rate C - T. In those situations, the frequency and duration of the traffic jams will be highly variable: a slightly larger demand during the peak hours may cause the jam to last several days longer than usual.
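To put rough numbers on that (the capacity C = 2.4 tx/s is from above; the peak and off-peak demand figures are my own hypothetical choices):

```python
# Backlog arithmetic: grows at T - C during the jam, clears at C - T after.
C = 2.4                              # effective capacity, tx/s (from the post)
T_peak, peak_secs = 3.0, 6 * 3600    # assumed: 6-hour peak at 3.0 tx/s
T_off = 2.0                          # assumed: off-peak demand, tx/s

backlog = (T_peak - C) * peak_secs   # transactions piled up during the peak
clear_secs = backlog / (C - T_off)   # time to drain it once demand drops
print(round(backlog), round(clear_secs / 3600, 1))
```

With these numbers, a 6-hour peak only 25% above capacity leaves a backlog of about 13,000 transactions that takes another 9 hours to clear; and if the next day's peak starts before the previous backlog is gone, the jam carries over. This is why a slightly larger peak demand can stretch a jam by days.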
In those conditions, choosing the right fee will be impossible. As explained above, the "fee market" that is expected to develop when the network saturates will be a running semi-blind auction for the N places at the front of the queue, where new bidders are coming in all the time, and those who are already in the hall may raise their bids unpredictably. There cannot be an algorithm to compute the fee that will ensure service in X hours, for the same reason that there is no algorithm to pick a winning bid in an auction.
But the small-blockian Core devs obviously do not understand that.
Not to mention that the "fee market" would be a radical change in the way that users are expected to interact with the system. As bitcoin was designed, and has operated until recently, the user was supposed to prepare the transaction off-line, then connect to a few relay nodes (even just one), send them the transaction, and disconnect from the network again. That will not be possible once the network gets saturated, or close to saturation. The wallet will have to connect to several relay nodes before assembling the transaction, in order to get information about the state of the queue. Since nodes can have very different "spam filters", the wallet cannot trust just one node, but will have to check a few of them and merge the data it gets. After sending the transaction, the wallet must remain connected to the network until the transaction is confirmed, periodically checking its progress in the queue and replacing it with a higher-fee version as needed. The client will have to provide the wallet in advance with parameters for that process (the desired max delay X and the max fee F), and/or be ready to authorize further fee increases. From the user's viewpoint, that is a completely different way of using the system.
The small-blockian Core devs do not seem to see this as a significant change. Or even realize that the "fee market", from the client's perspective, will be the most radical change in the system since it was created.
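The query-and-merge step described above could be sketched like this (a minimal illustration; the function name, the dict-of-fees representation, and the sample data are all mine, and a real wallet would be fetching these views over the network):

```python
# Sketch: merge mempool views from several relay nodes. Because each node
# may apply a different "spam filter", no single view can be trusted to be
# complete; the wallet must take the union of what the nodes report.
def merge_mempools(views):
    """views: list of {txid: fee_rate} dicts, one per queried node.
    Returns the union, keeping the highest fee rate reported per txid."""
    merged = {}
    for view in views:
        for txid, fee in view.items():
            merged[txid] = max(fee, merged.get(txid, 0.0))
    return merged

node_a = {"tx1": 5.0, "tx2": 12.0}            # node A's view of the queue
node_b = {"tx2": 12.0, "tx3": 1.0}            # node B kept tx3; A filtered it
print(merge_mempools([node_a, node_b]))
```

Even this trivial version shows the cost: the wallet now needs live connections to several nodes *before* it can even choose a fee, where previously it needed none.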
So, Adam, where is the "fee market" BIP?
And they do not seem to be aware that the fee market will cause a large jump in the internet traffic load on the relay nodes. Once the "smart wallets" become the norm, each transaction will require at least one additional client-node access (to get the queue state), possibly several; and more accesses to monitor its progress. So the fee market will certainly harm the nodes a lot more than a size limit increase would.
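A back-of-envelope estimate (all the per-transaction access counts below are my own assumptions, for illustration only):

```python
# Extra relay-node load if every transaction needs queue-state queries
# plus progress checks. C = 2.4 tx/s is the capacity figure from the post.
C = 2.4                          # effective capacity, tx/s
tx_per_day = C * 86400           # ~207k transactions/day at full capacity
accesses_per_tx = 3 + 4          # assumed: 3 queue queries + 4 progress checks
extra_requests = tx_per_day * accesses_per_tx
print(round(extra_requests))     # extra daily requests across relay nodes
```

That is on the order of 1.5 million extra requests per day under these assumptions, on top of the transaction relaying the nodes already do; and unlike a block size increase, this load grows with every transaction whether or not it ends up paying a higher fee.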
In fact, it seems that the small-blockian Core devs do not want to understand those problems. I have pointed them out several times to several of them, and they just ignored me.
Hence the theory that they want bitcoin to become unusable as a payment system, so that all users are forced to use off-chain solutions...