After running a series of experiments, Gavin Andresen confirmed that nodes can handle block sizes of up to 20 MB:
http://gavintech.blogspot.com/2015/01/twenty-megabytes-testing-results.html
That said, the 20 MB limit is really intended as a stopgap until a better solution is found. The original plan called for a block size limit that grew automatically by something like 1.4x per year (an arbitrary percentage with no relation to actual usage). That idea was eventually rejected because network bandwidth and computing power could plateau in the near future, in which case a fixed exponential schedule would outpace what nodes can actually handle (see Nielsen's Law of Internet Bandwidth and Moore's Law):
Link: http://github.com/gavinandresen/bitcoin-git/commit/5f46da29fd02fd2a8a787286fd6a56f680073770#commitcomment-11025926
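To see why a fixed percentage compounds so quickly, here is a minimal sketch of that rejected schedule. The 20 MB base and the exact 1.4x factor are illustrative assumptions taken from the numbers above, not from any actual proposal's parameters:

    # Illustrative only: how a fixed ~1.4x/year growth schedule compounds,
    # assuming a 20 MB starting limit (both values are assumptions).
    BASE_LIMIT_MB = 20.0     # hypothetical limit right after the hard fork
    ANNUAL_GROWTH = 1.4      # "something like 1.4x per year"

    def max_block_size_mb(years_after_fork: int) -> float:
        """Max block size in MB after the given number of years,
        assuming the limit compounds by ANNUAL_GROWTH each year."""
        return BASE_LIMIT_MB * (ANNUAL_GROWTH ** years_after_fork)

    if __name__ == "__main__":
        for year in range(11):
            print(f"year {year:2d}: {max_block_size_mb(year):10.1f} MB")

After ten years this reaches roughly 578 MB per block, which is the kind of number that only works out if bandwidth and hardware keep improving on schedule; if they plateau, the limit runs away from what nodes can handle.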
Hard-forking to a fixed 20 MB limit is a stopgap because a proper long-term scaling mechanism, one that adapts the block size limit to actual usage, hasn't been developed yet.