edit: maybe they should test DASH also? - are we ready for something like this?
Our blocks would fill up the same, and the attack could be performed with fewer transaction charges incurred, but we have more headroom: we produce 4 times as many blocks, they are more empty than full, and there is a flood-spike preventative fail-safe with InstantX, which does not need confirming as such but would have to go into the chain at some point.
(btw, my record in testnet using IX flooding into only 1 block is 340 transactions using 1 client)
Block size puts a limit on transactions, but it also limits blockchain growth. The question is what happens to the blockchain with millions of transactions. It still isn't close to point-of-sale transactions in the trillions. DASH doesn't really look any better than BTC: transaction size is about the same (1 KB), and this is what makes the blockchain bloat when it scales up. See below.
For BTC
Currently, bitcoin can only do 2-4 transactions per second. So if 64 million people wanted to transact in bitcoin, they could each make only 1 transaction per year with the current block size.
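As a rough sanity check on that figure (just a sketch, using the low end of the quoted 2-4 tx/sec range):

```python
# Sanity check of the "one transaction per year" figure, using the low end
# of the quoted 2-4 tx/sec range for Bitcoin.
tx_per_sec = 2
tx_per_year = tx_per_sec * 60 * 60 * 24 * 365     # ~63 million tx/year
users = 64_000_000
print(f"{tx_per_year / users:.2f} tx per user per year")   # ~0.99
```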
If we look at banking transactions, this is the block size that would be needed for each:
US Bank Wires (150 million transactions) = 1.7 MB
SWIFT (international) wires (5 billion transactions) = 53.3 MB
ACH (19 billion transactions) = 202.4 MB
With a 202.4 MB block size, that would add 29 GB to the blockchain each day.
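For anyone who wants to check the arithmetic, here's a rough sketch that reproduces those figures. It assumes the wire/SWIFT/ACH counts are annual volumes, 144 Bitcoin blocks per day, and roughly 0.56 KB per transaction - those assumptions aren't stated above, they're just what makes the numbers line up (the US-wire figure comes out a touch lower than 1.7 MB).

```python
# Back-of-the-envelope reproduction of the BTC figures above.
# Assumptions (inferred, not stated in the post): annual transaction
# volumes, 144 Bitcoin blocks/day, ~0.56 KB average transaction size,
# and 1 MB = 1000 KB.
BLOCKS_PER_DAY = 144
BLOCKS_PER_YEAR = BLOCKS_PER_DAY * 365      # 52,560 blocks/year
TX_SIZE_KB = 0.56                           # assumed average tx size

annual_volumes = {
    "US bank wires": 150e6,
    "SWIFT wires": 5e9,
    "ACH": 19e9,
}

for name, tx_per_year in annual_volumes.items():
    tx_per_block = tx_per_year / BLOCKS_PER_YEAR
    block_mb = tx_per_block * TX_SIZE_KB / 1000
    print(f"{name}: {block_mb:.1f} MB per block")
# -> ~1.6, ~53.3, ~202.4 MB (the US-wire figure above rounds a bit higher)

# Daily chain growth at the ACH-scale block size:
print(f"{202.4 * BLOCKS_PER_DAY / 1000:.1f} GB per day")   # ~29.1 GB
```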
For DASH
OK, so looking at one of the peak transaction days, June 21st: 5K average block size (I assume this is kilobytes) and 2.75K transactions (I assume 2,750 transactions).
(5 KB/block × 576 blocks/day) / 2,750 tx/day ≈ 1.04 KB transaction size (~0.001 MB)
Block size needed with DASH for:
US Bank Wires (150 million transactions) = 0.74 MB
SWIFT (international) wires (5 billion transactions) = 24.7 MB
ACH (19 billion transactions) = 93.9 MB
Now let's look at the ACH transactions with DASH: 93.9 MB × 576 blocks/day, so the daily addition to the blockchain is 54 GB.
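Same arithmetic for DASH, sketched out under the same assumptions (annual volumes, 576 DASH blocks per day, and the per-transaction size derived above):

```python
# Same arithmetic for DASH: derive the average transaction size from the
# June 21st peak-day numbers quoted above, then scale to the banking volumes.
# Assumptions as before: annual volumes, 576 DASH blocks/day, 1 MB = 1000 KB.
BLOCKS_PER_DAY = 576
BLOCKS_PER_YEAR = BLOCKS_PER_DAY * 365           # 210,240 blocks/year

avg_block_kb = 5        # ~5 KB average block on the peak day
tx_per_day = 2750       # ~2.75K transactions that day
avg_tx_kb = avg_block_kb * BLOCKS_PER_DAY / tx_per_day   # ~1.047 KB (~1.04 above)

annual_volumes = {
    "US bank wires": 150e6,
    "SWIFT wires": 5e9,
    "ACH": 19e9,
}

for name, tx_per_year in annual_volumes.items():
    block_mb = tx_per_year / BLOCKS_PER_YEAR * avg_tx_kb / 1000
    print(f"{name}: {block_mb:.1f} MB per block")
# -> ~0.7, ~24.9, ~94.6 MB per block; the figures above use the per-tx size
#    rounded down to 1.04 KB, which gives 0.74 / 24.7 / 93.9.

# Daily chain growth at the ACH-scale block size:
print(f"{93.9 * BLOCKS_PER_DAY / 1000:.1f} GB per day")   # ~54 GB
```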
We need to find a way to cram more transactions into less data, cut out old transactions, or do something Dashingly smart.
I don't agree with your calculation, but I like the all-inclusive idea of capturing every transaction on the planet.
So, using 1 KB as the average transaction size and extending:
Total transaction size per day = [(0.15 + 5 + 19) × 10^9] × 1024 bytes = 24.7 terabytes per day!
With 576 blocks per day, that would mean every block size = 24.7 × 10^12 / 576 = 42.9 gigabytes per block required!!
Assuming 1 block = 1 MB:
42.9 GB / 0.001 GB = 42,882
So the scalability factor required to capture the entire transaction volume = a 42,882× size reduction!!!
(and that would mean each as-yet-unpruned transaction could only be 1024 / 42,882 ≈ 0.024 bytes long)
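Here's that arithmetic as a quick script - note it treats the 0.15 / 5 / 19 billion figures as per-day volumes and assumes 1 KB (1024 bytes) per transaction, which is where it differs from the earlier calculation:

```python
# Reproduces the arithmetic above. Note it treats the 0.15 / 5 / 19 billion
# transaction counts as PER-DAY volumes and assumes 1 KB (1024 bytes) per tx.
tx_per_day = (0.15 + 5 + 19) * 10**9                 # 24.15 billion tx/day
bytes_per_day = tx_per_day * 1024
print(f"{bytes_per_day / 1e12:.1f} TB per day")      # ~24.7 TB

blocks_per_day = 576
bytes_per_block = bytes_per_day / blocks_per_day
print(f"{bytes_per_block / 1e9:.1f} GB per block")   # ~42.9 GB

reduction = bytes_per_block / 1e6                    # vs a 1 MB block
print(f"{reduction:,.0f}x size reduction needed")    # ~42,933 (the 42,882 above
                                                     #  rounds to 24.7 TB first)
print(f"{1024 / reduction:.3f} bytes per tx")        # ~0.024 bytes
```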
It could be possible if a secondary master records database were created, the same addresses were reused, empty addresses were scrubbed, and only a limited number of new replacement addresses were allowed? ...