
Topic: Raiblocks but with node-pooling.

newbie
Activity: 2
Merit: 0
January 13, 2018, 11:09:41 PM
#2
Bump! Not enough expertise here (yet?), but at least it's constructive thinking. I hope 'they' hear you :)
newbie
Activity: 20
Merit: 0
December 26, 2017, 02:10:16 PM
#1
An issue Rai has is that each full node has to experience the full brunt of the network by itself. Right now this is not a problem, since Rai's network is handling around 0.2 tps, which the average full node manages easily. But what happens when network throughput rises to 10 tps, 100 tps, 1,000 tps, 10,000 tps? Ledger pruning can keep the total size of the ledger down to something storable, but if the ledger is changing 10,000 times every second, how do you expect the average full node to download all of those changes? Each transaction adds about 1 KB to the ledger, which means 10,000 tps would require full nodes to reserve roughly 10 MB/s of downstream bandwidth just for their Rai node. Keep in mind that 10,000 tps could be generated by a single attacker willing to spend the money/resources.
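
As a back-of-the-envelope check of those numbers (assuming the ~1 KB per transaction figure above and ignoring protocol overhead):

Code:
# Rough per-node bandwidth needed just to keep up with the ledger,
# assuming ~1 KB of broadcast data per transaction (figure from the post).
TX_SIZE_BYTES = 1000

for tps in (0.2, 10, 100, 1_000, 10_000):
    bps = tps * TX_SIZE_BYTES
    print(f"{tps:>8} tps -> {bps / 1e6:6.2f} MB/s downstream, "
          f"{bps * 86400 / 1e9:7.1f} GB/day")

At 10,000 tps that works out to the ~10 MB/s (and on the order of 860 GB/day) mentioned above.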

Possible solution: Gimp the network to around 100 tps to protect full nodes.
Mechanism to achieve solution: increased PoW, transaction fees (rough PoW sketch below).
Problem with solution: While 100 tps is respectable throughput, it's not high enough for wide adoption. It would also make it impractical for phone/IoT devices to use Rai, and nanopayments would stop being economically attractive to users. A powerful attacker could still spam the network, just at a higher personal cost.
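
For intuition on the "increased PoW" option, here is a generic hashcash-style sketch (not Nano's actual work algorithm, threshold, or parameters): every extra bit of difficulty roughly doubles the work per block, which caps how many blocks a single device can publish per second.

Code:
import hashlib, os, time

def mine(data: bytes, difficulty_bits: int) -> int:
    # Hashcash-style work: find a nonce whose hash falls below a target.
    # Purely illustrative -- Nano's real work function and threshold differ.
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        h = hashlib.sha256(data + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(h, "big") < target:
            return nonce
        nonce += 1

# Each extra difficulty bit doubles the expected hashes per block,
# so per-sender throughput (blocks/s) falls exponentially.
start = time.time()
mine(os.urandom(32), difficulty_bits=20)   # ~2^20 hashes on average
print(f"one block of work took ~{time.time() - start:.2f}s on this machine")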

My solution: Unlock the tps limits and protect individual full nodes from drowning.
Mechanism to achieve solution: Allow full nodes to pool their resources to form a single virtual full node. Similar to mining pools, node pools would distribute the bandwidth/storage/processing load among their members so they can comfortably keep up with 10K tps of network throughput (see the sketch below for one way this could look).
Problems with solution: Pooling requires more upload bandwidth than lonewolfing, since you have to re-share everything you download with your pool members whenever they want it on demand.
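
One possible shape for such a pool; this is a minimal sketch under assumed rules (the account-hash sharding and the shard_for/PoolMember names are my illustration, not anything from the Rai codebase). Each member stores and serves only the account chains that hash into its shard, and answers requests for other accounts by asking the responsible peer, which is exactly where the extra upload bandwidth mentioned above comes from.

Code:
import hashlib

def shard_for(account: str, pool_size: int) -> int:
    # Hypothetical sharding rule: deterministically map an account to one member.
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:8], "big") % pool_size

class PoolMember:
    # One full node inside a hypothetical node pool. It keeps only the account
    # chains in its own shard; everything else is fetched from the peer
    # responsible for that shard when needed.
    def __init__(self, member_id: int, pool: list["PoolMember"]):
        self.member_id = member_id
        self.pool = pool          # shared list of all pool members
        self.ledger = {}          # account -> list of blocks (this shard only)

    def on_block(self, account: str, block: dict) -> None:
        # Only store blocks for accounts that fall into our shard.
        if shard_for(account, len(self.pool)) == self.member_id:
            self.ledger.setdefault(account, []).append(block)

    def get_chain(self, account: str) -> list:
        owner = self.pool[shard_for(account, len(self.pool))]
        if owner is self:
            return self.ledger.get(account, [])
        # Costs the owner extra upload bandwidth: it re-serves data on demand.
        return owner.ledger.get(account, [])

# Example: a 4-member pool acting as one "virtual" full node.
pool: list[PoolMember] = []
pool.extend(PoolMember(i, pool) for i in range(4))
for m in pool:
    m.on_block("xrb_example_account", {"type": "send", "amount": 1})
print(pool[0].get_chain("xrb_example_account"))

Each incoming block only has to be stored by one member, so per-member storage and validation load scales down with pool size, at the cost of the on-demand re-serving shown in get_chain.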

Thoughts?
