
Topic: Solving the problem of on-chain scaling - page 2.

member
Activity: 200
Merit: 73
Flag Day ☺
August 14, 2019, 01:36:05 PM
#22
Now explain why all the myriad altcoins which do exactly that are still a speck of dust in the wind compared to Bitcoin.  

Because of the price manipulators, price being all your simple mind can conceive.

Who said anything about price?  I'm talking about utility.  A currency I'm happy to receive as a payment because I have the capability to purchase tangible goods and services with it.  Other coins can't match the network effects of Bitcoin, which is something you continually underestimate the importance of.  

As for your other remark about non-mining nodes and security, I'm not going to waste my time repeating something to you that I've said enough times in the past (to far superior opponents, no less).  Suffice to say, you are wrong.  End of.  

Did you go into a grocery store and pay with bitcoin? Did you pay your electric bill with bitcoin?
Did you pay for your cup of coffee with bitcoin?

The answer is NO, because even credit cards charge lower fees than bitcoin's current transaction fees.

Bitcoin is a buy, hold and sell asset (for the non-stupid).
It has no useful utility as a payment service due to its fluctuations in price versus fiat and its fluctuating transaction fees.
Also, who has time to wait 30 minutes for 3 confirmations at a gas station?

* Your non-mining node serves no purpose except propping up your fragile ego and the notion that you matter; you don't.

newbie
Activity: 22
Merit: 151
August 14, 2019, 08:47:06 AM
#21
You can reduce a 1 GB blockchain to less than 1 KB.

You can, but only in very specific scenarios, such as a zip bomb or data containing long identical sequences.

I have to say "false until proven real" to your claim.

It looks like there are projects utilizing Zero-Knowledge Proofs, like Coda, aiming to solve the blockchain size problem (maybe at some computation cost). Isn't that a promising approach?

Sounds interesting, but AFAIK & IMO (after quick research) it's an alternative to light/SPV nodes (as compared with merkle trees, block headers, bloom filters and Neutrino).
Someone still needs to store the whole blockchain and send the relevant information with zk-SNARKs.

CMIIW.

My first impression was that they were trying to make a new light/SPV approach, but it seems they are not. If I got it right, ~20 kB is what the blockchain will look like on any validator, i.e. full node, and ZK magic will enable checking any transaction against that 20 kB (however, it looks like substantial computation power is needed to "pack" every new block via ZK).
Users only need to store private keys (and maybe some small metadata about their balances) on their devices, and the only entities which might need the full blockchain are API providers and blockchain explorers, for historical/analytical data.
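To make the constant-size idea concrete, here is a toy Python sketch of the data flow only: the verifier keeps a fixed-size summary (state commitment plus proof) no matter how many blocks it has seen. The proof here is a plain hash used as a stand-in, not a real zk-SNARK, and nothing below comes from Coda itself; treat it as a labelled assumption about the shape of the data, not the cryptography.
Code:
# Toy data-flow sketch only: NOT a real zk-SNARK and not Coda's protocol.
# It only shows that a verifier can hold a constant-size summary
# (state commitment + "proof") instead of the ever-growing chain.
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

def fake_proof(prev_summary: bytes, block: bytes) -> bytes:
    # Stand-in for "proof that block is a valid transition from the
    # previous state"; a real system would produce a recursive SNARK here,
    # which is where the heavy "packing" computation would go.
    return h(b"proof", prev_summary, block)

summary = h(b"genesis")
proof = b""
for height in range(100_000):            # 100k blocks of placeholder data
    block = f"block {height}".encode()
    proof = fake_proof(summary, block)
    summary = h(summary, block)          # verifier state stays 32 bytes

print(len(summary) + len(proof), "bytes held by the verifier")  # constant: 64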
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 14, 2019, 08:05:10 AM
#20
Now explain why all the myriad altcoins which do exactly that are still a speck of dust in the wind compared to Bitcoin.  

Because of the price manipulators, price being all your simple mind can conceive.

Who said anything about price?  I'm talking about utility.  A currency I'm happy to receive as a payment because I have the capability to purchase tangible goods and services with it.  Other coins can't match the network effects of Bitcoin, which is something you continually underestimate the importance of. 

As for your other remark about non-mining nodes and security, I'm not going to waste my time repeating something to you that I've said enough times in the past (to far superior opponents, no less).  Suffice to say, you are wrong.  End of. 
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 13, 2019, 05:04:56 AM
#19
Time is the flaw in your idea.  If you had a really slow blockchain where new blocks were only appended every hour or so, perhaps then it could handle extreme compression.  But the undeniable fact is that uncompressing such files takes time.  

It also places an additional burden on memory requirements, so those securing the chain would need more powerful hardware to perform that task.  If you limit the potential of who can secure your chain in that way, chances are, it won't be a very secure chain.

But if you could compress data and still have easy access to the data you required, at just a small computation cost roughly proportional to the amount of data required, it would work, no?

The idea of a blockchain is that the nodes are constantly receiving new data.  You won't have "easy access" to recently sent transactions in order to validate and relay them if it takes time to unpack them first.

Compression may be viable for the initial sync, when you start a node for the first time and have to download the entire blockchain, but not for live day-to-day running.
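For anyone who wants to put a number on that unpacking overhead, here is a rough Python sketch that times zlib decompression of a block-sized buffer. The repetitive placeholder payload is an assumption made purely so the data compresses at all; real block data behaves differently, so read the output only as an order-of-magnitude feel for the added latency on your own hardware.
Code:
# Rough timing of the decompression step a node would pay per block.
# The payload is synthetic and repetitive (an assumption) so that it
# actually compresses; real block data would compress far less.
import time
import zlib

payload = (b"txid:deadbeef value:50000 script:76a914...88ac " * 25_000)[:1_000_000]
packed = zlib.compress(payload, 6)

start = time.perf_counter()
unpacked = zlib.decompress(packed)
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"{len(payload):,} -> {len(packed):,} bytes compressed")
print(f"decompression took {elapsed_ms:.2f} ms, roundtrip ok: {unpacked == payload}")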
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 13, 2019, 03:51:04 AM
#18
Time is the flaw in your idea.  If you had a really slow blockchain where new blocks were only appended every hour or so, perhaps then it could handle extreme compression.  But the undeniable fact is that uncompressing such files takes time. 

It also places an additional burden on memory requirements, so those securing the chain would need more powerful hardware to perform that task.  If you limit the potential of who can secure your chain in that way, chances are, it won't be a very secure chain.
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 12, 2019, 11:04:10 AM
#17
You can reduce a 1 GB blockchain to less than 1 KB.

Uhhhh.... No.  You really can't.  You wouldn't be able to use it because the nodes would be too busy uncompressing the data to be able to validate the blocks in time.

All this "can't tell us about it" nonsense is just a smokescreen because you're out of your depth and you need an excuse to avoid explaining your outlandish claims in any detail.  
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 12, 2019, 04:49:57 AM
#16
The solutions are simple, the problem is political.

Hence your little propaganda campaign?


1. Increase BlockSize
2. Increase Blockspeed

Now explain why all the myriad altcoins which do exactly that are still a speck of dust in the wind compared to Bitcoin.  

How many times do you need it repeated?  DOING THOSE THINGS COMES AT A COST.  When those securing the chain are willing to pay that cost, they will do so.  Calling it a "political problem" is reductionism at best.  The problem is free will, market forces, economic incentive, network effects, consensus and the simple but undeniable fact that the vast majority of all those people across the globe currently securing the Bitcoin blockchain simply don't agree with you.  They've been given the option of running a client to increase to a 2 MB base + SegWit and they turned it down.  Most users on this chain didn't want the cost of the 2 MB base weight.  The icing on the cake is that you would also have declined to run that particular client, but only because you can't reconcile yourself with the Segregated Witness part, which just shows how far out of touch you are with what users on this chain want.

legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
August 11, 2019, 01:57:06 PM
#15
3. Increase the compression of the data within the block (will increase CPU overhead for Nodes.)

Have you heard about Compact Blocks? They significantly reduce the block size sent to other nodes (from 1 MB to 9-20 KB) by only including this information:

  • The 80-byte header of the new block
  • Shortened transaction identifiers (txids), designed to prevent Denial-of-Service (DoS) attacks
  • Some full transactions which the sending peer predicts the receiving peer doesn’t have yet

It won't work on IBD (initial block download) though.
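A back-of-envelope sketch of why the numbers land in that 9-20 KB range: per BIP 152 a compact block is roughly the 80-byte header plus an 8-byte nonce plus a 6-byte short ID per transaction, plus a few prefilled transactions. The transaction counts and prefilled size below are assumptions for illustration, and varint framing is ignored.
Code:
# Rough size estimate of a BIP 152 compact block (ignores varint framing).
HEADER_BYTES = 80        # block header
NONCE_BYTES = 8          # per-block short-ID nonce
SHORT_ID_BYTES = 6       # truncated SipHash per transaction

def compact_block_bytes(n_txs: int, prefilled_bytes: int = 300) -> int:
    # prefilled_bytes: assumed size of the few full txs sent up front
    return HEADER_BYTES + NONCE_BYTES + n_txs * SHORT_ID_BYTES + prefilled_bytes

for n in (1500, 2500, 3000):   # plausible tx counts for a full block
    print(f"{n} txs -> ~{compact_block_bytes(n) / 1000:.1f} KB vs ~1 MB full block")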
legendary
Activity: 1624
Merit: 2481
August 11, 2019, 06:31:43 AM
#14
1. Increase BlockSize
2. Increase Blockspeed
3. Increase the compression of the data within the block (will increase CPU overhead for Nodes.)

None of these would increase the scalability at all.

Increasing the blocksize or decreasing the block interval leads to a lot of additional problems, which are way more severe than the scalability problem.
This would be a (very) bad trade-off between TPS and decentralization.

Besides that, increasing the blocksize is just postponing the problem. This cannot be considered scaling. Linearly postponing a problem is not scaling.


A combination of on-chain and off-chain scaling is necessary. But your 3 points are all extremely bad. None of them would help BTC at all.



BTC devs have claimed onchain scaling is impossible, which is a lie.

They never did.

They do scale on-chain, with SegWit, Schnorr, etc.



They are promoting an off-chain system that is nothing more than banking 2.0 for crypto.

They do not promote anything.

Bitcoin is an open system. Everyone chooses for him or herself what to use and which way to go.
There is no central authority behind bitcoin.

Don't spread bullshit.



Which an 8 MB blocksize or moving to a 2½ minute block would have solved for years to come.

No.
We would have created way more problems, leading to centralization without even fixing the problem, just postponing it.



The smart thing would have been to do one of the above,

Actually it would be the dumbest thing to do.



As technology improves, the ways to increase on-chain scaling improve and can be done incrementally to match demand,
but like I said, there was nothing smart about the segwit upgrade,
just the completion of a payoff to Blockstream to fuck over bitcoin for the foreseeable future.

Are you for real?
Please stop spreading your worthless opinion.

No one cares what trolls like you, who don't have a clue at all, have to say.
member
Activity: 200
Merit: 73
Flag Day ☺
August 11, 2019, 04:50:58 AM
#13
The solutions are simple, the problem is political.

1. Increase BlockSize
2. Increase Blockspeed
3. Increase the compression of the data within the block (will increase CPU overhead for Nodes.)

BTC devs have claimed onchain scaling is impossible, which is a lie.
They are promoting an off-chain system that is nothing more than banking 2.0 for crypto.

The confusion is caused by BTC devs pointing at Visa's transaction capacity and saying BTC can only match it if it goes off-chain.

Now listen very carefully, here is the fucking secret:
BITCOIN DOES NOT NEED TO MATCH VISA, it only needs to be able to handle its own needs, which are light years less than Visa's.
Which an 8 MB blocksize or moving to a 2½ minute block would have solved for years to come.

The smart thing would have been to do one of the above,
but the smart thing was not done, and bitcoin is invalid as an on-chain transaction system because of it.

As technology improves, the ways to increase on-chain scaling improve and can be done incrementally to match demand,
but like I said, there was nothing smart about the segwit upgrade,
just the completion of a payoff to Blockstream to fuck over bitcoin for the foreseeable future.
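For what it's worth, here is a back-of-envelope check of the 8 MB / 2½-minute figures above. The ~250-byte average transaction size is an assumption (real averages vary); the point is only the rough order of magnitude of on-chain throughput.
Code:
# Back-of-envelope throughput for different block size / interval choices.
AVG_TX_BYTES = 250      # assumed average transaction size

def tps(block_bytes: int, interval_seconds: int) -> float:
    return block_bytes / AVG_TX_BYTES / interval_seconds

print(f"1 MB every 10 min : ~{tps(1_000_000, 600):.0f} tx/s")   # ~7
print(f"8 MB every 10 min : ~{tps(8_000_000, 600):.0f} tx/s")   # ~53
print(f"1 MB every 2.5 min: ~{tps(1_000_000, 150):.0f} tx/s")   # ~27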

legendary
Activity: 1610
Merit: 1183
August 09, 2019, 02:04:20 PM
#12
You can't magically keep compressing random data "as needed" (what is "as needed"? who gets to say when fees are too expensive anyway?) and you can't keep removing "Unnecessary stuff" (segwit) without ending up in an insecure clusterfuck.

You can be thankful segwit got in. The way I see it, we will never do much more on this front; further scaling is going to be on 2nd layers. I doubt big players are going to want to decrease security in exchange for smaller fees. What can I tell you? If you want the luxury of going on-chain, you have to pay for it.

PS: I'm not telling you to stop working on whatever you are working on, but think of the game theory involved to see if the implementation will ever get accepted.

legendary
Activity: 3472
Merit: 10611
August 09, 2019, 01:19:17 AM
#11
Arguing about how some random data/bytes can be compressed or not is meaningless. If you want to talk about compression of bitcoin transactions, then take a random block from the bitcoin blockchain, "compress" it and then "decompress" it, and see what happens. Then come report your results so we can discuss real cases.

Compression is possible; in fact, for each transaction you could find ways to compress it by somewhere around 10%, maybe more, when you store it on disk, but it won't "solve" on-chain scaling though!
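In that spirit, a minimal Python sketch for running the experiment yourself. It assumes you have already dumped one raw block as hex to a local file, for example with bitcoin-cli getblock <hash> 0 against your own node; the block.hex filename is just an assumption for illustration.
Code:
# Compress one raw Bitcoin block with zlib and report the ratio.
# Assumes the raw block hex was saved to "block.hex" beforehand
# (e.g. via bitcoin-cli getblock <hash> 0); adjust the path as needed.
import zlib

raw = bytes.fromhex(open("block.hex").read().strip())
packed = zlib.compress(raw, 9)

print(f"raw block : {len(raw):,} bytes")
print(f"zlib -9   : {len(packed):,} bytes ({len(packed) / len(raw):.1%} of original)")
print(f"roundtrip ok: {zlib.decompress(packed) == raw}")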
legendary
Activity: 3430
Merit: 3080
August 07, 2019, 12:31:12 PM
#10
There is a way to compress random data

But I can't fucking tell you about it because somebody wanted me to destroy my science paper.

But I can't fucking tell you about it even though it'll change everything for the better.

Then why bother talking about it if you keep it to yourself? You're wasting our time and potentially risking yourself

yes it's called "trolling", it happens on the internet
legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
August 07, 2019, 12:03:47 PM
#9
There is a way to compress random data

As long as it's not truly random data. But there's an upper limit, and I doubt the ratio between uncompressed and compressed random data is that significant.

But I can't fucking tell you about it because somebody wanted me to destroy my science paper.

But I can't fucking tell you about it even though it'll change everything for the better.

Then why bother talking about it if you keep it to yourself? You're wasting our time and potentially risking yourself
legendary
Activity: 2464
Merit: 3158
August 07, 2019, 06:12:16 AM
#8
Simple counting shows that at most a fraction of 2^-8 (about 0.4%) of all files can be compressed by 1 or more bytes.

And at most 2^-32, or about 0.000000023%, of all files can be compressed by 4 or more bytes.

This is why we say that random data is incompressible.

Yes.
I don't think we can compress the data that ends up in the blockchain any further.
I truly believe that Bitcoin took another path. I don't think it will ever scale the on-chain side.

The on-chain scaling issue was "solved" by BCH and BSV ...
So we know this is not the way to go.

The attempt most likely to succeed at scaling Bitcoin will rely on 2nd (and why not 3rd?) layer protocols.
legendary
Activity: 990
Merit: 1108
August 07, 2019, 03:04:15 AM
#7
Simple counting shows that at most a fraction of 2^-8 (about 0.4%) of all files can be compressed by 1 or more bytes.

And at most 2^-32, or about 0.000000023%, of all files can be compressed by 4 or more bytes.

This is why we say that random data is incompressible.
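For anyone who wants to check the counting, here is a small Python sketch of the pigeonhole argument: a lossless compressor must map distinct files to distinct outputs, so the number of shorter outputs available bounds the fraction of n-byte files that can shrink by k or more bytes. The n = 20 below is just an arbitrary example length.
Code:
# Pigeonhole bound: fraction of n-byte files that could possibly be
# losslessly compressed by at least k bytes.
def compressible_fraction(n: int, k: int) -> float:
    shorter_outputs = sum(256 ** m for m in range(n - k + 1))  # lengths 0..n-k
    return shorter_outputs / 256 ** n

print(compressible_fraction(20, 1))   # ~0.0039   (about 2**-8)
print(compressible_fraction(20, 4))   # ~2.3e-10  (about 2**-32)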
legendary
Activity: 1624
Merit: 2481
August 07, 2019, 02:45:31 AM
#6
There is a way to compress random data

No, there isn't.
There is not a single lossless compression algorithm which could efficiently compress random data.

I mean... of course you could save a few bits or bytes (depending on the size of the file / data), because some repeated sequences might appear in the random data (it is random after all; they might appear, they might not).
But that's not really compression.


For example, I created a compressed archive of 10 MB of random data (a gzipped tar, despite the .zip filename):

Create 10MB of random data:
Code:
dd if=/dev/urandom of=random.data bs=1M count=10

Create the compressed archive:
Code:
tar czf random.data.zip random.data

Verify the filetype:
Code:
$ file random.data
random.data: data

$ file random.data.zip
random.data.zip: gzip compressed data, last modified: Wed Aug  7 07:40:31 2019, from Unix

Now, compare the size of both files:
Code:
$ du -s random.data
10240   random.data

$ du -s random.data.zip
10244   random.data.zip


As you can see, the archived ("compressed") file is about 4 KiB larger than the original one (du reports sizes in 1 KiB blocks), because it contains additional information (the tar and gzip headers and framing).


You simply cannot compress random data losslessly.
It doesn't matter which algorithm you use; random data cannot be compressed.
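The tar/gzip container adds a few kilobytes of framing of its own, which slightly muddies the comparison. A minimal Python equivalent that feeds the random bytes straight into zlib shows the same outcome without any container overhead:
Code:
# Compress 10 MiB of random bytes directly with zlib: the "compressed"
# output comes out slightly larger, because deflate falls back to stored
# blocks (plus its own header/trailer) on incompressible input.
import os
import zlib

data = os.urandom(10 * 1024 * 1024)
packed = zlib.compress(data, 9)

print(f"original : {len(data):,} bytes")
print(f"zlib -9  : {len(packed):,} bytes ({len(packed) - len(data):+,} bytes)")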
hero member
Activity: 1778
Merit: 764
www.V.systems
August 06, 2019, 03:44:39 PM
#5
There is a way to compress random data using systems of sequences, programs of life, colour, art, magnetism and science and pretty much any 2d game you can think of, and in future 3d and immersion.

But I can't fucking tell you about it because somebody wanted me to destroy my science paper.

You just have to start thinking about 0's and 1's as more than just code, but as people and potentials that change and adapt. Then you can program strategies on the data itself and combine them to create a balance between computation cost and storage cost.

But I can't fucking tell you about it even though it'll change everything for the better.

I hate it when people say things like "I can't tell you, but it'll change the world." To me that sounds shady, and makes me think of you (not you personally, speaking in general) as an attention-seeking slimeball.

Typically there are two reasons why people can't share something: either to maintain confidentiality, or because they don't quite understand what they're talking about.

So which category do you belong to?

Clearly it can't be confidentiality, because then you would not even have admitted to knowing something in the first place.
hero member
Activity: 1778
Merit: 764
www.V.systems
August 06, 2019, 02:00:32 PM
#4
Potential to me means the ability to surpass the expectations placed on you by someone else.

That's more in the realm of philosophy than tech talk. I think the magic will happen with public IOT decentralisation.

Imagine a world where thousands and millions of small battery-powered IoT processing devices are installed on streets, highways, buildings etc.; user devices would just need to be powerful enough to stream the data that is being processed in real time in close vicinity of the user.

That is something that I think could push not only on-chain scaling to the next level, but also other aspects of society.
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 06, 2019, 08:53:17 AM
#3
It's also worth remembering that after compressing something, it needs to be decompressed before it can be used again, adding processing time and increasing the demands on hardware.  There are definitely limits to what can be achieved in this direction.  If it were that simple, we'd already be doing it.