
Topic: Modular Scaling (Read 209 times)

legendary
Activity: 3122
Merit: 2178
March 18, 2019, 05:38:04 AM
#4
While modular arithmetic sees a lot of use in cryptography, I don't think it can be meaningfully applied to compression, as information is lost and reversal is not possible. In this regard you could compare it to a cryptographic hash function (or any hash function): while the output gets reduced to, say, 256 bits in the case of SHA256 regardless of the size of the input, you can only use it for validation (as is the case with Bitcoin), not for compression, because multiple inputs will by necessity lead to the same output, even though the chance of finding a collision is currently infinitesimal.
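A few lines of Python make the many-to-one nature of both operations concrete (toy values, nothing cryptographically meaningful):

```python
import hashlib

# Modular reduction is many-to-one: inputs that differ by a multiple
# of the modulus map to the same residue, so the original value
# cannot be recovered from the output alone.
n = 2**16
for x in (42, 42 + n, 42 + 5 * n):
    print(x, "->", x % n)  # all three map to 42

# A hash function behaves the same way at a much larger scale: SHA256
# maps inputs of any length to 256 bits, so collisions must exist,
# even though actually finding one is computationally infeasible.
digest = hashlib.sha256(b"an arbitrarily long input").hexdigest()
print(len(digest) * 4, "bits of output, regardless of input size")
```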

In terms of compression in general, the information always has to go somewhere, so given highly random data (as pointed out by mda), any amount of information that is taken away from the data to be compressed moves into the function required to decompress it (as pointed out by odolvlobo). That is, the file size of compressed high-entropy data plus the compression function can never be less than the uncompressed high-entropy data that serves as its input [1]. While Bitcoin transactions have elements that can be compressed fairly well, AFAIK those are already mostly accounted for (i.e. compressed in one way or another). What isn't and can't be compressed is the signature data (i.e. the witness data that has been segregated as part of... well, segregated witness, aka SegWit). However, that's where e.g. Schnorr signatures come into play, which stand to significantly reduce the size of aggregated signatures [2] (e.g. when multiple small inputs are aggregated into a single large output; it is even imaginable to aggregate the signatures of multiple, unrelated users).
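The counting argument behind [1] fits in a few lines (a sketch; the bit length is arbitrary):

```python
# Pigeonhole argument: there are 2**n bit strings of length n, but
# only 2**n - 1 strings strictly shorter than n. A lossless compressor
# that shrank *every* n-bit input would map 2**n inputs onto fewer
# than 2**n outputs, forcing a collision -- and a collision means
# decompression can no longer be unambiguous.
n = 16
inputs = 2**n                          # 65536 possible 16-bit strings
shorter = sum(2**k for k in range(n))  # 65535 strings of length 0..15
print(inputs, ">", shorter)            # at least one input can't shrink
```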

That being said, have a look at MimbleWimble [3]. I think MimbleWimble's cut-through method is pretty much what you are looking for. Essentially it reduces block sizes by getting rid of intermediary transactions once they have become obsolete (e.g. Alice -> Bob -> Charlie gets reduced to Alice -> Charlie), gaining privacy without sacrificing security. However, it takes a fundamentally different approach to transactions and blocks than Bitcoin, so you won't see it beyond altcoin implementations (currently Grin and Beam).
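A toy sketch of the cut-through idea (this is not how MimbleWimble actually represents transactions -- the real protocol uses Pedersen commitments and kernels, not names -- but it shows the reduction):

```python
# Each transaction is (inputs, outputs). Any output that is spent as
# an input by a later transaction in the same block is an intermediary
# and can be removed from both sides without changing the net result.
txs = [({"Alice"}, {"Bob"}), ({"Bob"}, {"Charlie"})]

all_inputs = set().union(*(i for i, _ in txs))
all_outputs = set().union(*(o for _, o in txs))
intermediaries = all_inputs & all_outputs  # {"Bob"}

print(all_inputs - intermediaries, "->", all_outputs - intermediaries)
# {'Alice'} -> {'Charlie'}
```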


[1] https://marknelson.us/posts/2006/06/20/million-digit-challenge.html
[2] https://medium.com/@SDWouters/why-schnorr-signatures-will-help-solve-2-of-bitcoins-biggest-problems-today-9b7718e7861c
[3] https://github.com/mimblewimble/grin/blob/master/doc/intro.md
legendary
Activity: 4466
Merit: 3391
March 17, 2019, 11:40:59 PM
#3
Are you asking whether the entire blockchain could be stored in a single bit if only the right function could be found? The answer is yes, but then the size of the function becomes the issue.
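To put the same point in code (a sketch with placeholder data): if the "function" has to reproduce the data exactly, the bytes just move into the function itself.

```python
# The information has to live somewhere: a "decompressor" that turns
# a single bit back into the data must embed the data in its own
# source, so function size + one bit >= original size.
data = b"pretend this is the whole blockchain"

decompressor_source = f"def decompress(bit):\n    return {data!r}\n"
print(len(data), "bytes of data")
print(len(decompressor_source), "bytes of 'function'")  # larger than the data
```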
mda
member
Activity: 144
Merit: 13
March 17, 2019, 06:35:59 AM
#2
Blockchain data has a high degree of randomness. Therefore any kind of compression, yours included, will yield an insignificant improvement.
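This is easy to check empirically (a sketch; the sizes are arbitrary and the exact outputs will vary from run to run):

```python
import os
import zlib

random_data = os.urandom(1 << 16)  # high entropy, like signature data
structured = b"abc" * (1 << 14)    # low entropy, highly repetitive

print(len(zlib.compress(random_data)))  # slightly LARGER than the 65536-byte input
print(len(zlib.compress(structured)))   # collapses to well under 1% of its size
```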
newbie
Activity: 7
Merit: 14
March 17, 2019, 04:56:09 AM
#1
Excuse any ignorance I may have, and please consider that I am attempting to use Cunningham's Law here: posting the wrong answer and waiting for someone to correct me.

Could modular exponentiation, some form of modular math (a concept that I'm new to), or some other mathematical computation be used to infinitely scale Bitcoin?

In this case, a block wouldn't be an aggregate of the block data but a mathematical computation that (when solved) reveals the data of the block, similar to a trapdoor function. The data requirement for a single function is much smaller than the aggregate block data (especially as it grows), and the computation required is "simple" when the trapdoor solution is available. So a theoretically infinite number of transactions could exist in a single computation. The difficulty in finding this trapdoor might be prohibitive, but could it be reasonable?
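A rough sketch of the idea and the problem it runs into (toy parameters; real systems would use much larger, carefully chosen primes):

```python
# Modular exponentiation is trapdoor-like: computing g**x mod p is
# cheap, while recovering x from the result (the discrete logarithm)
# is believed to be hard for large p.
p = 2**127 - 1         # a Mersenne prime, small enough for a demo
g = 3
secret = 123456789     # stand-in for "the block data"

y = pow(g, secret, p)  # fast, even for huge exponents
print(y)

# The catch for scaling: y is at most 127 bits, so it can distinguish
# at most 2**127 different secrets. Block data longer than that must
# collide with some other data, so the computation can only "reveal"
# as much information as its own size allows.
```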
