
Topic: [XMR] Monero Improvement Technical Discussion - page 5. (Read 14760 times)

legendary
Activity: 1596
Merit: 1030
Sine secretum non libertas
DRM has nothing to do with it at all. Thus I assume you don't understand the issue.

You are not giving him due credit. (AM is not a typical BTCT slouch.) It is an allusion to "Reflections on Trusting Trust": https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf
sr. member
Activity: 420
Merit: 262
One more point I considered in my holistic analysis is that for most transactions we can't be anonymous. Thus anonymity is more suited to those who want to receive some payment anonymously, park the funds there, and extract them into public funds only in small morsels, or spend them in other rare anonymous transactions (e.g. buying some gold bars from someone you trust not to reveal your identity).

In that case one might think you can just use Stealth Addresses (unlinkability) and run a full node to confirm receipt of funds anonymously. No need for Cryptonote, RingCT, or Zerocash. But the problem is that the payer can be identified and pressured to reveal your identity.

So this is why we need Zerocash to make the untraceability impervious to meta-data correlation.

But the problem with my proposal for ephemeral Zerocash mixers is that when we take the coins out of the mixer, they can again be correlated with our meta-data (e.g. IP address, etc.). So it seems that hiding large funds and taking out only small portions publicly as needed will, under my proposal, incur the risk of losing those coins, but at least they will be provably anonymous.

Anonymity is a clusterfuck. If we can't make trusted hardware, then anonymity is unprovable. Period.

So just give up on anonymity, or get busy trying to make hardware we can trust?

(or if Zerocash has developed a provably secure way to generate a master public key, which I doubt)
sr. member
Activity: 420
Merit: 262
Quote
But I am confident these physics issues can be worked out to a sufficient level of trust.

We only need to confirm that the private key was not communicated from the computer to anyone.

I find this kinda weak against your general absolutism. "So Simple Yet So Complex".


After all, what stops all the three-letter agencies, who can own blockchains and can do analysis and attacks etc., from staging the whole thing? Will I be allowed to check that computer?

I mean, I have near-zero understanding of cryptography, but your search for the perfect/ideal solution looks like it is making you ready to take a huge and dangerous bet.

I proposed ephemeral mixers based on Zerocash technology. They will be ferreted out if they are doing this, because it will be known that the key was compromised when the mixer expires and everyone has to cash out of the mixer back into the public coin. The bastards can't keep doing it over and over again. The participants will get wise as to the methods the attackers are using.

I am not absolutist. Rather I think correctly and realistically when I weigh marketing, tradeoffs, and delusion as follows:

That will kick ass on Monero, because if I pass through the mixer, I know my anonymity is provable and I know I didn't lose my coins. It is only people who are still sitting inside the mixer who risk losing coins. Everything has a risk. I would much rather take the microscopic risk of a compromised key (causing me to lose some coins) than the sure risk of meta-data correlation in Monero, which can send me to jail! Surely I would be judicious about not mixing all my coins at the same time and not all in the same mixer.

Marketing and design are holistically joined at the hip. Those fools who said the marketing can come later are clueless.
legendary
Activity: 1428
Merit: 1001
getmonero.org
Quote
But I am confident these physics issues can be worked out to a sufficient level of trust.

We only need to confirm that the private key was not communicated from the computer to anyone.

I find this kinda weak against your general absolutism. "So Simple Yet So Complex".


After all, what stops all the three-letter agencies, who can own blockchains and can do analysis and attacks etc., from staging the whole thing? Will I be allowed to check that computer?

I mean, I have near-zero understanding of cryptography, but your search for the perfect/ideal solution looks like it is making you ready to take a huge and dangerous bet.
sr. member
Activity: 420
Merit: 262
...
FUD. The ceremony is only to compute a public key, nothing else. No other software has to be audited. We only need to confirm that the private key was not communicated from the computer to anyone. Period.

How do you know that the public key you see on the screen is the one that was computed, and not one that was pre-computed before the computer was "placed in lead"?

Edit: DRM in the OS has everything to do with this, since it is the perfect place to hide the private key. That is what DRM is designed to do: hide private keys.

The hardware has to be audited. But we also have to audit the hardware that we use to run Cryptonote. If Intel is planting spies in the hardware, then we are screwed.

100% trust is impossible. And this is another reason I deprioritized anonymity. It is a clusterfuck.

Also I think perhaps Zerocash was working on a way to generate the public key in a decentralized manner, but I haven't kept up with progress on that.

Indeed Zerocash could end up being a Trojan Horse (a way to get fiat in the back door), and that is why I made my proposal to use it only for ephemeral mixers that die periodically, so we will know whether the key was compromised or not.

The result of my proposal is:

  • Stolen coins aren't systemic to the overall coin (just as losing some coins to Mt. Gox or Cryptsy isn't), and at least participants get ongoing ceremonies so they can get better and better at auditing the hardware.
  • No anonymity is ever lost.
  • No NET coin supply is ever created out of thin air (instead some people lose coins if they chose an insecure mixer that had a compromised key), unlike both Zerocash and RingCT, where coin supply could be created out of thin air and we would never know it due to a bug in the cryptography.

That will kick ass on Monero, because if I pass through the mixer, I know my anonymity is provable and I know I didn't lose my coins. It is only people who are still sitting inside the mixer who risk losing coins. Everything has a risk. I would much rather take the microscopic risk of a compromised key (causing me to lose some coins) than the sure risk of meta-data correlation in Monero, which can send me to jail! Surely I would be judicious about not mixing all my coins at the same time and not all in the same mixer.
legendary
Activity: 1260
Merit: 1008
Damn, saw this thread bumped and got excited that it was in response to my fusion block idea. Instead it's this zero-knowledge vaporcoin stuff. You'd better bring it all back somehow to MONERO improvement technical discussion lest I wield my moderation powers and shrinkify everything.

Is Monero implementing ZKP? Last I heard that's a big negative.

This one:

Quote
Note that Monero (Cryptonote one-time rings and every other kind of anonymity technology) also has systemic risk due to combinatorial analysis cascade as more and more users are unmasked with meta-data and overlapping mixes.

might have some legs. As you mentioned, I think the meta-data (what can be referred to as out-of-band data) can't really be addressed by any protocol. No computer code can stop you from posting on Facebook the exact time that you purchased a drone on Amazon. I think the general idea, though, is that with Monero (and others) any analysis faces a much steeper effort wall than with Bitcoin.
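
To make the "combinatorial analysis cascade" idea concrete, here is a toy Monte Carlo sketch in Python. It assumes every output is spent exactly once in a ring whose decoys are picked uniformly (not Monero's real decoy selection) and that some fraction of spends is attributed out-of-band via meta-data; it then propagates the elimination of known-spent ring members. All parameters are illustrative only.

Code:
import random

def simulate(num_outputs=20000, ring_size=5, seed_frac=0.05, seed=1):
    # Toy model of the "cascade" argument. Assumptions (NOT Monero's real
    # decoy selection): every output is spent exactly once, in its own ring;
    # decoys are drawn uniformly; a fraction of spends is attributed out-of-band.
    rng = random.Random(seed)
    rings = []
    for real in range(num_outputs):
        decoys = rng.sample(range(num_outputs), ring_size - 1)
        rings.append((real, {real} | set(decoys)))

    # Spends already attributed via meta-data (IP logs, exchange KYC, ...).
    attributed = set(rng.sample(range(num_outputs), int(num_outputs * seed_frac)))

    changed = True
    while changed:
        changed = False
        for real, ring in rings:
            if real in attributed:
                continue
            # Members known to be really spent in some *other* ring cannot be
            # the real spend here, so they shrink the effective ring.
            candidates = ring - attributed
            if len(candidates) == 1:      # only one possibility left: unmasked
                attributed.add(real)
                changed = True
    return len(attributed) / num_outputs

for rs in (1, 3, 5, 11):
    print(f"ring size {rs:2d}: {simulate(ring_size=rs):.1%} of spends attributed")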
sr. member
Activity: 420
Merit: 262
...

No closed source. The key would be produced publicly at a ceremony.
...

Using what operating system and firmware?

Of course they will need to convince the public the master key is sound. Or use my idea above of having multiple mixers and timing them out. I believe there is a solution, yet I will agree the current organization of their plans seems legally and structurally flawed.

That is why I say we can transition and beat them. But the technology is real anonymity. If you want real anonymity, you have to find a way to use their technology. Period. (and I have been studying this for a long time)

This does not answer my question, which is cut and dried and goes to the heart of the trust issue.

If you apply that line of thinking, then every anonymity scheme is insecure because operating systems and computers are never 100% secure.

I already proposed how to spread the risk out and make it non-systemic.

Note that Monero (Cryptonote one-time rings and every other kind of anonymity technology) also has systemic risk due to combinatorial analysis cascade as more and more users are unmasked with meta-data and overlapping mixes.

Proprietary software solutions have, by their very nature, a centralized systemic risk that Free Libre Open Source software solutions do not. The type of risks you describe in Monero are trivial compared to the risk of DRM in the operating system used to generate the master key in a centralized proprietary solution such as the one you propose. Furthermore, I still do not have an answer to what is a straightforward yes-or-no question.

The master key is generated once and only the public key is retained. As long as no one saw, nor can recover, the private key before it was discarded, there is nothing proprietary remaining in the use of the Zerocash open source. The Zerocash open source code requires a public key to be pasted in. It is the public (ceremony) generation of that key which determines whether anyone had access to the private key when the public key was created.

DRM has nothing to do with it at all. Thus I assume you don't understand the issue.

The only issue is whether the public key can be computed at a public ceremony with the private key securely discarded. So for example, they could use any computer, encase it in lead before running the computation, and allow no external connection to the computer other than the screen which reads out the public key.

Then slide the computer into a barrel of acid so that it is permanently destroyed. All done at a public ceremony so there can be no cheating.

Of course one could envision elaborate/exotic means of cheating, such as using radio waves to communicate the private key out to an external actor, but again that is why I wrote "encase it in lead". There is the issue of how to destroy it without momentarily removing it from its communication barrier. But I am confident these physics issues can be worked out to a sufficient level of trust.

As for trust, not even the elliptic curve cryptography and other math we use for crypto can be 100% trusted. So if you start making silly arguments about 100% trust, then it is safe to ignore them as loony.
legendary
Activity: 2282
Merit: 1050
Monero Core Team
...
I am imagining that the type of people designing such a technology would do better than generate a masterkey on Windows et al. I'm actually imagining purpose-built, auditable software and maybe even hardware.

Auditable by whom?

It comes down to Free Software vs Proprietary software. The same is true for the hardware. There is a reason why my question is being avoided here.

By the attendees of said masterkey-generation ceremony.

Actually, by anyone who uses the currency. The role of the attendees is to verify that the software has not changed between what was used at the ceremony and what is released to the public.

Edit: The minute one tries to protect "intellectual property" at any level, the trust is gone.
legendary
Activity: 1834
Merit: 1019
...
I am imagining that the type of people designing such a technology would do better than generate a masterkey on Windows et al. I'm actually imagining purpose-built, auditable software and maybe even hardware.

Auditable by whom?

It comes down to Free Software vs Proprietary software. The same is true for the hardware. There is a reason why my question is being avoided here.

By the attendees of said masterkey-generation ceremony.
legendary
Activity: 2282
Merit: 1050
Monero Core Team
...
I am imagining that the type of people designing such a technology would do better than generate a masterkey on Windows et al. I'm actually imagining purpose-built, auditable software and maybe even hardware.

Auditable by whom?

It comes down to Free Software vs Proprietary software. The same is true for the hardware. There is a reason why my question is being avoided here.
legendary
Activity: 1834
Merit: 1019
...

No closed source. The key would be produced publicly at a ceremony.
...

Using what operating system and firmware?

Of course they will need to convince the public the master key is sound. Or use my idea above of having multiple mixers and timing them out. I believe there is a solution, yet I will agree the current organization of their plans seems legally and structurally flawed.

That is why I say we can transition and beat them. But the technology is real anonymity. If you want real anonymity, you have to find a way to use their technology. Period. (and I have been studying this for a long time)

This does not answer my question, which is cut and dried and goes to the heart of the trust issue.

If you apply that line of thinking, then every anonymity scheme is insecure because operating systems and computers are never 100% secure.

I already proposed how to spread the risk out and make it non-systemic.

Note that Monero (Cryptonote one-time rings and every other kind of anonymity technology) also has systemic risk due to combinatorial analysis cascade as more and more users are unmasked with meta-data and overlapping mixes.

Proprietary software solutions have, by their very nature, a centralized systemic risk that Free Libre Open Source software solutions do not. The type of risks you describe in Monero are trivial compared to the risk of DRM in the operating system used to generate the master key in a centralized proprietary solution such as the one you propose. Furthermore, I still do not have an answer to what is a straightforward yes-or-no question.

I am imagining that the type of people designing such a technology would do better than generate a masterkey on Windows et al. I'm actually imagining purpose-built, (encouragedly) auditable software and maybe even hardware.
legendary
Activity: 2282
Merit: 1050
Monero Core Team
...

No closed source. The key would be produced publicly at a ceremony.
...

Using what operating system and firmware?

Of course they will need to convince the public the master key is sound. Or use my idea above of having multiple mixers and timing them out. I believe there is a solution, yet I will agree the current organization of their plans seems legally and structurally flawed.

That is why I say we can transition and beat them. But the technology is real anonymity. If you want real anonymity, you have to find a way to use their technology. Period. (and I have been studying this for a long time)

This does not answer my question, which is cut and dried and goes to the heart of the trust issue.

If you apply that line of thinking, then every anonymity scheme is insecure because operating systems and computers are never 100% secure.

I already proposed how to spread the risk out and make it non-systemic.

Note that Monero (Cryptonote one-time rings and every other kind of anonymity technology) also has systemic risk due to combinatorial analysis cascade as more and more users are unmasked with meta-data and overlapping mixes.

Proprietary software solutions have, by their very nature, a centralized systemic risk that Free Libre Open Source software solutions do not. The type of risks you describe in Monero are trivial compared to the risk of DRM in the operating system used to generate the master key in a centralized proprietary solution such as the one you propose. Furthermore, I still do not have an answer to what is a straightforward yes-or-no question.
sr. member
Activity: 420
Merit: 262
...

No closed source. The key would be produced publicly at a ceremony.
...

Using what operating system and firmware?

Of course they will need to convince the public the master key is sound. Or use my idea above of having multiple mixers and timing them out. I believe there is a solution, yet I will agree the current organization of their plans seems legally and structurally flawed.

That is why I say we can transition and beat them. But the technology is real anonymity. If you want real anonymity, you have to find a way to use their technology. Period. (and I have been studying this for a long time)

This does not answer my question, which is cut and dried and goes to the heart of the trust issue.

If you apply that line of thinking, then every anonymity scheme is insecure because operating systems and computers are never 100% secure.

I already proposed how to spread the risk out and make it non-systemic.

Note that Monero (Cryptonote one-time rings and every other kind of anonymity technology) also has systemic risk due to combinatorial analysis cascade as more and more users are unmasked with meta-data and overlapping mixes.
legendary
Activity: 1260
Merit: 1008
So I was reading this
http://www.scribd.com/doc/273443462/A-Transaction-Fee-Market-Exists-Without-a-Block-Size-Limit#scribd

and my thoughts started to drift when I encountered the concept that orphaning is one of the impediments to picking what to mine, and the whole block-size fee-market debate, etc...

Is there any work in this space regarding what could be called sister blocks, or fusion blocks?

Basically, the way I understand it (and granted, my assumptions could be flawed) is that there exists a set of transactions in the mempool. We'll just use 5 here

Trans1
Trans2
Trans3
Trans4
Trans5

If miner A decides to put 1,2,3 in his block (block A), and miner B decides to put 3,4,5 in his block (block B), they are both technically valid blocks (they both have the previous block's hash and contain valid transactions from the mempool). However, due to the nature of Satoshi consensus, if block A makes it into the chain first, block B becomes an orphan - even though it is entirely valid.

It's even easier to understand the inefficiency of Satoshi consensus if block A has 1,2,3 and block B has 4,5. In this case, there's really no reason both blocks couldn't be kept.

I see now, as I continue to think about this, that part of the problem lies in the transaction fees associated with each transaction: if a transaction exists in two blocks, which block finder gets the fee? But this isn't an intractable problem.

Essentially what I'm thinking is that you can imagine these two blocks existing as blebs on the chain.
                         .
._._._._._._._._._./
                        \,

each dot is a block, and the comma indicates a sister block
in current protocol, this would happen
                         ._._._
._._._._._._._._._./
                        \,_,

And eventually one chain would grow longer (which is ultimately influenced by bandwidth) and the entire sister chain would be dropped, and if your node was on that chain you'd experience a reorg (right?).

why couldn't something be implemented where the above fork turns into a bleb

                         .
._._._._._._._._._./\.
                        \,/

which is eventually resolved to a fusion block

._._._._._._._._._._!_._


where the ! indicates a fusion block. When encountering a potential orphan scenario (the daemon receives two blocks in close proximity, or has already added a block but then receives a similar block for the same block height), instead of the daemon rejecting one as an orphan, it scans the sister block as a candidate for fusion. There would be some parameters (X% of transactions overlap; only blocks at the same height are candidates - this is effectively the time window). As part of this, the system would somehow need to be able to send transaction fees to different block finders, but again this seems tractable (though I await being schooled as to why it's not possible). In addition, the block reward itself would need to be apportioned.

Or is this what a reorg does? The way I understand reorgs, this is different from a reorg.

Though upon creation of the fusion block, a reorganization would have to occur. So at the cost of overall bandwidth, we provide a countermeasure for the loss of economic incentive for large blocks.

And one problem to address is that you would need a new block header for the fusion block, but this could really just be the hash of the two sister blocks. Both sisters are valid, therefore the hash of those valid blocks is valid.
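
To make the mechanics concrete, here is a rough Python sketch of a fusion-candidate check and one possible fee/reward split. The overlap rule, the 50/50 split of shared fees and subsidy, and all names are invented for illustration; the post leaves these open, and nothing here reflects existing Monero or Bitcoin code.

Code:
import hashlib

MAX_OVERLAP = 0.5   # illustrative stand-in for the "X% of transactions overlap" parameter

def is_fusion_candidate(block_a, block_b):
    """Decide whether a same-height sister block should be fused rather than orphaned."""
    if block_a["prev_hash"] != block_b["prev_hash"]:
        return False                                  # must extend the same parent
    if block_a["height"] != block_b["height"]:
        return False                                  # only concurrent heights qualify
    txs_a, txs_b = set(block_a["txids"]), set(block_b["txids"])
    overlap = len(txs_a & txs_b) / max(len(txs_a | txs_b), 1)
    return overlap <= MAX_OVERLAP                     # fusing must add enough new txs

def fuse(block_a, block_b, fees, subsidy=0.6):
    """Build a fusion block: union of the transactions, header = hash of both sister hashes.

    Invented apportionment rule: each miner keeps the fees of the transactions
    only they included; fees of shared transactions and the subsidy are split 50/50.
    """
    txs_a, txs_b = set(block_a["txids"]), set(block_b["txids"])
    shared = txs_a & txs_b
    half = (sum(fees[t] for t in shared) + subsidy) / 2
    rewards = {block_a["miner"]: sum(fees[t] for t in txs_a - txs_b) + half,
               block_b["miner"]: sum(fees[t] for t in txs_b - txs_a) + half}
    header = hashlib.sha256((block_a["hash"] + block_b["hash"]).encode()).hexdigest()
    return {"height": block_a["height"], "prev_hash": block_a["prev_hash"],
            "hash": header, "txids": sorted(txs_a | txs_b), "rewards": rewards}

# The example from the post: block A carries t1,t2,t3 and block B carries t3,t4,t5.
a = {"height": 100, "prev_hash": "p", "hash": "a1", "miner": "A", "txids": ["t1", "t2", "t3"]}
b = {"height": 100, "prev_hash": "p", "hash": "b1", "miner": "B", "txids": ["t3", "t4", "t5"]}
fees = {t: 0.01 for t in ("t1", "t2", "t3", "t4", "t5")}
if is_fusion_candidate(a, b):
    print(fuse(a, b, fees))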

Ok back to work.
legendary
Activity: 2968
Merit: 1198
moneromoo, can you add this functionality, or does it already exist?

I noticed here that this dude wants to check his paper wallet balance with a viewkey

https://www.reddit.com/r/Monero/comments/3v631e/how_do_i_use_my_view_key_to_view_my_balance/

so obviously he can do that.

But what if they want to check the balance in an offline computer (for increased security or whatever)

can simplewallet access the blockchain without the daemon being synced? I've noticed that simplewallet gets mad when the daemon isn't synced. Surely it can just access the blockchain db.

Seems like the wallet should still work even if the computer is offline, but maybe with a warning if the highest block is "too old".
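
For background on what a view-key balance check actually computes: the wallet re-derives each output's one-time key from the transaction public key and its private view key, and sums the outputs that match its public spend key. Below is a toy structural sketch in Python; it substitutes small-integer modular arithmetic for ed25519 and SHA-256 for Keccak, so it only shows the shape of the scan, not real Monero cryptography.

Code:
import hashlib, random

# Toy group: integers modulo a prime, generator G (stand-ins for ed25519 points).
P_MOD = 2**61 - 1          # a Mersenne prime; no cryptographic meaning here
G = 3

def h(*parts):
    # Stand-in for Monero's hash_to_scalar (Keccak in the real protocol).
    data = "|".join(str(p) for p in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % (P_MOD - 1)

def keypair(rng):
    sk = rng.randrange(2, P_MOD - 1)
    return sk, pow(G, sk, P_MOD)

rng = random.Random(7)
a, A = keypair(rng)        # private / public VIEW key
b, B = keypair(rng)        # private / public SPEND key (b itself is never needed to scan)

def send_output(amount, index, rng):
    # Sender derives a one-time output key from the recipient's public keys (A, B).
    r = rng.randrange(2, P_MOD - 1)
    R = pow(G, r, P_MOD)                        # tx public key, published with the tx
    shared = pow(A, r, P_MOD)                   # sender's side of the shared secret
    one_time = (pow(G, h(shared, index), P_MOD) * B) % P_MOD
    return {"R": R, "index": index, "key": one_time, "amount": amount}

def scan_balance(outputs, view_sk, spend_pub):
    # Watch-only scan: needs only the private view key and the public spend key.
    balance = 0
    for out in outputs:
        shared = pow(out["R"], view_sk, P_MOD)  # recipient's side: same shared secret
        expected = (pow(G, h(shared, out["index"]), P_MOD) * spend_pub) % P_MOD
        if expected == out["key"]:
            balance += out["amount"]
    return balance

chain = [send_output(5, 0, rng), send_output(7, 1, rng),
         {"R": 1234567, "index": 0, "key": 42, "amount": 99}]   # someone else's output
print(scan_balance(chain, a, B))   # -> 12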

legendary
Activity: 1260
Merit: 1008
moneromoo, can you add this functionality, or does it already exist?

I noticed here that this dude wants to check his paper wallet balance with a viewkey

https://www.reddit.com/r/Monero/comments/3v631e/how_do_i_use_my_view_key_to_view_my_balance/

so obviously he can do that.

But what if they want to check the balance in an offline computer (for increased security or whatever)

can simplewallet access the blockchain without the daemon being synced? I've noticed that simplewallet gets mad when the daemon isn't synced. Surely it can just access the blockchain db.
legendary
Activity: 1276
Merit: 1001
As a general rule of thumb, a party cannot rely on another party to cooperate in doing something detrimental to the second party.

If the network wants to detect whether blocks A and B are found by Alice, Alice will make sure to generate two different fingerprints.

Besides, you wouldn't want to embed a fingerprint of the miner in a block for a currency that prides itself on being unlinkable.

Thermal noise is used as a random source, so I'm not even sure you could get any kind of fingerprint by reading off an audio device anyway. Not that you can even read off the hardware: my VMs certainly don't have an HW audio device.
legendary
Activity: 1260
Merit: 1008
Just an idea that burbled up in my mind. Posting for 2 reasons: if it exists, someone will come in and go "someone already thought of this"; or if it doesn't exist, someone will go "it won't work because X".

Integrate an Audio signal into the POW function (somehow)

Rationale: all personal computing devices have audio hardware. There might be a way to use the audio signal to create a fingerprint of the individual computer that will prohibit pooling.

I imagine it as such, though this could be totally off due to an incomplete understanding of POW / pooling protocols.

The mining algorithm starts attempting to find a solution for a block. Upon starting this function, an audio recording is initiated (recording a wav file). No microphone would be needed - no audio board will read true silence in its circuitry, so the wav recording will contain electronic noise. Each time a block solution is attempted, a hash of the wav file is included in the hash of the POW. So basically, the duration of the current attempt to find a block solution will equal the duration of the wav file. (That matters somehow... I can't figure out why.)
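
A minimal sketch of what folding a noise recording into the POW could look like, just to pin down the mechanics. The audio capture is stubbed with os.urandom (real capture would need a platform audio API), which also exposes the obvious weakness: the miner controls the bytes, so nothing forces them to come from real hardware.

Code:
import hashlib, os, struct

DIFFICULTY_TARGET = 2**240     # illustrative target: roughly 1 in 65536 hashes succeeds

def capture_noise(num_bytes=4096):
    # Stand-in for recording a short WAV of sound-card self-noise. A real
    # implementation would read from the audio hardware via a platform audio
    # API; os.urandom is used only so the sketch runs anywhere, and it already
    # illustrates the weakness: the miner chooses the bytes, so nothing proves
    # they came from real hardware.
    return os.urandom(num_bytes)

def mine(prev_hash: bytes, tx_root: bytes, max_nonce=1_000_000):
    for nonce in range(max_nonce):
        noise_digest = hashlib.sha256(capture_noise()).digest()
        header = prev_hash + tx_root + noise_digest + struct.pack("<Q", nonce)
        pow_hash = hashlib.sha256(header).digest()
        if int.from_bytes(pow_hash, "big") < DIFFICULTY_TARGET:
            # The block would have to carry the noise digest so that verifiers
            # can recompute pow_hash; the raw recording itself never needs to
            # leave the miner, which is part of the problem.
            return nonce, noise_digest.hex(), pow_hash.hex()
    return None

print(mine(b"\x00" * 32, b"\x11" * 32))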

Maybe? Perhaps?

legendary
Activity: 2968
Merit: 1198
September 16, 2015, 05:23:11 PM
#59
An algorithm with a much bigger memory footprint, like cuckoo, without a corresponding increase in verification time, might be sized for L4 though.

This focus on cache sizes may be unwarranted.

If you benchmark Cuckoo Cycle for increasing memory sizes,
you only see a small slowdown in access latency (~40%) when moving from
fitting in the 12MB/core on-chip cache to moving way beyond that.

Yup. I think I was the one who pointed that out to you.

My point in mentioning cache is that cache benefits from physical proximity. You can't put two things in the same space, so if you want to introduce more parallelism (horizontal or vertical) in the processing elements you will have to move the memory farther away, incurring a cost, certainly in latency and probably in power usage too. Okay the cost may not be that large, but it does exist, and becomes an obstacle to overcome before even breaking even.

Obviously things like more use of 3D in microelectronics will shift the numbers around but the principle of finite proximal space will remain.
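
The latency effect is easy to eyeball with a pointer-chasing microbenchmark: each load depends on the previous one, so the time per step tracks memory latency as the working set outgrows each cache level. A rough Python sketch follows; in CPython the interpreter overhead per step is large, so the jump past the last-level cache is compressed compared to a C version, but it is usually still visible.

Code:
import random, time

def chase_ns_per_access(num_elems, steps=2_000_000, seed=0):
    # Pointer chasing: follow a random cyclic permutation, so every load
    # depends on the previous one and cannot be prefetched or overlapped.
    rng = random.Random(seed)
    perm = list(range(num_elems))
    rng.shuffle(perm)
    next_idx = [0] * num_elems
    for i in range(num_elems):                 # stitch the permutation into one big cycle
        next_idx[perm[i]] = perm[(i + 1) % num_elems]
    i = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        i = next_idx[i]
    return (time.perf_counter() - t0) / steps * 1e9

# List slots alone span ~32 KB to ~64 MB (the boxed ints add more on top).
for n in (1 << 12, 1 << 16, 1 << 20, 1 << 23):
    print(f"{n:>10} elements: {chase_ns_per_access(n):6.1f} ns/access")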
legendary
Activity: 990
Merit: 1108
September 16, 2015, 04:39:04 PM
#58
An algorithm with a much bigger memory footprint, like cuckoo, without a corresponding increase in verification time, might be sized for L4 though.

This focus on cache sizes may be unwarranted.

If you benchmark Cuckoo Cycle for increasing memory sizes,
you only see a small slowdown in access latency (~40%) when moving from
fitting in the 12MB/core on-chip cache to moving way beyond that.