
Topic: [ANN] Spondoolies-Tech - carrier grade, data center ready mining rigs - page 34.

legendary
Activity: 2968
Merit: 1198
Yes and every time it is discussed it is a dead end because in order to succeed it needs exactly the sort of consensus algorithm that proof-of-work implements. There is no such thing as "proof-of-transmission" as a consensus algorithm that is sybil resistant, at least none that has been shown.
P2P-pool works and is Sybil-resistant. Its faster and easier blocks serve as a very good approximation of global convergence to a single second-level slower and harder block.

P2pool doesn't really "work" because its costs are out of proportion to its benefits. I used p2pool when I used to mine, so I'm well aware of both. You can't really justify using it unless you are trying to be altruistic.
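
For readers following along, here is a minimal sketch of the two-level idea being argued about: shares clear a much easier target and form their own fast chain, and the occasional share that also clears the real target is broadcast as a Bitcoin block. The targets, function names, and layout below are made up for illustration and are not P2Pool's actual wire format.
Code:
import hashlib

# Illustrative targets only: the share target is far easier than the block
# target, so shares arrive every ~30 seconds while real blocks stay rare.
SHARE_TARGET = 1 << 236
BLOCK_TARGET = 1 << 224

def pow_hash(header: bytes) -> int:
    """Double SHA-256 of the header, read as an integer (as Bitcoin does)."""
    return int.from_bytes(
        hashlib.sha256(hashlib.sha256(header).digest()).digest(), "big")

def classify(header: bytes) -> str:
    """The share chain accepts anything under SHARE_TARGET; the rare hash
    that also clears BLOCK_TARGET doubles as a full Bitcoin block."""
    h = pow_hash(header)
    if h < BLOCK_TARGET:
        return "block (and share)"
    if h < SHARE_TARGET:
        return "share only"
    return "neither"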

legendary
Activity: 2128
Merit: 1073
Yes and every time it is discussed it is a dead end because in order to succeed it needs exactly the sort of consensus algorithm that proof-of-work implements. There is no such thing as "proof-of-transmission" as a consensus algorithm that is sybil resistant, at least none that has been shown.
P2P-pool works and is Sybil-resistant. Its faster and easier blocks serve as a very good approximation of global convergence to a single second-level slower and harder block.

The obstacles are purely non-technical. Many people find it profitable not to look for solutions. That's pretty much most of the story. Edit: the rest of the story is more like religious/theological argumentation in https://en.wikipedia.org/wiki/Eschatology .
legendary
Activity: 2968
Merit: 1198
There is no "the mempool"

Relativity means there can never be an objective mempool.

I don't think this is really a problem though. Transaction fees are precisely a bribe to a miner to include the transaction in the block. The mechanism exists, so use it.
It would be relatively easy to approximate the existence of the global pending-transaction list. I hate the term "mempool" because it is just an example of terminal in-the-box thinking. "The mempool" is just a database stored in a really primitive way in memory, instead of in some more advanced way, like an https://en.wikipedia.org/wiki/In-memory_database .

The implementation would require folding the p2p-pool mining protocol into the main protocol, where it would serve as a sort of proof-of-transmission that is resistant to Sybil attacks. This has been discussed many times before.

Yes and every time it is discussed it is a dead end because in order to succeed it needs exactly the sort of consensus algorithm that proof-of-work implements. There is no such thing as "proof-of-transmission" as a consensus algorithm that is sybil resistant, at least none that has been shown.

legendary
Activity: 2128
Merit: 1073
There is no "the mempool"

Relativity means there can never be an objective mempool.

I don't think this is really a problem though. Transaction fees are precisely a bribe to a miner to include the transaction in the block. The mechanism exists, so use it.
It would be relatively easy to approximate the existence of the global pending-transaction list. I hate the term "mempool" because it is just an example of terminal in-the-box thinking. "The mempool" is just a database stored in a really primitive way in memory, instead of in some more advanced way, like an https://en.wikipedia.org/wiki/In-memory_database .

The implementation would require folding the p2p-pool mining protocol into the main protocol, where it would serve as a sort of proof-of-transmission that is resistant to Sybil attacks. This has been discussed many times before.

Anyway, that isn't going to be implemented in Bitcoin for political reasons, not for any technical/scientific/economic reasons.
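
To make the in-memory-database point concrete, here is a toy sketch using Python's sqlite3 ":memory:" mode. The schema and column names are invented purely for illustration and have nothing to do with any real node's storage.
Code:
import sqlite3, time

# A toy pending-transaction set kept in an in-memory SQL database instead of
# an ad-hoc in-RAM map. Schema and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pending_tx (
        txid     TEXT PRIMARY KEY,
        fee      INTEGER NOT NULL,   -- satoshis
        vsize    INTEGER NOT NULL,   -- virtual bytes
        received REAL    NOT NULL    -- unix timestamp
    )""")

def add_tx(txid, fee, vsize):
    conn.execute("INSERT OR IGNORE INTO pending_tx VALUES (?, ?, ?, ?)",
                 (txid, fee, vsize, time.time()))

def best_candidates(limit=10):
    # Queries such as "highest feerate first" come for free with a real DB.
    return conn.execute(
        "SELECT txid, fee * 1.0 / vsize AS feerate FROM pending_tx "
        "ORDER BY feerate DESC LIMIT ?", (limit,)).fetchall()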

legendary
Activity: 2968
Merit: 1198
any block that contains zero transactions be rejected by the network. This would not only help speed up transactions, but also give the badly coded/lazy/greedy pools a kick up the backside to sort their act out.

Sadly, they will just change from 0 tx blocks to 1 tx blocks (in addition to coinbase), where the single tx is not even from the mempool, but their own private spend that doesn't need checking for validity.

You can't force miners to add useful transactions. You can only provide incentives.

Then that also needs to be addressed. Blocks that contain txs that aren't from the mempool should also be rejected. There is always a way..... Wink

There is no "the mempool"

Relativity means there can never be an objective mempool.

I don't think this is really a problem though. Transaction fees are precisely a bribe to a miner to include the transaction in the block. The mechanism exists, so use it.
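
To illustrate "the mechanism exists, so use it", here is a minimal sketch of feerate-based selection, i.e. how fees actually buy block space. The field names and the size cap are illustrative, and the ancestor/package logic a real node applies is deliberately omitted.
Code:
def select_transactions(mempool, max_vbytes=1_000_000):
    """Greedy fill by feerate: the highest bribe per byte gets in first.
    `mempool` is a list of dicts with 'txid', 'fee' (sats) and 'vsize'
    (vbytes); max_vbytes is an illustrative block-space cap."""
    chosen, used = [], 0
    for tx in sorted(mempool, key=lambda t: t["fee"] / t["vsize"], reverse=True):
        if used + tx["vsize"] <= max_vbytes:
            chosen.append(tx["txid"])
            used += tx["vsize"]
    return chosen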

sr. member
Activity: 266
Merit: 250
any block that contains zero transactions be rejected by the network. This would not only help speed up transactions, but also give the badly coded/lazy/greedy pools a kick up the backside to sort their act out.

Sadly, they will just change from 0 tx blocks to 1 tx blocks (in addition to coinbase), where the single tx is not even from the mempool, but their own private spend that doesn't need checking for validity.

You can't force miners to add useful transactions. You can only provide incentives.

Then that also needs to be addressed. Blocks that contain txs that aren't from the mempool should also be rejected. There is always a way..... Wink
legendary
Activity: 990
Merit: 1108
any block that contains zero transactions be rejected by the network. This would not only help speed up transactions, but also give the badly coded/lazy/greedy pools a kick up the backside to sort their act out.

Sadly, they will just change from 0 tx blocks to 1 tx blocks (in addition to coinbase), where the single tx is not even from the mempool, but their own private spend that doesn't need checking for validity.

You can't force miners to add useful transactions. You can only provide incentives.
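
To make that workaround concrete, here is a sketch of the block a miner would build, with an entirely invented transaction layout: one self-payment that never touched anyone's mempool, so a "no empty blocks" rule costs the miner nothing.
Code:
def build_minimal_block(coinbase_tx, own_address):
    """Sketch of how a miner trivially satisfies a "no empty blocks" rule:
    append one self-payment that never touched anyone's mempool and pays
    no fee. The dict layout is invented purely for illustration."""
    filler_tx = {
        "inputs":  [{"txid": "one of the miner's own utxos", "vout": 0}],
        "outputs": [{"address": own_address, "value_sats": 1}],
        "fee_sats": 0,
    }
    # One non-coinbase transaction: the letter of the rule is met,
    # but no user transaction was confirmed.
    return {"transactions": [coinbase_tx, filler_tx]}
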
sr. member
Activity: 266
Merit: 250
Hello Guy,

Nice to see you back  Wink

Interesting subject & conversation. Personally, I'll go with whatever the consensus is regarding Bitcoin, but what I'd like to see, in whatever client is adopted/survives, is that any block that contains zero transactions be rejected by the network. This would not only help speed up transactions, but also give the badly coded/lazy/greedy pools a kick up the backside to sort their act out. If there is an algo change, so be it; the huge ASIC farms can mine other SHA256 coins, which will create competition & thus encourage innovation. I'm sure it won't be the last time a change is implemented by whoever or for whatever means; it's evolution. Nothing stays the same, especially in the coding/crypto world. If it did, it would be surpassed by a more forward-thinking & innovative coin & die.

There seems to be a lot of panic & agendas regarding what's going to happen, but I think it's mostly unfounded & Bitcoin will do what it has always done - find a way & overcome any hurdles it comes across on this fun & interesting journey we're on.
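
For concreteness, the suggested rule boils down to a one-line check on the block's transaction list; a rough sketch (the function name is invented):
Code:
def passes_no_empty_block_rule(block_txs):
    """`block_txs` is the block's ordered transaction list, coinbase first;
    a block carrying only the coinbase would be rejected."""
    return len(block_txs) > 1
As the replies to this post point out, the check says nothing about where the extra transaction came from.
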
donator
Activity: 1414
Merit: 1051
Spondoolies, Beam & DAGlabs
Can you imagine giving ASIC manufacturers the power to influence a PoW system?
Have you read the article?
donator
Activity: 4760
Merit: 4323
Leading Crypto Sports Betting & Casino Platform
Can you imagine giving ASIC manufacturers the power to influence a PoW system?
donator
Activity: 1414
Merit: 1051
Spondoolies, Beam & DAGlabs
Lesson learned from the Classic coup attempt, or why Core needs to prepare a GPU-only PoW:

...
prepare a large set of cryptographic hash functions, at least 100 or more initially. Any simple (not memory hard) function will do
...
Each PoW function actually serves for 5 months
...

This proposal, if implemented correctly, will bring never-ending GPU mining on the Core chain.

Won't this centralize on private armies of GPU programmers hired to optimize all these hash functions, which they then of course will keep private?

Or perhaps it will centralize on private armies of FPGA designers and (access to) FPGA foundries?

Who will be responsible for developing the GPU code that the public at large is supposed to run?
Will they cater to all the different GPU architectures and models?

Hi John,
Yes, I agree there will be a race to develop GPUs (and maybe FPGAs).
There will also be massive deployments of GPU farms.
I don't see the above as bad things.

Guy

Maybe I'm a bit naive, but isn't the hashpower keeping the network safe? So nobody can take it over? Aren't ASICs doing that job pretty well?
Changing the PoW - that's an altcoin then. The current miners will not follow, so it's gonna split into two - of which only one has powerful ASIC mining, which is doing the above-mentioned job pretty well.
Is there something I don't get?
Please read the article. The scenario I'm describing is AFTER Classic was activated.
legendary
Activity: 1600
Merit: 1014
Lesson learned from the Classic coup attempt, or why Core needs to prepare a GPU-only PoW:

...
prepare a large set of cryptographic hash functions, at least 100 or more initially. Any simple (not memory hard) function will do
...
Each PoW function actually serves for 5 months
...

This proposal, if implemented correctly, will bring never-ending GPU mining on the Core chain.

Won't this centralize on private armies of GPU programmers hired to optimize all these hash functions, which they then of course will keep private?

Or perhaps it will centralize on private armies of FPGA designers and (access to) FPGA foundries?

Who will be responsible for developing the GPU code that the public at large is supposed to run?
Will they cater to all the different GPU architectures and models?

Hi John,
Yes, I agree there will be a race to develop GPUs (and maybe FPGAs).
There will also be massive deployments of GPU farms.
I don't see the above as bad things.

Guy

Maybe I'm a bit naive, but isn't the hashpower keeping the network safe? So nobody can take it over? Aren't ASICs doing that job pretty well?
Changing the PoW - that's an altcoin then. The current miners will not follow, so it's gonna split into two - of which only one has powerful ASIC mining, which is doing the above-mentioned job pretty well.
Is there something I don't get?
donator
Activity: 1414
Merit: 1051
Spondoolies, Beam & DAGlabs
Lesson learned from the Classic coup attempt, or why Core needs to prepare a GPU-only PoW:

...
prepare a large set of cryptographic hash functions, at least 100 or more initially. Any simple (not memory hard) function will do
...
Each PoW function actually serves for 5 months
...

This proposal, if implemented correctly, will bring never-ending GPU mining on the Core chain.

Won't this centralize on private armies of GPU programmers hired to optimize all these hash functions, which they then of course will keep private?

Or perhaps it will centralize on private armies of FPGA designers and (access to) FPGA foundries?

Who will be responsible for developing the GPU code that the public at large is supposed to run?
Will they cater to all the different GPU architectures and models?

Hi John,
Yes, I agree there will be a race to develop GPUs (and maybe FPGAs).
There will also be massive deployments of GPU farms.
I don't see the above as bad things.

Guy
legendary
Activity: 990
Merit: 1108
Lesson learned from the Classic coup attempt, or why Core needs to prepare a GPU-only PoW:

...
prepare a large set of cryptographic hash functions, at least 100 or more initially. Any simple (not memory hard) function will do
...
Each PoW function actually serves for 5 months
...

This proposal, if implemented correctly, will bring never-ending GPU mining on the Core chain.

Won't this centralize on private armies of GPU programmers hired to optimize all these hash functions, which they then of course will keep private?

Or perhaps it will centralize on private armies of FPGA designers and (access to) FPGA foundries?

Who will be responsible for developing the GPU code that the public at large is supposed to run?
Will they cater to all the different GPU architectures and models?
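
For concreteness, here is a minimal sketch of how the quoted height-based rotation could be wired up. The stand-in hash functions, the ~21,600-block period (roughly 144 blocks/day over five months) and the selection rule are all assumptions; the quoted excerpt only specifies the 5-month figure and the "100 or more simple functions" idea.
Code:
import hashlib

# Stand-in "simple, not memory-hard" functions; the quoted proposal calls for
# 100+ of them, presumably purpose-built rather than off-the-shelf digests.
POW_FUNCS = [
    lambda h: hashlib.sha256(h).digest(),
    lambda h: hashlib.sha3_256(h).digest(),
    lambda h: hashlib.blake2s(h).digest(),
]

# ~5 months of 10-minute blocks: about 144 blocks/day * 150 days (assumption).
BLOCKS_PER_PERIOD = 144 * 150

def pow_hash(block_height: int, header: bytes) -> bytes:
    """Every node derives the active PoW function from the block height,
    so the whole network switches functions in lockstep each period."""
    index = (block_height // BLOCKS_PER_PERIOD) % len(POW_FUNCS)
    return POW_FUNCS[index](header)
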
legendary
Activity: 1600
Merit: 1014

I think the question everyone has been asking is, why are you advocating for PoW changes that delete your own company?

Bizarre.

Can we take it for granted that there will be no SP50? It's a real pity to watch!
 
legendary
Activity: 1666
Merit: 1185
dogiecoin.com

I think the question everyone has been asking is, why are you advocating for PoW changes that delete your own company?
donator
Activity: 1414
Merit: 1051
Spondoolies, Beam & DAGlabs
donator
Activity: 1414
Merit: 1051
Spondoolies, Beam & DAGlabs
Memory-bound hashing is a very good suggestion: https://github.com/tromp/cuckoo
Is that really any better?

Quoting a recent Hacker News comment of mine (https://news.ycombinator.com/threads?id=tromp)
Quote
Bitcoin mining could be more decentralized if it better resembled a lottery, where huge numbers of people play for an expected loss.

In other words, the lack of people mining at a loss makes mining profitable and hence subject to forces of centralization.

There are several reasons why mining as a lottery substitute is rare, a major one being that commodity hardware is inefficient by many orders of magnitude, making even a botnet next to useless.

Perhaps, if a proof of work, whose efficiency gap (with custom hardware) is at most an order of magnitude, were adopted (or slowly phased in), enough lottery players would arise to make mining unprofitable at scale.

Botnets should then just be welcomed as a modest increase in decentralization.

However I don't expect Spondoolies-Tech to support this vision of unprofitable mining...

Disclaimer: I designed Cuckoo Cycle
Hello John, nice to meet you in our little thread. Mind the dog.
Although I really like Cuckoo Cycle, memory-bound hashing is too susceptible to botnets, unlike GPU hashing.

The biggest problem with PoW change in general is hash-rate oscillation. Emin Gün Sirer just solved it for me. I'll publish my suggestion soon.
In essence, it allows an automatic change of the PoW with only one hard fork needed.

Guy
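
For readers unfamiliar with the linked repo, here is a rough sketch of the verification side of Cuckoo Cycle: a proof is 42 edge indices whose hashed endpoints must form a single 42-cycle in a bipartite graph, which is why checking is cheap even though finding the cycle is memory-bound. The keyed BLAKE2b and the tiny graph-size parameter below are stand-ins for illustration only; the real implementation uses siphash-2-4 keyed by the block header and much larger graphs.
Code:
import hashlib

PROOF_SIZE = 42            # Cuckoo Cycle asks for a cycle of 42 edges
EDGE_BITS = 19             # graph-size parameter, shrunk here for illustration
NUM_EDGES = 1 << EDGE_BITS # also used as the node count per side

def edge_endpoints(key, edge_index):
    """Derive the two endpoints of one edge of the bipartite graph.
    The real scheme uses siphash-2-4; keyed BLAKE2b is only a stand-in."""
    def h(x):
        d = hashlib.blake2b(x.to_bytes(8, "little"), key=key,
                            digest_size=8).digest()
        return int.from_bytes(d, "little") % NUM_EDGES
    u = 2 * h(2 * edge_index)          # even node ids = one side
    v = 2 * h(2 * edge_index + 1) + 1  # odd node ids  = the other side
    return u, v

def verify(key, proof):
    """Cheap check: the 42 claimed edge indices must hash to endpoints
    forming one single cycle of length 42."""
    if len(proof) != PROOF_SIZE or len(set(proof)) != PROOF_SIZE:
        return False
    if any(not 0 <= e < NUM_EDGES for e in proof):
        return False
    adj = {}
    for e in proof:
        u, v = edge_endpoints(key, e)
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    # Every touched node must have degree exactly 2 ...
    if any(len(nbrs) != 2 for nbrs in adj.values()):
        return False
    # ... and a walk from any node must cross all 42 edges before closing.
    start = next(iter(adj))
    prev, cur, steps = None, start, 0
    while True:
        nxt = adj[cur][1] if adj[cur][0] == prev else adj[cur][0]
        prev, cur, steps = cur, nxt, steps + 1
        if cur == start:
            return steps == PROOF_SIZE
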
donator
Activity: 4760
Merit: 4323
Leading Crypto Sports Betting & Casino Platform
Quote
Quoting a recent Hacker News comment of mine (https://news.ycombinator.com/threads?id=tromp)
 Bitcoin mining could be more decentralized if it better resembled a lottery, where huge numbers of people play for an expected loss.

Ya, let's make it so the people who are the backbone of Bitcoin all lose money in a gambling ring! That's the solution!
legendary
Activity: 1526
Merit: 1013
Make Bitcoin glow with ENIAC
Memory-bound hashing is a very good suggestion: https://github.com/tromp/cuckoo

Is that really any better?

High-end GPUs (still costing the same as a CPU) run it 5x quicker, and that gap is only going to widen. So again it's not 1 CPU, 1 vote; it's back to 1 CPU versus 6 GPUs on ribbon risers with 31 votes. Then you have the problem that RAM isn't an infallible resource - PCI-E based SSDs are getting closer and closer to RAM, but with 100x the capacity. I am aware even commercial 4x PCI-E SSDs have 250-500x higher latency than core RAM, but what could we do if that was the actual objective? It won't take long for a Spondoolies-Tech V2 to work out a Cuckoo42 ASIC.

Edit:
Seems like you're more concerned with getting the previous pages of discussion buried; I don't think you believe what you're posting. Are the investors calling?

I read the Toomim vs Corem pastebin thingy. Is this a flamboyant ragequit, or has Guy lost his marbles? Again?