
Topic: So who the hell is still supporting BU? - page 9. (Read 29824 times)

legendary
Activity: 3038
Merit: 1660
lose: unfind ... loose: untight
February 18, 2017, 05:42:07 PM
No idea why you are babbling about nonsense like 60 minute verification times.  Might as well go full retard red herring and ask what happens if a 60 day or 60 year block happened.

The problem here is you have no idea where the grey area between reasonable and inordinate is, despite having the required knowledge spoon fed to you like a baby ("F2pool's ~1mb tx took 4 min on Electrum server, which would fall behind the network given ~2.5mb tx").

::sigh:: You sure are persistent in your preening pedantic posturing. And you _still_ have not answered the question, posed thrice. I'll even modify the time for you:

Riddle me this: built off a parent of the same block height, a miner is presented -- at roughly the same time:
1) an aberrant block that takes an inordinate amount of time (e.g., 10 minutes) to verify but is otherwise valid;
2) a 'normal' valid block that does not take an inordinate amount of time to verify; and
3) an invalid block.
Which of these three do you suppose that miner will choose to build the next round atop in order to maximize profit?
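For what it's worth, the riddle's incentive logic can be sketched as a toy expected-value calculation. Everything here is an illustrative assumption (the 12.5 BTC subsidy current at the time of this thread, a 600-second average interval, and the simplification that verification delay is pure lost head start), not a real mining model:

```python
# Hypothetical expected-reward sketch of the riddle above. All numbers
# are illustrative assumptions, not measured values.

BLOCK_REWARD = 12.5   # BTC subsidy per block (as of early 2017)
AVG_INTERVAL = 600.0  # average block interval, seconds

def expected_reward(verify_seconds, valid):
    """Expected value of choosing a candidate block to build atop.

    Extending an invalid block earns nothing (the child is rejected);
    verification delay shrinks the miner's usable share of the interval.
    """
    if not valid:
        return 0.0
    usable = max(AVG_INTERVAL - verify_seconds, 0.0)
    return BLOCK_REWARD * usable / AVG_INTERVAL

candidates = {
    "slow-but-valid (10 min verify)": expected_reward(600, True),
    "normal valid": expected_reward(2, True),
    "invalid": expected_reward(0, False),
}
best = max(candidates, key=candidates.get)
```

Under these assumptions the model simply formalizes the answer the riddle is driving at: the normal valid block.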
hero member
Activity: 686
Merit: 504
February 18, 2017, 03:01:27 PM

*XT gets rekt*
*Classic gets rekt*

*time passes*


Try to guess what happens next!   Cheesy

This time it's different. This time Core got "rekt". They can keep putting out sh*t releases all they want, it won't change reality.

Keep on putting lipstick on that pig, and put her right in the front of your shop window. Maybe try a bow on her head?
hero member
Activity: 686
Merit: 504
February 18, 2017, 02:57:35 PM
Price is tracking BU's odds to win. Bitcoin economic majority = market wants BU and on-chain scaling now.
Bullshit and you know it. The market gives zero fucks about this trashware called BU.

There are many complaints about high fees and long queue waiting times. The market does give a fuck and miners are going to answer its demands.

In the market there is always a cause and an effect; it will definitely be interesting to see how this all pans out. Something has to give, but I'm not entirely sure what it is yet.

http://moneyandstate.com/the-true-cost-of-bitcoin-transactions/

Erik Voorhees says that TIME and SANITY are the biggest costs to Bitcoin users. Good point, I know I'm frustrated! "Take me away Monero-baby..."
legendary
Activity: 3906
Merit: 6249
Decentralization Maximalist
February 18, 2017, 02:05:23 PM
Well, you can't really convince the miners when there isn't a HF proposal that properly does this post-segwit. The only BIP that eventually increases the block size and builds on top of Segwit is luke-jr's. However, that one doesn't see any growth until a few years into the future.

I have heard about Luke's BIP. A block size of 300 kB (if I remember right) is far too low for now - I think that would already be a decision in favour of a digital gold without "regular on-chain payment functionality", which I highly doubt will work.

But the rest of the plan goes in the right direction. Exactly such a BIP (Segwit and then a 17% increase every year) would be my preferred way to solve the problem. I would even propose such a BIP myself, but I would need the support of Bitcoin experts, as I'm not a programmer and so nobody would take me seriously Wink
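The growth schedule is easy to sanity-check numerically. A one-liner compounding the 300 kB base at 17% per year (both figures taken from the posts above; the rest is just arithmetic):

```python
# Compound the base block size by a fixed growth rate once per year.
# 300 kB and 17%/year come from the posts above; this is only arithmetic.

def projected_size_kb(base_kb=300.0, years=0, growth=0.17):
    return base_kb * (1 + growth) ** years

# 17%/year roughly doubles capacity every ~4.4 years (log 2 / log 1.17),
# so a 300 kB base passes the old 1 MB (1000 kB) mark around year 8.
```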
legendary
Activity: 2674
Merit: 2965
Terminated.
February 18, 2017, 10:37:03 AM
Poor people outnumber rich people; they can move to a coin with cheaper TX fees, and that will affect the market.
Economic majority != poor people.

Hmm, the fact that segwit will never be activated just can't get past that furball you call a brain.
You are a delusional troll. You don't understand the fact that I don't really care whether Bitcoin upgrades to it or not. These TX fees don't have any effect on me, and they wouldn't even if they grew 10-100 fold.

Too sad that science dictates we have SW first and then Schnorr... What a pity!
If the right people were being listened to, that's the direction in which we would go.
hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale
February 18, 2017, 10:33:46 AM
The O(n^2) sigop attack cannot be mitigated with Electrum X or by simply buying a faster Xeon server.

As Gavin said, we need to move to Schnorr sigs to get (sub)linear sig validation time scaling.

And AFAIK moving to Schnorr sigs at minimum requires implementing Core's segwit soft fork.

Informed Bitcoiners like Adam Back and the rest of Core plan to do segwit first, because it pays off technical debt and thus strengthens the foundation necessary to support increased block sizes later.


So you are saying their developer is not competent enough to find a solution.
I think if the Electrum developer were actually worried about it, he would have mentioned it when asked point blank about the blocksize issue.



Electrum devs cannot change the fact that Bitcoin signs with ECDSA under the legacy sighash algorithm.  As currently implemented, sig validation scales quadratically with tx size.

Nobody can fix that until we have segwit, after which we can change to Schnorr sigs.

Source:



Too sad that science dictates we have SW first and then Schnorr... What a pity!
legendary
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
February 18, 2017, 09:41:41 AM
The O(n^2) sigop attack cannot be mitigated with Electrum X or by simply buying a faster Xeon server.

As Gavin said, we need to move to Schnorr sigs to get (sub)linear sig validation time scaling.

And AFAIK moving to Schnorr sigs at minimum requires implementing Core's segwit soft fork.

Informed Bitcoiners like Adam Back and the rest of Core plan to do segwit first, because it pays off technical debt and thus strengthens the foundation necessary to support increased block sizes later.


So you are saying their developer is not competent enough to find a solution.
I think if the Electrum developer were actually worried about it, he would have mentioned it when asked point blank about the blocksize issue.



Electrum devs cannot change the fact that Bitcoin signs with ECDSA under the legacy sighash algorithm.  As currently implemented, sig validation scales quadratically with tx size.

Nobody can fix that until we have segwit, after which we can change to Schnorr sigs.

Source:
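The quadratic claim above can be sanity-checked with a toy cost model. This is an illustrative sketch, not consensus code: the 180-byte input size and 256-byte digest are assumed round numbers, and `extrapolate_minutes` just redoes the "~1 MB tx took 4 min, so ~2.5 MB would take ~25 min" arithmetic from earlier in the thread.

```python
# Toy cost model of the O(n^2) signature-hashing issue (assumed sizes,
# not actual Bitcoin serialization).

def legacy_hash_bytes(n_inputs, bytes_per_input=180):
    # Legacy sighash: verifying each input rehashes (roughly) the whole
    # serialized transaction, so total bytes hashed grow as n^2.
    tx_size = n_inputs * bytes_per_input
    return n_inputs * tx_size

def segwit_hash_bytes(n_inputs, digest_bytes=256):
    # Segwit's BIP143 sighash commits each input to fixed-size precomputed
    # digests, so total hashing grows linearly with input count.
    return n_inputs * digest_bytes

def extrapolate_minutes(observed_min, observed_mb, target_mb):
    # Quadratic extrapolation of an observed verification time.
    return observed_min * (target_mb / observed_mb) ** 2

# Doubling inputs quadruples legacy work but only doubles segwit work;
# the 4-minute ~1 MB observation extrapolates to ~25 min at ~2.5 MB.
```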

legendary
Activity: 1092
Merit: 1000
February 18, 2017, 06:59:04 AM
There are many complaints about high fees and long queue waiting times. The market does give a fuck and miners are going to answer its demands.
If you're complaining about TX fees of 15-30 cents, you don't have money to move the market.

Poor people outnumber rich people; they can move to a coin with cheaper TX fees, and that will affect the market.

And AFAIK moving to Schnorr sigs at minimum requires implementing Core's segwit soft fork.
Yes, Segwit would make that upgrade much easier. I think that is one of the current plans post-Segwit anyway.

Hmm, the fact that segwit will never be activated just can't get past that furball you call a brain.

 Cool

FYI:
BTC Unlimited hit 31.3%; segwit never got over 30%.
The tide has shifted.  Wink
legendary
Activity: 2674
Merit: 2965
Terminated.
February 18, 2017, 06:44:52 AM
There are many complaints about high fees and long queue waiting times. The market does give a fuck and miners are going to answer its demands.
If you're complaining about TX fees of 15-30 cents, you don't have money to move the market.

In the market there is always a cause and an effect; it will definitely be interesting to see how this all pans out. Something has to give, but I'm not entirely sure what it is yet.
This BU stuff has zero relevance to the price.

And AFAIK moving to Schnorr sigs at minimum requires implementing Core's segwit soft fork.
Yes, Segwit would make that upgrade much easier. I think that is one of the current plans post-Segwit anyway.
legendary
Activity: 1092
Merit: 1000
February 18, 2017, 06:19:04 AM

kiklo,

The O(n^2) sigop attack cannot be mitigated with Electrum X or by simply buying a faster Xeon server.

As Gavin said, we need to move to Schnorr sigs to get (sub)linear sig validation time scaling.

And AFAIK moving to Schnorr sigs at minimum requires implementing Core's segwit soft fork.

Informed Bitcoiners like Adam Back and the rest of Core plan to do segwit first, because it pays off technical debt and thus strengthens the foundation necessary to support increased block sizes later.


So you are saying their developer is not competent enough to find a solution.
I think if the Electrum developer were actually worried about it, he would have mentioned it when asked point blank about the blocksize issue.
(His biggest issue is he just wants the debate over so he knows which direction the network is going.)

Segwit is not going to happen; the miners are not giving up their livelihoods just so Core can take over BTC with an iron fist.
So unless Core forces a fork, they had better start looking for other solutions, like BU.
And if they do force a fork, my money is on the Chinese miners crushing them.

 Cool

FYI:
Electrum serves 5% to 10% of BTC users; it is an all-volunteer group and receives no funding.
So neither Core nor the miners are really worried about them or their users.

FYI2:
https://www.buybitcoinworldwide.com/wallets/#hot-wallets
hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale
February 18, 2017, 06:16:35 AM
Try this: if a stock price has tested a barrier twice, will it hold the third time for sure?

Stock prices follow random walks.  Stock prices are not experiments.  They do not support or disprove a hypothesis.

You don't seem to have a clear understanding of how science works.

I can't imagine living in such a dark, demon-haunted world.

I'm truly sorry you never received a proper education.

Here, read this.

https://explorable.com/falsifiability

Let there be light!   Cool



kiklo,

The O(n^2) sigop attack cannot be mitigated with Electrum X or by simply buying a faster Xeon server.

As Gavin said, we need to move to Schnorr sigs to get (sub)linear sig validation time scaling.

And AFAIK moving to Schnorr sigs at minimum requires implementing Core's segwit soft fork.

Informed Bitcoiners like Adam Back and the rest of Core plan to do segwit first, because it pays off technical debt and thus strengthens the foundation necessary to support increased block sizes later.


Let iCE believe in random walks and that science is the answer to all issues. He does not even know how to communicate in a civilized manner.
Riddle for you: what distribution do you need for your stock-market random walk, or (same task) what distribution underlies a real (not academic) economic experiment like Bitcoin (e.g. acceptance probability)?
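The barrier riddle has a quick empirical companion for the memoryless case. A hypothetical Monte Carlo sketch (barrier height, step count, and trial count are arbitrary illustrative choices): if a symmetric random walk has bounced off a barrier twice, does the barrier hold the third time?

```python
import random

def third_test_holds(barrier=5, steps=2000, seed=None):
    # Walk a symmetric +/-1 random walk; each time it reaches the barrier,
    # count a "touch" and bounce it back one step. Return True only if the
    # barrier survives fewer than three touches over the whole walk.
    rng = random.Random(seed)
    pos, touches = 0, 0
    for _ in range(steps):
        pos += rng.choice((-1, 1))
        if pos >= barrier:
            touches += 1
            pos = barrier - 1
            if touches == 3:
                return False
    return True

trials = 500
held = sum(third_test_holds(seed=i) for i in range(trials))
# For a memoryless walk, two earlier bounces carry no guarantee: the
# fraction 'held / trials' comes out well short of certainty.
```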
legendary
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
February 18, 2017, 05:57:23 AM
Try this: if a stock price has tested a barrier twice, will it hold the third time for sure?

Stock prices follow random walks.  Stock prices are not experiments.  They do not support or disprove a hypothesis.

You don't seem to have a clear understanding of how science works.

I can't imagine living in such a dark, demon-haunted world.

I'm truly sorry you never received a proper education.

Here, read this.

https://explorable.com/falsifiability

Let there be light!   Cool



kiklo,

The O(n^2) sigop attack cannot be mitigated with Electrum X or by simply buying a faster Xeon server.

As Gavin said, we need to move to Schnorr sigs to get (sub)linear sig validation time scaling.

And AFAIK moving to Schnorr sigs at minimum requires implementing Core's segwit soft fork.

Informed Bitcoiners like Adam Back and the rest of Core plan to do segwit first, because it pays off technical debt and thus strengthens the foundation necessary to support increased block sizes later.
legendary
Activity: 1092
Merit: 1000
February 18, 2017, 05:33:44 AM
The problem here is you have no idea where the grey area between reasonable and inordinate is, despite having the required knowledge spoon fed to you like a baby ("F2pool's ~1mb tx took 4 min on Electrum server, which would fall behind the network given ~2.5mb tx").

You don't even seem to be aware verification times vary according to the miners' node software and server hardware/network capabilities, nor of the fact such power imbalances may be used maliciously against other miners.  Did you sleep through the discussion about the GFOC?

Miners' decisions depend on their local conditions, counterparty obligations, goals, motivations, and levels of expertise, the amount of fees in the blocks, their software/hardware/network configuration/capabilities/limitations, what they expect other miners to do and/or their strategy for attacking other miners (game theory), etc.

It appears you are committed to remaining somewhere other than here in the real world where O(n^2) attacks are a problem, and going to stick with the wishful thinking, hand-waving, and "Because Mining Incentives!" slogan.

Good luck with that!   Smiley

@iCEBREAKER,

1. Are you suggesting an ENTIRE coin network should block updating to Unlimited because YOU claim one developer can't update their code to work with BTC Unlimited?
Keyword: YOU claim.

Because here is what the Electrum Developer said
Quote
Voegtlin explained that he has no strong position on the preferred block size limit itself – though he regrets that technical discussion has taken a backseat lately.
“I am not in a position to tell what the best block size is,” Voegtlin said.
Plus he has no problem updating Electrum for Segwit, which you posers claim can reach ~4 megabyte block sizes.
Like the rest of the world: update or get left behind. I have no doubt Electrum would be updated to work with BU in short order.


But guess what: in the real world, when a vendor is unable to make a product work, we move to a product that does work.  Tongue
http://www.newsbtc.com/2016/10/16/electrum-x-newer-much-faster-variant-electrum-server/
Quote
A much faster version of Electrum Server is now available.
The reimplementation of Electrum Server, called Electrum X, is the creation of Neil Booth, who claims it is 10 times faster than the original Electrum Server.


 Cool
hero member
Activity: 578
Merit: 554
February 18, 2017, 05:23:37 AM
Price is tracking BU's odds to win. Bitcoin economic majority = market wants BU and on-chain scaling now.
Bullshit and you know it. The market gives zero fucks about this trashware called BU.

There are many complaints about high fees and long queue waiting times. The market does give a fuck and miners are going to answer its demands.

In the market there is always a cause and an effect, it will definitely be interesting to see how this all pans out. Something has to give but I'm not entirely sure what it is yet.
hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale
February 18, 2017, 05:13:57 AM

*XT gets rekt*

*time passes*



*Classic gets rekt*

*time passes*


BU is going to win this fight.

Try to guess what happens next!   Cheesy

Yeah. You convinced me again. You take two measurements and extrapolate the future from them. Great brain!
 Grin

Edit: So we should go long always at the end of every trend?

You are confusing "measurements" with the resolutions of the XT and Classic experiments.

Instruments take measurements, experiments test hypotheses.

What an appallingly ignorant display of your lack of basic familiarity with the philosophy of science.

I didn't say anything about "the end of every trend."  My point is obviously specific to the resolution of the Unlimited experiment.

Please try to stay focused on the subject at hand and not wander off into lazy universal generalities about what we should or should not "always" do.

OK, thanks for staying on topic this time, and without any insults.

Try this: if a stock price has tested a barrier twice, will it hold the third time for sure?
sr. member
Activity: 378
Merit: 250
February 18, 2017, 05:13:05 AM
Price is tracking BU's odds to win. Bitcoin economic majority = market wants BU and on-chain scaling now.
Bullshit and you know it. The market gives zero fucks about this trashware called BU.

There are many complaints about high fees and long queue waiting times. The market does give a fuck and miners are going to answer its demands.
sr. member
Activity: 378
Merit: 250
February 18, 2017, 05:10:05 AM
BU is going to win this fight.

Try to guess what happens next!   Cheesy

Depends on the stubbornness of the Core devs and whether they are willing to redefine what Bitcoin is, namely the part of the definition where miners hashing SHA-256 establish the longest-chain consensus.
If the Core devs have an agenda to win at any cost, they will create a client with a different hashing algo (I think Luke-Jr already did this with Keccak), and Bitcoin will fork into two chains, SHA-256 and Keccak.
If the Core devs give up, users will happily continue to use the chain that the good old SHA-256 miners are hashing.
Do you have a different view?
full member
Activity: 322
Merit: 151
They're tactical
February 18, 2017, 04:50:03 AM
To me the problem with LN is not LN itself, but that there should be no ambiguity about the fact that it is not the same as the Bitcoin network. It's this confusion I find disturbing, like saying that writing data to a cache is "just the same" as writing the data to disk, only much faster. Well, no, it's not the same thing done faster; it's faster precisely because it skips the part that takes the most time and gives the data all its security.

With LN it's a similar confusion to say it's just the Bitcoin network but faster or more scalable. Yes, it's faster, and it can scale the processing of off-chain transactions, but it's still not the same as the real Bitcoin network, and it's faster precisely because it bypasses the actual processing by mining nodes and the global validation through distributed proof of work. It doesn't solve any of the actual Bitcoin scaling issues; it just bypasses them under conditions that are still safe in most cases, but the cases where the difference matters should be made clearer.

It's not even "just like a cache", because if it were, it would be totally transparent to every application using the network: the Bitcoin network and the LN network would be completely synchronized with each other at all times, even across off-chain operations. If there is no 100% certainty that the state of the LN network is synchronized with the on-chain data at all times, and that LN operations based on out-of-date data are invalidated, then it is not acting like a simple cache. In certain cases it matters whether an operation happens inside the LN network or on-chain; the two are not exactly equivalent, and there can be a difference between asking for the balance of an address on the main network and inside the LN network.

But again, I think there is room for LN-like things. I don't think 100% of Bitcoin usage needs to happen on-chain in a fully synchronized local application, and there is room for off-chain things. But there must be no ambiguity that the two are completely different: LN does not solve the true Bitcoin scaling problem, and operations happening inside LN do not provide the same level of security as true on-chain operations, even if they are faster and scale better.

And I'm not sure the developers and teams are doing such a good job of making this clear, along with the cases where it matters whether a transaction happens in LN or on-chain. Instead they try to make it look like the same thing made faster, when it's faster precisely because it's not doing the same thing and bypasses the part that actually takes the most time.
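The cache analogy aside, the core mechanic the post describes (off-chain balance updates, with the chain touched only at open and close) can be illustrated with a deliberately oversimplified toy. This is not the actual LN protocol: there are no commitment transactions, HTLCs, signatures, or penalty mechanisms here, just the bookkeeping idea:

```python
# Toy two-party payment channel: balances move off-chain, and only the
# funding and settlement transactions hit the chain. A sketch of the
# concept, not the real Lightning Network protocol.

class ToyChannel:
    def __init__(self, alice_deposit, bob_deposit):
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}
        self.onchain_txs = 1        # funding transaction
        self.offchain_updates = 0

    def pay(self, sender, receiver, amount):
        # Off-chain: both parties just agree to a new balance split.
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.offchain_updates += 1

    def close(self):
        self.onchain_txs += 1       # settlement transaction
        return self.balances

ch = ToyChannel(5.0, 5.0)
for _ in range(1000):
    ch.pay("alice", "bob", 0.001)
final = ch.close()
# 1000 payments settled with only 2 on-chain transactions -- which is
# exactly the scaling bypass (and the security trade-off) described above.
```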
legendary
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
February 18, 2017, 04:44:11 AM

*XT gets rekt*

*time passes*



*Classic gets rekt*

*time passes*


BU is going to win this fight.

Try to guess what happens next!   Cheesy

Yeah. You convinced me again. You take two measurements and extrapolate the future from them. Great brain!
 Grin

Edit: So we should go long always at the end of every trend?

You are confusing "measurements" with the resolutions of the XT and Classic experiments.

Instruments take measurements, experiments test hypotheses.

What an appallingly ignorant display of your lack of basic familiarity with the philosophy of science.

I didn't say anything about "the end of every trend."  My point is obviously specific to the resolution of the Unlimited experiment.

Please try to stay focused on the subject at hand and not wander off into lazy universal generalities about what we should or should not "always" do.
hero member
Activity: 578
Merit: 554
February 18, 2017, 04:42:10 AM
I would love to see everyone come together and focus on a reasonable compromise. There is so much divisiveness in the world; it is sad to see it reflected in the digital world as well.