
Topic: Epicenter Bitcoin interview - Mike Hearn

legendary
Activity: 1260
Merit: 1008
This is what I've posted on Reddit:
Quote
I have a couple of questions after listening to this episode. Mike was saying that we don't know what will happen if the backlog keeps growing.

Why don't we have some sort of "staging" blockchain where we can test what happens? Why don't we have a live laboratory for these kinds of tests? Are we only using math to predict stuff? ELI5!

Sidechain, anyone?

:P
legendary
Activity: 2156
Merit: 1393
You lead and I'll watch you walk away.

Is this the fork that has Hearn's blacklisting implemented, or do we have to wait for the next one? LOL

Those two jackoffs at 'epicenterbitcoin' didn't cover that little aspect of Mike's recent work, and he didn't bring it up for some odd reason.  Amazing, eh?


Yeah, a real shocker. Hearn scares me because he's shown his true colors. For Bitcoin to advance it needs a few basic characteristics to be left intact. Fungibility and scarcity are two biggies in my book. Remove them and Bitcoin won't even have the value of PayPal or MasterCard. At least they have fraud protection.
legendary
Activity: 4760
Merit: 1283

Is this the fork that has Hearn's blacklisting implemented, or do we have to wait for the next one? LOL

Those two jackoffs at 'epicenterbitcoin' didn't cover that little aspect of Mike's recent work, and he didn't bring it up for some odd reason.  Amazing, eh?

legendary
Activity: 2156
Merit: 1393
You lead and I'll watch you walk away.
Is this the fork that has Hearn's blacklisting implemented, or do we have to wait for the next one? LOL
legendary
Activity: 1904
Merit: 1007
I don't have a clue. But I remember reading articles on Gavin's blog where he showed the advantages and disadvantages of 20MB blocks. Blog posts like this one, http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent, that I thought were based on testing and real numbers...

I know those articles, but I only see predictions, simulations and lots of math. Nothing concrete. I want to see real blocks on a test chain carrying 20MB of gibberish, running full for one week. I want to see a test chain with a 1MB cap and 5MB of transactions queued against it. Simulations are usually done before testing live, and I'm still waiting for the live tests. I know that most software companies keep several environments with different stages of the software, precisely so they can spot problems and filter them out of the production/live version. Why don't we have this kind of laboratory?
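A minimal sketch of what such a staging experiment could look like, assuming a local bitcoind already running with -regtest (for example started with the old -blockmaxsize option set low, to emulate scarce block space) and bitcoin-cli on the PATH. The loop count, amounts and settings are illustrative placeholders, not a real benchmark:

Code:
import json, subprocess

def cli(*args):
    # Shell out to the node's RPC interface on the private regtest chain.
    out = subprocess.check_output(["bitcoin-cli", "-regtest"] + list(args))
    return out.decode().strip()

cli("generate", "101")                  # mature one coinbase so we can spend it
addr = cli("getnewaddress")

for _ in range(500):                    # queue far more tx than one block holds
    cli("sendtoaddress", addr, "0.001")

info = json.loads(cli("getmempoolinfo"))
print("backlog:", info["size"], "transactions waiting")

cli("generate", "1")                    # mine one block and watch the drain
info = json.loads(cli("getmempoolinfo"))
print("after one block:", info["size"], "transactions still waiting")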
legendary
Activity: 4760
Merit: 1283

So we agree that 1MB is not forever under an operative LN?
My question is: at what rate would the 1MB limit need to be raised then? And what % of transactions would run under the LN?

Step one - develop and analyze a workable system of scaling that will serve the needs of humanity without destroying the core solution.

Step two - base growth on analysis of the needs of the workable scaling solutions.

Sidechains, as a nearly perfect proxy for actual Bitcoin, provide a coherent scaling potential.  I have my doubts that 1MB will ultimately be enough for the various sidechains to perform the necessary balancing functions they need and still allow native Bitcoin users to use the system for truly relevant needs.  If, at the end of the day, my concerns about the strain on 1MB come to pass, I'm certainly open to the possibility of ramping up, but the extent to which I am inclined to do so will be a function of the infrastructure distribution that a tight Bitcoin manages to achieve.

Put another way, I'm willing to let infrastructure distribution and defensive capability be the forcing function for Bitcoin's growth, but not end-user convenience.  Screwing this up will be a loss for users at each end of the spectrum.

legendary
Activity: 1512
Merit: 1012
To get real testing data, they probably had some kind of testnet. They probably even have several testnets for different block sizes. It's obviously just me speculating, but I don't see how they could test this and present this solution (and the numbers backing it up) without having some testnet for, well... testing.

I haven't seen those kinds of tests. Mike Hearn was complaining that we have no idea what will happen if the backlog of transactions keeps growing. I know that we have the testnet, but was this 1MB limit actually tested over there? I don't think so. Where is the blockchain that we can break as many times as we wish?

I don't have a clue. But I remember reading articles on Gavin's blog where he showed the advantages and disadvantages of 20MB blocks. Blog posts like this one, http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent, that I thought were based on testing and real numbers...
hero member
Activity: 700
Merit: 501

Wow!  I've only listened to a few minutes and already I hear blatant misinformation which is vastly below Mike's pay grade.  In other words, he knows it's bullshit.  The one that triggered my post is his claim that people on the 'other' side are saying that the block size should remain the same forever.

I have not heard anyone who is vaguely credible make the argument that 1MB has to remain permanent forever, with the possible exception of MP.  I and almost everyone else on 'my side' of the argument are perfectly amenable to raising the block size limit, but it needs to be done in a careful manner, tuned to the problems at hand, and at a time when the problems are as well understood as possible.  In the meantime, the pressure to come up with more far-sighted solutions, even if it takes some development effort, is a really valuable thing.  Sidechain code, for instance, has now been released in alpha, and Maxwell's announcement presentation surprised me with the breadth of technologies it encompasses.

Clearly Mike is in a full-on panic, and the more dubious his utterances become, the more credibility he loses.  I'm almost ready to count him out at this time, but it's often a mistake to do so prematurely.  I think it very possible that he (and his) will try to torpedo Bitcoin itself rather than walk away empty-handed.  I'd do the same if his side got the upper hand.


But we don't know what can happen if we hit the 1MB limit. Do we have enough time to keep studying the subject before we hit said limit? Don't wanna find out the hard way.

People have been projecting full blocks for at least as long as I've been around (circa 2011).  We were hitting capacity in the Satoshi Dice days, IIRC, and there were no terribly ill effects.

Bitcoin is at least marketed to assume capacity limits which drive certain proposed economics (transaction fees).  An alternative is to have the infrastructure be subsidized by entities who derive value from it (e.g., e-mail, where the provider reads it to decide how best to exploit the 'customer'), but that has not been formally proposed, due to embarrassment I suppose.  Those who are familiar with modern internet economics and technology are almost universally aware of this postulate about Bitcoin support, I'm sure.

It is not at all a disaster if people who think they are going to get their coffee purchase into the next block and don't want to pay a realistic fee for an amazing sledgehammer service are disappointed.  The only way that is a disaster is if one labors under the false premise that Bitcoin is doomed unless everyone can use it for everything for free or nearly for free.  I see it as the opposite of a disaster insofar as it will disabuse the ignorant of a false hope.



So we agree that 1MB is not forever under an operative LN?
My question is: at what rate would the 1MB limit need to be raised then? And what % of transactions would run under the LN?
legendary
Activity: 1904
Merit: 1007
People have been projecting full blocks for at least as long as I've been around (circa 2011).  We were hitting capacity in the Satoshi Dice days, IIRC, and there were no terribly ill effects.

There are 2 scenarios:

1) We maintain the 1MB limit and we head down an unknown path;

2) We raise the limit to 8MB and we go down a path that we already know.

I prefer number 2 because it's the more logical of the two scenarios. Your choice is not the logical one.

To get real testing data, they probably had some kind of testnet. They probably even have several testnets for different block sizes. It's obviously just me speculating, but I don't see how they could test this and present this solution (and the numbers backing it up) without having some testnet for, well... testing.

I haven't seen those kinds of tests. Mike Hearn was complaining that we have no idea what will happen if the backlog of transactions keeps growing. I know that we have the testnet, but was this 1MB limit actually tested over there? I don't think so. Where is the blockchain that we can break as many times as we wish?
legendary
Activity: 1512
Merit: 1012
This is what I've posted on Reddit:
Quote
I have a couple of questions after listening to this episode. Mike was saying that we don't know what will happen if the backlog keeps growing.

Why don't we have some sort of "staging" blockchain where we can test what happens? Why don't we have a live laboratory for these kinds of tests? Are we only using math to predict stuff? ELI5!

To get real testing data, they probably had some kind of testnet. They probably even have several testnets for different block sizes. It's obviously just me speculating, but I don't see how they could test this and present this solution (and the numbers backing it up) without having some testnet for, well... testing.
legendary
Activity: 4760
Merit: 1283

Wow!  I've only listened to a few minutes and already I hear blatant misinformation which is vastly below Mike's pay grade.  In other words, he knows it's bullshit.  The one that triggered my post is his claim that people on the 'other' side are saying that the block size should remain the same forever.

Is this the most important thing for you from the podcast?

I've now gotten through almost the entire thing, and it is a constant stream of panic of the same nature from the bloatcoin crowd.  Hearn is even anticipating needing to kludge around POW going against him.  The bloatcoin contingent is toast, though he may still be able to create damage on the way out the door.

Here's Hearn's XT attack on Bitcoin in a nutshell:

  Embrace - check
  Extend - check
  Extinguish - working on it...and it's not looking good.

legendary
Activity: 4760
Merit: 1283

Wow!  I've only listened to a few minutes and already I hear blatant misinformation which is vastly below Mike's pay grade.  In other words, he knows it's bullshit.  The one that triggered my post is his claim that people on the 'other' side are saying that the block size should remain the same forever.

I have not heard anyone who is vaguely credible make the argument that 1MB has to remain permanent forever, with the possible exception of MP.  I and almost everyone else on 'my side' of the argument are perfectly amenable to raising the block size limit, but it needs to be done in a careful manner, tuned to the problems at hand, and at a time when the problems are as well understood as possible.  In the meantime, the pressure to come up with more far-sighted solutions, even if it takes some development effort, is a really valuable thing.  Sidechain code, for instance, has now been released in alpha, and Maxwell's announcement presentation surprised me with the breadth of technologies it encompasses.

Clearly Mike is in a full-on panic, and the more dubious his utterances become, the more credibility he loses.  I'm almost ready to count him out at this time, but it's often a mistake to do so prematurely.  I think it very possible that he (and his) will try to torpedo Bitcoin itself rather than walk away empty-handed.  I'd do the same if his side got the upper hand.


But we don't know what can happen if we hit the 1MB limit. Do we have enough time to keep studying the subject before we hit said limit? Don't wanna find out the hard way.

People have been projecting full blocks for at least as long as I've been around (circa 2011).  We were hitting capacity in the Satoshi Dice days, IIRC, and there were no terribly ill effects.

Bitcoin is at least marketed to assume capacity limits which drive certain proposed economics (transaction fees).  An alternative is to have the infrastructure be subsidized by entities who derive value from it (e.g., e-mail, where the provider reads it to decide how best to exploit the 'customer'), but that has not been formally proposed, due to embarrassment I suppose.  Those who are familiar with modern internet economics and technology are almost universally aware of this postulate about Bitcoin support, I'm sure.

It is not at all a disaster if people who think they are going to get their coffee purchase into the next block and don't want to pay a realistic fee for an amazing sledgehammer service are disappointed.  The only way that is a disaster is if one labors under the false premise that Bitcoin is doomed unless everyone can use it for everything for free or nearly for free.  I see it as the opposite of a disaster insofar as it will disabuse the ignorant of a false hope.
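To put a toy number on the point about coffee purchases and realistic fees: when pending demand exceeds block space, the cheapest transactions simply wait for a later block. This is only an illustrative sketch with invented fee rates and sizes, not anyone's actual miner policy:

Code:
import random

BLOCK_SPACE = 1_000_000                   # 1MB of block space, in bytes
pending = [(random.randint(1, 100), 500)  # (fee in satoshi/byte, size in bytes)
           for _ in range(3000)]          # ~1.5MB of demand: 50% too much

confirmed, used = [], 0
for fee_rate, size in sorted(pending, reverse=True):
    if used + size <= BLOCK_SPACE:        # highest fee rates claim the space
        confirmed.append(fee_rate)
        used += size

print("clearing fee rate:", min(confirmed), "sat/byte")
print("left waiting:", len(pending) - len(confirmed), "transactions")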

legendary
Activity: 868
Merit: 1006

Wow!  I've only listened to a few minutes and already I hear blatant misinformation which is vastly below Mike's pay grade.  In other words, he knows it's bullshit.  The one that triggered my post is his claim that people on the 'other' side are saying that the block size should remain the same forever.

I have not heard anyone who is vaguely credible make the argument that 1MB has to remain permanent forever, with the possible exception of MP.  I and almost everyone else on 'my side' of the argument are perfectly amenable to raising the block size limit, but it needs to be done in a careful manner, tuned to the problems at hand, and at a time when the problems are as well understood as possible.  In the meantime, the pressure to come up with more far-sighted solutions, even if it takes some development effort, is a really valuable thing.  Sidechain code, for instance, has now been released in alpha, and Maxwell's announcement presentation surprised me with the breadth of technologies it encompasses.

Clearly Mike is in a full-on panic, and the more dubious his utterances become, the more credibility he loses.  I'm almost ready to count him out at this time, but it's often a mistake to do so prematurely.  I think it very possible that he (and his) will try to torpedo Bitcoin itself rather than walk away empty-handed.  I'd do the same if his side got the upper hand.



But we don't know what can happen if we hit the 1MB limit. Do we have enough time to keep studying the subject before we hit said limit? Don't wanna find out the hard way.
legendary
Activity: 1904
Merit: 1007

Wow!  I've only listened to a few minutes and already I hear blatant misinformation which is vastly below Mike's pay grade.  In other words, he knows it's bullshit.  The one that triggered my post is his claim that people on the 'other' side are saying that the block size should remain the same forever.

Is this the most important thing for you from the podcast?
legendary
Activity: 4760
Merit: 1283

Wow!  I've only listened to a few minutes and already I hear blatant misinformation which is vastly below Mike's pay grade.  In other words, he knows it's bullshit.  The one that triggered my post is his claim that people on the 'other' side are saying that the block size should remain the same forever.

I have not heard anyone who is vaguely credible make the argument that 1MB has to remain permanent forever, with the possible exception of MP.  I and almost everyone else on 'my side' of the argument are perfectly amenable to raising the block size limit, but it needs to be done in a careful manner, tuned to the problems at hand, and at a time when the problems are as well understood as possible.  In the meantime, the pressure to come up with more far-sighted solutions, even if it takes some development effort, is a really valuable thing.  Sidechain code, for instance, has now been released in alpha, and Maxwell's announcement presentation surprised me with the breadth of technologies it encompasses.

Clearly Mike is in a full-on panic, and the more dubious his utterances become, the more credibility he loses.  I'm almost ready to count him out at this time, but it's often a mistake to do so prematurely.  I think it very possible that he (and his) will try to torpedo Bitcoin itself rather than walk away empty-handed.  I'd do the same if his side got the upper hand.

legendary
Activity: 1904
Merit: 1074
I have to say, until recently I was confused about the actual proposed change to the block size.

We should not forget that Gavin is human too. It has to be difficult to try to change something when everyone is pulling in different directions.

How do you convince a group of people about something if not all of them are playing for the same team?

At first, I thought even Gavin was not playing for the Bitcoin team... but I now believe he has the best interest of Bitcoin in mind with the proposed change. {It's unfortunately clear that they want a clear leader with the Bitcoin-XT fork}

How do you lead a team if a single leader is not allowed? {It looks like Mike is not satisfied with Wladimir's leadership}

Would any company with a team of managers succeed if there were nobody with a final say?

All I want to say is... I am glad I am not a Bitcoin Core developer.
legendary
Activity: 1904
Merit: 1007
This is what I've posted on Reddit:
Quote
I have a couple of questions after listening to this episode. Mike was saying that we don't know what will happen if the backlog keeps growing.

Why don't we have some sort of "staging" blockchain where we can test what happens? Why don't we have a live laboratory for these kinds of tests? Are we only using math to predict stuff? ELI5!
legendary
Activity: 1148
Merit: 1014
In Satoshi I Trust
Good talk - and finally we are acting.


Common misconception: "20MB is too big of a jump."

Answer:
It's just a maximum block size limit. It does not mean we are going to instantly start making 20MB blocks; blocks will continue growing at their current pace. Perhaps the biggest misconception is that we will somehow suddenly start seeing 20MB blocks just because the limit is raised.
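A toy sketch of why that is, with hypothetical numbers (this is not Bitcoin Core's actual mining code): a block is assembled from whatever transactions are pending, so it only grows as large as demand, no matter where the cap sits.

Code:
MAX_BLOCK_SIZE = 20_000_000  # hypothetical 20MB cap, in bytes

def assemble_block(mempool):
    """mempool: list of (fee_per_byte, size_in_bytes) tuples."""
    block, used = [], 0
    for fee_rate, size in sorted(mempool, reverse=True):  # best fees first
        if used + size <= MAX_BLOCK_SIZE:
            block.append((fee_rate, size))
            used += size
    return block, used

# Today's demand: say 400 pending transactions of ~500 bytes each.
demo_pool = [(50, 500)] * 400
_, used = assemble_block(demo_pool)
print(used, "byte block under a", MAX_BLOCK_SIZE, "byte cap")  # ~200KB either way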
sr. member
Activity: 378
Merit: 257
Great interview, thanks.  I hope lots of people listen to this; it explains the fork pretty clearly, in easy-to-understand terms.  After all the FUD going around, it is nice to have some sanity for once.
legendary
Activity: 994
Merit: 1035
I wonder what could have happened if Gavin had forked to 20MB before anyone realized there was an error in that calculation. Gavin has definitely lost credibility after that mistake.

Everyone makes mistakes; Gavin is intelligent and reasonable and hasn't lost any credibility with me. I would like to see them acknowledge it, however (if they haven't already). It is just a bit confusing that they are still pushing for 20MB when, by his own logic, 8MB was the intended number.