
Topic: BUgcoin strikes back - page 2. (Read 6618 times)

legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
March 26, 2017, 12:52:34 AM

SegWit as proposed already will increase the size of the block and the burden on nodes very substantially.
 

Fair enough.  I guess this is where we can shake hands and agree to disagree.  Sure Segwit offers *some* on chain scaling, but too little too late for my taste.

I want to ask if the increase in block size through Segwit is not enough for the present number of transactions. Is it not already a big improvement? What do you think the average block size would be today if we were using Bitcoin Unlimited?

I am not a segwit expert.  I have heard it starts at 1.7 MB, but I have also heard that assumes all wallets in the world are using it, which probably isn't the case.

I think if we truly had 1.7 today it would be ok for the moment (assuming people that left bitcoin to start using altcoins came back), but the idea is to always stay WELL ahead of the curve.

Remember that demand is affected by supply in the sense that if people experience (or even anticipate) a congested network, they will be preemptively dissuaded from using Bitcoin. 

So 1.7 or 2 might be ok for a few months but hopefully we grow far beyond that.
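
For what it's worth, here is a rough sketch of where that 1.7 figure comes from (illustrative only; the witness share per transaction is an assumed number, not a measurement):

Code:
# Rough sketch of SegWit's effective block size under the 4M weight limit.
# Witness bytes count as 1 weight unit, non-witness bytes as 4.
MAX_BLOCK_WEIGHT = 4_000_000

def effective_block_size_mb(witness_share=0.6, adoption=1.0):
    """witness_share: assumed fraction of a typical tx's bytes that are witness
    data (signatures); adoption: fraction of transactions spending SegWit outputs."""
    w = witness_share * adoption
    weight_per_byte = w * 1 + (1 - w) * 4
    return MAX_BLOCK_WEIGHT / weight_per_byte / 1_000_000

print(effective_block_size_mb(adoption=1.0))  # ~1.8 MB if every wallet uses it
print(effective_block_size_mb(adoption=0.5))  # ~1.3 MB at half adoption
print(effective_block_size_mb(adoption=0.0))  # 1.0 MB if nobody uses it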
legendary
Activity: 2898
Merit: 1823
March 26, 2017, 12:26:18 AM

SegWit as proposed already will increase the size of the block and the burden on nodes very substantially.
 

Fair enough.  I guess this is where we can shake hands and agree to disagree.  Sure Segwit offers *some* on chain scaling, but too little too late for my taste.

I want to ask if the increase in block size through Segwit is not enough for the present number of transactions. Is it not already a big improvement? What do you think the average block size would be today if we were using Bitcoin Unlimited?
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
March 25, 2017, 09:28:54 PM
I think the decision to ignore mining in the reference client was the wrong one.

This, I wholeheartedly agree with. Two years ago core devs were saying mining had almost nothing to do with the core implementation, which was a very shortsighted view. The lack of emphasis on the delicate interplay between mining and the reference implementation of the bitcoin network protocol and the block chain was a major setback. Of course one could ask why I didn't get involved then, since I was busy hacking on mining code, and the answer is a simple one: I don't have any faith in my C++ coding skills, so it would have been presumptuous of me to try and hack on Bitcoin Core.

You know what?

That's the level of the argument we should be having about the big-blocks vs multi-layer debate. The problem is that bigblockers took it to the audience and now it's exclusively politics and we have a complete shitshow of a debate with people who have no idea what they are talking about.

If it was at the level set by people like dinofelis, who actually have very respectable points, then this would be MUCH better. The problem is that the bigblocker side has degenerated to the point where their position is untenable. I think this didn't start entirely as a problem of their own making, but from a separate problem, which is that of software development governance. But the reality is that every client they come up with is worse than the previous one, and we cannot abstract the underlying technical debate from the fact that their current dev base is completely incompetent, as 95%+ of devs with experience in this field (which is a HARD field) "roughly" agree with the Core Roadmap or are willing to make concessions to work with them (me included, although I stay very anonymous in dev because this could cost me my job in finance).

So I feel we don't have a way out that doesn't end up in some sort of radicalisation and infighting. They will resort to aggression, losing the argument on that alone, even though they do have fair points like those raised by dinofelis (I just disagree with his conclusions and with some key assumptions, but I actually agree with a lot of his take).

I'm seriously worried about this. If Bitcoin depends on sorting the problem of "perfect agreement in software development" then we are IN THE SHIT because that is never going to happen.

Good post.  I am all for honest and open dialogue.

The whole reason I am for EC is because it's very hard to get agreement, and it's all been around this single blocksize variable.

I am sure there are issues from 'the big blockers' (you probably see me as one of them, that's fine).

But I also see issues from the small blockers.  For example, refusal to even admit blocks were getting fuller, or censorship of even discussing the issues... and, as you said, 'aggression' -- plenty of that on both sides.

I'm really trying to make a personal effort to be as cordial as I can and see where some common ground can be established.
We're all bitcoiners.





donator
Activity: 980
Merit: 1000
March 25, 2017, 09:20:27 PM
I think the decision to ignore mining in the reference client was the wrong one.

This, I wholeheartedly agree with. Two years ago core devs were saying mining had almost nothing to do with the core implementation, which was a very shortsighted view. The lack of emphasis on the delicate interplay between mining and the reference implementation of the bitcoin network protocol and the block chain was a major setback. Of course one could ask why I didn't get involved then, since I was busy hacking on mining code, and the answer is a simple one: I don't have any faith in my C++ coding skills, so it would have been presumptuous of me to try and hack on Bitcoin Core.

You know what?

That's the level of the argument we should be having about the big-blocks vs multi-layer debate. The problem is that bigblockers took it to the audience and now it's exclusively politics and we have a complete shitshow of a debate with people who have no idea what they are talking about.

If it was at the level set by people like dinofelis, who actually have very respectable points, then this would be MUCH better. The problem is that the bigblocker side has degenerated to the point where their position is untenable. I think this didn't start entirely as a problem of their own making, but from a separate problem, which is that of software development governance. But the reality is that every client they come up with is worse than the previous one, and we cannot abstract the underlying technical debate from the fact that their current dev base is completely incompetent, as 95%+ of devs with experience in this field (which is a HARD field) "roughly" agree with the Core Roadmap or are willing to make concessions to work with them (me included, although I stay very anonymous in dev because this could cost me my job in finance).

So I feel we don't have a way out that doesn't end up in some sort of radicalisation and infighting. They will resort to aggression, losing the argument on that alone, even though they do have fair points like those raised by dinofelis (I just disagree with his conclusions and with some key assumptions, but I actually agree with a lot of his take).

I'm seriously worried about this. If Bitcoin depends on sorting the problem of "perfect agreement in software development" then we are IN THE SHIT because that is never going to happen.
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
March 25, 2017, 08:35:55 PM
I think the decision to ignore mining in the reference client was the wrong one.

This, I wholeheartedly agree with. Two years ago core devs were saying mining had almost nothing to do with the core implementation, which was a very shortsighted view. The lack of emphasis on the delicate interplay between mining and the reference implementation of the bitcoin network protocol and the block chain was a major setback. Of course one could ask why I didn't get involved then, since I was busy hacking on mining code, and the answer is a simple one: I don't have any faith in my C++ coding skills, so it would have been presumptuous of me to try and hack on Bitcoin Core.
legendary
Activity: 1120
Merit: 1010
March 25, 2017, 08:21:01 PM
Advancements in either p2pool software or something similar would go a long way in starting to alleviate the problems associated with mining centralization. Unfortunately, there hasn't been enough incentive to advance these things. I think instead of removing mining from the reference client, it should have adopted and improved upon the p2pool software.
It's not from lack of incentive. Many of us have spent a lot of man-hours on discussion, theories and code in mining that the non-mining world is probably not remotely aware of and is quick to dismiss as "greedy miners." The idea behind p2pool was sound, but the reality was that the design is fundamentally flawed in a way that cannot be fixed, making it worse to mine on than a regular pool. P2pool is ultimately just a merge-mined blockchain on top of the bitcoin blockchain: mining on the merged chain means you're simply solo mining on a chain with slightly lower difficulty, yet still far too high a diff. After extensive discussion and investigation it is clear that these flaws cannot be fixed to make it even as attractive as regular pooled mining, let alone better. The design of the current bitcoin proof of work itself means that will always be the case. Without a massive change to the blockchain and proof-of-work design, pooled mining will always be possible, and the "p2pool" design will never be as good. A distributed, peer-to-peer proof-of-work design that intrinsically does not lend itself to pooled mining (without even changing from sha256d, so existing hardware could continue mining) would indeed be a solution, but unfortunately p2pool is not it and cannot be made to be it.

Sure, that's why I added "or something similar". If not that specific implementation, then something else with similar goals. Also, I was talking about the past (specifically when mining was removed from the reference client). If it had been added then, perhaps enough eyes would have been on it to either improve it or, if that's impossible as you say, replace it entirely with something superior. I think the decision to ignore mining in the reference client was the wrong one.

I think pooled mining has had much greater incentive behind it, because pool operators stand to earn money from improving it and giving their hash rate providers the best experience.
member
Activity: 77
Merit: 10
March 25, 2017, 06:37:53 PM
legendary
Activity: 3430
Merit: 3071
March 25, 2017, 06:25:26 PM
Advancements in either p2pool software or something similar would go a long way in starting to alleviate the problems associated with mining centralization. Unfortunately, there hasn't been enough incentive to advance these things. I think instead of removing mining from the reference client, it should have adopted and improved upon the p2pool software.
It's not from lack of incentive. Many of us have spent a lot of man-hours on discussion, theories and code in mining that the non-mining world is probably not remotely aware of and is quick to dismiss as "greedy miners." The idea behind p2pool was sound, but the reality was that the design is fundamentally flawed in a way that cannot be fixed, making it worse to mine on than a regular pool. P2pool is ultimately just a merge-mined blockchain on top of the bitcoin blockchain: mining on the merged chain means you're simply solo mining on a chain with slightly lower difficulty, yet still far too high a diff. After extensive discussion and investigation it is clear that these flaws cannot be fixed to make it even as attractive as regular pooled mining, let alone better. The design of the current bitcoin proof of work itself means that will always be the case. Without a massive change to the blockchain and proof-of-work design, pooled mining will always be possible, and the "p2pool" design will never be as good. A distributed, peer-to-peer proof-of-work design that intrinsically does not lend itself to pooled mining (without even changing from sha256d, so existing hardware could continue mining) would indeed be a solution, but unfortunately p2pool is not it and cannot be made to be it.

Got to agree with ck on this one. p2pool, as much as I like it, struggled more and more as the barriers to entry in mining (both difficulty and ASIC price gouging) clearly delineated its shortcomings.

I see Holliday's general point too: I think a different design for a p2pool system could be far more successful, and it would be far better if a proof-of-work redesign took that into account. I wonder if that's feasible, but we'll see. Certainly, when only difficulty and economies of scale were the barriers to entry in mining (i.e. the GPU/FPGA days), p2pool had a far higher percentage of the mining market (somewhere close to 10% at one point, IIRC).
donator
Activity: 980
Merit: 1000
March 25, 2017, 06:07:37 PM
No, this is unavoidable for ANY form of PoW system in the long run.  It IS already the case BTW: 14 miner pools have essentially all the hash rate.

The reason is not "block size" or whatever.  The reason is the lottery of PoW, and the economies of scale.

Solo mining is not done much any more, because with solo mining, you win ONE BLOCK every two years or so.  That's too much of a lottery.

If you don't want more than 10% income fluctuation (RMS value) in 1 week of your income, it means that you must be part of a team that "wins 100 times" during a week.  As there are 1000 blocks in 1 week, you must hence be part of a pool that has 10% of the total hash rate.  --> there can be only 10 such pools !

Even if you accept larger fluctuations of income, there will at most be a few tens of mining pools.

Now, these mining pools need good network connections, to their miners, and to other mining pools, because every second lost is a second of hash rate lost.  As mining pools don't trust one another, they want to get good links to SEVERAL of their competitors, to avoid the possibility of "selfish mining" which needs variable network delays to get your private block in front of the public block.  

So, AUTOMATICALLY, this ecosystem will evolve towards "a few tens of pools with very good data connections and big data centres".

As an owner of mining gear, you have every interest in being in a big pool; but you don't want pools to become monopolies, because then they will start eating off your fees. So as an owner of mining gear, you want there to be "a few big pools". In order for your mining gear to be efficiently used, you want to be able to have a good data connection to your host pool --> they need good data centres with good connections to all of their miners.

Once that is the case, the above topology follows automatically. It has not much to do with block size. It is intrinsic to a PoW system with specialized hardware (ASICs). It was built into bitcoin from the start.



Automatically, if nothing else is done on top of this system, and if blocks are allowed to grow freely, this system will consolidate into one node-miner and will make zero sense. It won't be a censorship-resistant transaction system and it will be COMPLETELY POINTLESS.

That is why Bitcoin, right now, is a work in progress. Allowed to concentrate down to one, it's a convoluted, slow data structure that makes no sense whatsoever. It's like downloading files from a standalone BitTorrent server running on your own computer.
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
March 25, 2017, 05:51:40 PM
Advancements in either p2pool software or something similar would go a long way in starting to alleviate the problems associated with mining centralization. Unfortunately, there hasn't been enough incentive to advance these things. I think instead of removing mining from the reference client, it should have adopted and improved upon the p2pool software.
It's not from lack of incentive. Many of us have spent a lot of man-hours on discussion, theories and code in mining that the non-mining world is probably not remotely aware of and is quick to dismiss as "greedy miners." The idea behind p2pool was sound, but the reality was that the design is fundamentally flawed in a way that cannot be fixed, making it worse to mine on than a regular pool. P2pool is ultimately just a merge-mined blockchain on top of the bitcoin blockchain: mining on the merged chain means you're simply solo mining on a chain with slightly lower difficulty, yet still far too high a diff. After extensive discussion and investigation it is clear that these flaws cannot be fixed to make it even as attractive as regular pooled mining, let alone better. The design of the current bitcoin proof of work itself means that will always be the case. Without a massive change to the blockchain and proof-of-work design, pooled mining will always be possible, and the "p2pool" design will never be as good. A distributed, peer-to-peer proof-of-work design that intrinsically does not lend itself to pooled mining (without even changing from sha256d, so existing hardware could continue mining) would indeed be a solution, but unfortunately p2pool is not it and cannot be made to be it.
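
To put some illustrative numbers on why the share chain doesn't help a small miner (a sketch with assumed values; the ~30-second share target and the hashrate fractions are assumptions, not measurements):

Code:
# Sketch: even on p2pool's lower-difficulty share chain, a small miner finds
# shares rarely, so payouts stay almost as lumpy as low-difficulty solo mining.
SHARE_INTERVAL_S = 30      # assumed share-chain target: ~1 share every 30 s pool-wide
SECONDS_PER_DAY = 86_400

def shares_per_day(fraction_of_pool_hashrate):
    return (SECONDS_PER_DAY / SHARE_INTERVAL_S) * fraction_of_pool_hashrate

for frac in (0.01, 0.001, 0.0001):
    n = shares_per_day(frac)
    rms = 100.0 / n ** 0.5   # relative payout fluctuation, Poisson approximation
    print(f"{frac:.2%} of pool hashrate: ~{n:.1f} shares/day, ~{rms:.0f}% daily payout swing")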
legendary
Activity: 1120
Merit: 1010
March 25, 2017, 03:05:42 PM
Quote
1) 1 central backbone on which the principal miner pools are connected amongst themselves (from 5 to 14, say)

2) mostly direct server-client links to these 5 - 14 data centres by all "serious" client nodes, in order to get the most reliable block chain directly from the source as quickly as possible

3) at the periphery, small amateur nodes connecting in P2P mode to the serious client nodes, if they don't manage to go to one of the main servers.

You just described a nice plan for your future BTU network, not anything I'd have the slightest interest in.

It looks like a completely irreconcilable vision of what Bitcoin should be, so let's make the split clean (and good luck with those devs).

No, this is unavoidable for ANY form of PoW system in the long run.  It IS already the case BTW: 14 miner pools have essentially all the hash rate.

The reason is not "block size" or whatever.  The reason is the lottery of PoW, and the economies of scale.

Solo mining is not done much any more, because with solo mining, you win ONE BLOCK every two years or so.  That's too much of a lottery.

If you don't want more than 10% income fluctuation (RMS value) in 1 week of your income, it means that you must be part of a team that "wins 100 times" during a week.  As there are 1000 blocks in 1 week, you must hence be part of a pool that has 10% of the total hash rate.  --> there can be only 10 such pools !

Even if you accept larger fluctuations of income, there will at most be a few tens of mining pools.

Now, these mining pools need good network connections, to their miners, and to other mining pools, because every second lost is a second of hash rate lost.  As mining pools don't trust one another, they want to get good links to SEVERAL of their competitors, to avoid the possibility of "selfish mining" which needs variable network delays to get your private block in front of the public block.  

So, AUTOMATICALLY, this ecosystem will evolve towards "a few tens of pools with very good data connections and big data centres".

As an owner of mining gear, you have every interest in being in a big pool; but you don't want pools to become monopolies, because then they will start eating off your fees. So as an owner of mining gear, you want there to be "a few big pools". In order for your mining gear to be efficiently used, you want to be able to have a good data connection to your host pool --> they need good data centres with good connections to all of their miners.

Once that is the case, the above topology follows automatically. It has not much to do with block size. It is intrinsic to a PoW system with specialized hardware (ASICs). It was built into bitcoin from the start.

Advancements in either p2pool software or something similar would go a long way in starting to alleviate the problems associated with mining centralization. Unfortunately, there hasn't been enough incentive to advance these things. I think instead of removing mining from the reference client, it should have adopted and improved upon the p2pool software.

If hash rate providers had more incentive to be full nodes and retain their "vote", thus becoming actual miners, we might get a clearer picture of how much support these various proposals can muster.

That ship may have sailed long ago, but I think there is clearly an argument to be made that we should reconsider.
legendary
Activity: 1120
Merit: 1010
March 25, 2017, 02:54:12 PM
Access to high-speed uploads is relatively limited around the world. My node wrecks my home connection and I have the fastest provider in East London.

What? Happily running multiple nodes and coins in rural England.

Either you've done something to limit your node, or you have quite fast internet speeds (upload specifically), or no one is using your node for other reasons.

How about giving us some numbers?

I experienced the same thing muyuu is talking about. My node would happily saturate my upload bandwidth to the point where basic web browsing was severely impacted. I had to gimp my node (by limiting max connections) until finally switching ISPs.

Now my node, in stock configuration with around 60 connections on average, uploads well over 1 terabyte per month, and my quite expensive internet plan can handle that. My internet bandwidth is in the top percentile, obviously, and unavailable in all but the most developed locations.
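
As a back-of-envelope check on that terabyte figure (all numbers below are my own guesses, not measurements from my node):

Code:
# Back-of-envelope: where a listening full node's monthly upload can go.
BLOCKS_PER_MONTH = 6 * 24 * 30   # ~4320 blocks at one per 10 minutes
BLOCK_MB = 1.0                   # roughly full 1 MB blocks
RELAY_PEERS = 20                 # peers that fetch new blocks/txs from you (guess)
SYNCING_PEERS = 5                # fresh nodes pulling the historical chain (guess)
CHAIN_GB = 110                   # approximate chain size in early 2017

block_relay_gb = BLOCK_MB * BLOCKS_PER_MONTH * RELAY_PEERS / 1000
history_gb = SYNCING_PEERS * CHAIN_GB
print(f"block/tx relay: ~{block_relay_gb:.0f} GB, serving syncing nodes: ~{history_gb:.0f} GB")
# roughly 86 GB + 550 GB per month, before counting transaction gossip and inv overhead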
hero member
Activity: 770
Merit: 629
March 25, 2017, 02:51:14 PM
Quote
1) 1 central backbone on which the principal miner pools are connected amongst themselves (from 5 to 14, say)

2) mostly direct server-client links to these 5 - 14 data centres by all "serious" client nodes, in order to get the most reliable block chain directly from the source as quickly as possible

3) at the periphery, small amateur nodes connecting in P2P mode to the serious client nodes, if they don't manage to go to one of the main servers.

You just described a nice plan for your future BTU network, not anything I'd have the slightest interest in.

It looks like a completely irreconcilable vision of what Bitcoin should be, so let's make the split clean (and good luck with those devs).

No, this is unavoidable for ANY form of PoW system in the long run.  It IS already the case BTW: 14 miner pools have essentially all the hash rate.

The reason is not "block size" or whatever.  The reason is the lottery of PoW, and the economies of scale.

Solo mining is not done much any more, because with solo mining, you win ONE BLOCK every two years or so.  That's too much of a lottery.

If you don't want more than 10% income fluctuation (RMS value) in 1 week of your income, it means that you must be part of a team that "wins 100 times" during a week.  As there are 1000 blocks in 1 week, you must hence be part of a pool that has 10% of the total hash rate.  --> there can be only 10 such pools !

Even if you accept larger fluctuations of income, there will at most be a few tens of mining pools.

Now, these mining pools need good network connections, to their miners, and to other mining pools, because every second lost is a second of hash rate lost.  As mining pools don't trust one another, they want to get good links to SEVERAL of their competitors, to avoid the possibility of "selfish mining" which needs variable network delays to get your private block in front of the public block.  

So, AUTOMATICALLY, this ecosystem will evolve towards "a few tens of pools with very good data connections and big data centres".

As an owner of mining gear, you have every interest in being in a big pool; but you don't want pools to become monopolies, because then they will start eating off your fees. So as an owner of mining gear, you want there to be "a few big pools". In order for your mining gear to be efficiently used, you want to be able to have a good data connection to your host pool --> they need good data centres with good connections to all of their miners.

Once that is the case, the above topology follows automatically. It has not much to do with block size. It is intrinsic to a PoW system with specialized hardware (ASICs). It was built into bitcoin from the start.
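
The arithmetic behind that, as a quick sketch (the relative fluctuation of a Poisson count is roughly 1/sqrt(N); the numbers are illustrative):

Code:
# Sketch of the fluctuation argument: block finds per week are roughly Poisson,
# so the relative RMS fluctuation of weekly income is ~ 1/sqrt(expected blocks).
import math

BLOCKS_PER_WEEK = 6 * 24 * 7   # ~1008 at one block per 10 minutes

def weekly_rms(pool_hashrate_share):
    expected_blocks = BLOCKS_PER_WEEK * pool_hashrate_share
    return 1 / math.sqrt(expected_blocks)

for share in (0.10, 0.01, 0.001):
    print(f"{share:.1%} of hashrate -> ~{weekly_rms(share):.0%} weekly income fluctuation")
# 10% of the hashrate gives ~10% weekly swings; smaller pools get much bumpier,
# which is why only on the order of ten pools of that size can coexist.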

donator
Activity: 980
Merit: 1000
March 25, 2017, 02:37:35 PM
Quote
1) 1 central backbone on which the principal miner pools are connected amongst themselves (from 5 to 14, say)

2) mostly direct server-client links to these 5 - 14 data centres by all "serious" client nodes, in order to get the most reliable block chain directly from the source as quickly as possible

3) at the periphery, small amateur nodes connecting in P2P mode to the serious client nodes, if they don't manage to go to one of the main servers.

You just described a nice plan for your future BTU network, not anything I'd have the slightest interest in.

It looks like a completely irreconcilable vision of what Bitcoin should be, so let's make the split clean (and good luck with those devs).
hero member
Activity: 770
Merit: 629
March 25, 2017, 02:14:46 PM
The issue of compensating nodes needs to be addressed before placing an even higher burden on them than they already carry. And it goes up exponentially with block size; it's a P2P gossip protocol, not "downloading websites" as Gavin used to say - which is moronic.

Actually, it IS "downloading websites".  There are essentially 14 producers of the block chain: the main pools.  Between them, they have more than 80-90% of the hash rate.  If these pools are not idiots, they are linked amongst themselves by a high-speed backbone network in, preferably (if they don't trust one another), an almost full mesh configuration.  These 14 pools' data centres are the principal servers of the block chain.  They make the block chain and hence are the ones having it as the primary source.
 
Any direct connection of your full node to one of their data centres (their powerful "node" that can probably handle hundreds of thousands of connections) gets the block chain directly.  If you get the block chain through a P2P connection from another, non-important node, you are just using that other node as a proxy server of the block chain instead of getting it from the main servers (the 14 data centres).

I say 14, but in fact, 5 is enough: 5 miners have more than half of the hash rate, and they, for sure, are on a high speed backbone.  It is in their interest to set up very strong nodes, so that the rest of the full nodes can connect directly to them, and not through a clumsy and slow P2P network.

So essentially, mining pool concentration automatically turns the bitcoin network into:

1) 1 central backbone on which the principal miner pools are connected amongst themselves (from 5 to 14, say)

2) mostly direct server-client links to these 5 - 14 data centres by all "serious" client nodes, in order to get the most reliable block chain directly from the source as quickly as possible

3) at the periphery, small amateur nodes connecting in P2P mode to the serious client nodes, if they don't manage to go to one of the main servers.

Bitcoin's mining architecture does not favour a P2P network; it favours a central backbone-plus-spokes network.  Note that it is not a "single server" network: all important miner pools (5 - 14) are independent servers of the same block chain.  You can connect your "serious node" in P2P mode to several miner data centres in order to be sure that none of them is cheating on you and giving you a "retarded" block chain.

As the block chain is sort of "cryptographically signed" with proof of work, the fact that the source is centralized doesn't matter too much for its quality; nobody can fake a block chain without spending huge amounts of wasted hash rate.  The only thing that can happen is that you get a "late" block chain, but to avoid that, it is best to get it DIRECTLY from those that produce it: the main mining pool data centres, instead of letting it "drip through" the P2P network at a snail's pace.
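
A rough sketch of what a few seconds of propagation delay actually cost a miner (standard back-of-envelope arithmetic with my own numbers, nothing measured):

Code:
# Sketch: if a new block reaches you `delay` seconds late, the work you do in
# that window is on a stale tip. With exponential inter-block times (mean 600 s),
# the expected fraction of wasted work is about 1 - exp(-delay / 600).
import math

MEAN_BLOCK_INTERVAL_S = 600.0

def wasted_work_fraction(delay_s):
    return 1.0 - math.exp(-delay_s / MEAN_BLOCK_INTERVAL_S)

for delay in (0.5, 5.0, 30.0):
    print(f"{delay:>4.1f} s late -> ~{wasted_work_fraction(delay):.2%} of hash rate wasted")
# roughly 0.08%, 0.83% and 4.9%: more than enough margin to justify fast, direct links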

sr. member
Activity: 476
Merit: 501
March 25, 2017, 02:06:58 PM
Access to high-speed uploads is relatively limited around the world. My node wrecks my home connection and I have the fastest provider in East London.

What? Happily running multiple nodes and coins in rural England.
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
March 25, 2017, 02:01:38 PM

Up to this date, there's been zero on chain scaling.  Segwit's not even here yet and has less support from miners than BU.  Meanwhile, companies like Fiverr have had to drop Bitcoin because the fees and confirmation times are too high/unreliable.

That's because a bunch of idiots/shills are dead set on blocking it. Then they blame the devs that it doesn't happen.

Total strawman.

Miners are smart enough to be in the Bitcoin mining business, but are going to listen to "idiots/shills" instead of making up their own mind and picking a sensible solution?  Yeah that sounds legit.

Why not just do a simple blocksize increase to 2 MB like Jeff Garzik suggested over a year ago?

Because Greg doesn't want to.  Doesn't fit into his roadmap.  Doesn't fit into his Blockstream business plan.  And the rest of Core has been going along with it.

Sorry, but that's the truth.
donator
Activity: 980
Merit: 1000
March 25, 2017, 01:55:09 PM

Up to this date, there's been zero on chain scaling.  Segwit's not even here yet and has less support from miners than BU.  Meanwhile, companies like Fiverr have had to drop Bitcoin because the fees and confirmation times are too high/unreliable.

That's because a bunch of idiots/shills are dead set on blocking it. Then they blame the devs that it doesn't happen.
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
March 25, 2017, 01:53:02 PM

SegWit as proposed already will increase the size of the block and the burden on nodes very substantially.
 

Fair enough.  I guess this is where we can shake hands and agree to disagree.  Sure Segwit offers *some* on chain scaling, but too little too late for my taste.

It's not debatable that the extra burden will be very considerable, and the increase in tx/s will be as well with the coming improvements that SegWit makes possible.

You think putting a burden on nodes is a goal, rather than increasing tx/s capacity? Because you talk of increasing block size as if it were a feature.

Get a grip.

Up to this date, there's been zero on chain scaling.  Segwit's not even here yet and has less support from miners than BU.  Meanwhile, companies like Fiverr have had to drop Bitcoin because the fees and confirmation times are too high/unreliable.

  
sr. member
Activity: 287
Merit: 250
March 25, 2017, 01:51:46 PM

SegWit as proposed already will increase the size of the block and the burden on nodes very substantially.
 

Fair enough.  I guess this is where we can shake hands and agree to disagree.  Sure Segwit offers *some* on chain scaling, but too little too late for my taste.
Is there reason to worry about the future of Bitcoin in this case? Why do the prices of the cryptocurrency so often rise and then fall lately? I'm worried about my coins, but whenever new information arrives I don't know how to react to it. Probably because of this confusion, I will keep suffering more and more.