
Topic: delete (Read 113403 times)

sr. member
Activity: 370
Merit: 250
April 01, 2014, 02:32:27 PM

just quick look
int64 LatestBlockTime: shouldn't this be unsigned? Do you store negative values in BlockTime?

I wonder if an attacker can start in the past and move his way backwards in time until he appears in the future.

In principle, you are right. In practice it does not matter.
The origin of LatestBlockTime is the block timestamp, which is an unsigned 32-bit integer. Of course it would make sense to use unsigned integers, but BlockLastSolved->GetBlockTime() also returns a signed 64-bit integer. I don't know why; maybe using signed has made some calculations easier, and since the origin is 32-bit there is no problem.

I assume the timestamp is Unix time?
So what happens after 06:28:15 UTC on Sun, 7 February 2106?
You will probably have fixed it by then, but let's say an attacker fast-forwards to this day and then wraps around through 1970 back to the current day: are you safe?
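For what it's worth, the 2106 concern can be sketched in a few lines. This is an illustrative standalone snippet, not code from the client: it assumes the header stores the timestamp as a uint32 (as in Bitcoin-derived coins) and shows that widening to int64 never produces a negative value, while uint32 arithmetic past the limit wraps back to the 1970 epoch.

```cpp
#include <cstdint>

// Illustrative sketch, not client code. Bitcoin-style headers store the
// timestamp as an unsigned 32-bit integer, which runs out at
// 0xFFFFFFFF seconds = 06:28:15 UTC, 7 February 2106.
const uint32_t MAX_HEADER_TIME = 0xFFFFFFFFu;

// Widening to int64, as GetBlockTime() effectively does, can never yield
// a negative value: every uint32 fits in the non-negative range of int64.
int64_t WidenBlockTime(uint32_t nTime) {
    return static_cast<int64_t>(nTime);
}

// But arithmetic performed in uint32 before widening wraps modulo 2^32:
// one second past the 2106 limit lands back at the 1970 epoch.
uint32_t AdvanceHeaderTime(uint32_t nTime, uint32_t seconds) {
    return nTime + seconds;
}
```

So the signed/unsigned choice is harmless for range; the real 2106 question is the wraparound of the 32-bit header field itself.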
legendary
Activity: 2072
Merit: 1049
┴puoʎǝq ʞool┴
April 01, 2014, 12:32:03 PM
seems like aurora is still going strong?
sr. member
Activity: 310
Merit: 250
April 01, 2014, 12:29:44 PM
@Nite69


I sent you a PM about your fix.


~BCX~

Wow, seems like there is a hint of cooperation at the end Smiley That's nice.
sr. member
Activity: 477
Merit: 500
April 01, 2014, 12:14:40 PM

just quick look
int64 LatestBlockTime: shouldn't this be unsigned? Do you store negative values in BlockTime?

I wonder if an attacker can start in the past and move his way backwards in time until he appears in the future.

In principle, you are right. In practice it does not matter.
The origin of LatestBlockTime is the block timestamp, which is an unsigned 32-bit integer. Of course it would make sense to use unsigned integers, but BlockLastSolved->GetBlockTime() also returns a signed 64-bit integer. I don't know why; maybe using signed has made some calculations easier, and since the origin is 32-bit there is no problem.
legendary
Activity: 2940
Merit: 1090
April 01, 2014, 11:58:22 AM
It's a lot more than the ~1GH the (known) attacker is supposedly using, though. Can ~1GH beat ~5GH using timewarp? I guess we will find out.

Agreed though that it is pathetically low. We know that just a stupid meme can conjure up 100+ gigahashes almost overnight, so any less is just asking for a "PWN the blockchain(s)!" meme to pop up and PWN it...

-MarkM-
sr. member
Activity: 420
Merit: 263
let's make a deal.
April 01, 2014, 11:52:46 AM
auroracoin is ~5ghash/sec.  still v. vulnerable for a coin of this size and value.

http://bitinfocharts.com/comparison/hashrate-aur.html

this was up from the 1.2 ghash/sec when block 5400 rolled around.

legendary
Activity: 2940
Merit: 1090
April 01, 2014, 11:37:37 AM
If the attack works the price might go down. How is their hashing power looking now? Does it seem like enough yet to counter such an attack? If not likely they'll be available cheap once the attack takes effect, if exchanges re-list them after some kind of damage control or recovery plan is done after the attack.

Realistically they ought not be listed on exchanges at all until they fix their hashing power, as being listed increases the number of people who will be hurt and the incentive for attackers to bother attacking.

-MarkM-
newbie
Activity: 2
Merit: 0
April 01, 2014, 11:36:57 AM
It's bound to be a shit coin.
full member
Activity: 159
Merit: 100
April 01, 2014, 11:31:27 AM
Is this a good time to buy some AUR?
The price volatility is very large; maybe it is a chance.
legendary
Activity: 2940
Merit: 1090
April 01, 2014, 11:27:33 AM
The work is at least equal to the difficulty (well, actually, reciprocally equal; congruent, so to speak).

Adding up difficulty instead of work would keep the lucky hashes that happen to be way the heck better than their target difficulty requires from counting as any better than the worst possible hash that suffices to meet the target. If the attacker is getting way the heck more hashes into the blockchain (way the heck more blocks into the blockchain), then not counting any work in excess of target on each block should statistically tend to lose the attacker more excess work than the defender. But if the attacker is using lower-difficulty blocks, then still more of the attacker's hashes will tend to meet the target, so the attacker will have more of their hashes count.

Maybe the attacker's advantage might be less but I suspect the attacker would still have the advantage.

-MarkM-
hero member
Activity: 658
Merit: 500
April 01, 2014, 11:11:58 AM
What iopq is saying is the sum of every block's difficulty, which should be very close to the total amount of work done in all but the short term.  (Basically the discrete version of an integral; not sure if there's a mathematical name for it or if you'd just call it a sum.)

That seems like a reasonable idea, although it may make it easier to make very small forks.  (You can mine one block slower than the true chain, which causes your difficulty to go up; then you only need to catch up to the true chain rather than surpass it to gain preference - something along those lines.  You could probably fix this by requiring a minimum difficulty at least as great as that of the last 4 blocks or so in order to override an equal/longer block.)
well then you'd need to mine two blocks to orphan one
and one of them at higher difficulty

but any counter-measures would depend on the exact difficulty-adjustment algorithm and whether the difficulty adjusts every block or every x blocks
someone more knowledgeable than me could probably tell me why my idea is bad
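The scoring rule being debated in these posts can be sketched concretely. This is a toy model, not any coin's actual selection rule: the difficulty values are made up, and real clients compare big-integer chain work rather than doubles.

```cpp
#include <numeric>
#include <vector>

// Toy model of the proposed rule: score a chain by the sum of each
// block's difficulty instead of by its block count (height).
double ChainScore(const std::vector<double>& blockDifficulties) {
    return std::accumulate(blockDifficulties.begin(),
                           blockDifficulties.end(), 0.0);
}

// Under this rule a long run of cheap low-difficulty blocks can still
// lose to a shorter run of high-difficulty blocks.
bool FirstChainWins(const std::vector<double>& a,
                    const std::vector<double>& b) {
    return ChainScore(a) > ChainScore(b);
}
```

For example, an honest chain of two difficulty-4 blocks (score 8) beats a timewarped chain of six difficulty-1 blocks (score 6), even though the latter is "longer" by block count.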
sr. member
Activity: 370
Merit: 250
April 01, 2014, 05:08:47 AM
Actually mikey, you might want to ask Nite69, the developer brought on by the Auroracoin team to help prepare the KGW TW fix. Nite69 realizes there is a KGW TW underway, and by extension so does the Auroracoin team, even though they won't and can't admit it.

About the vulnerability BCX found: I'm quite sure this will fix it, like any time warp vulnerability. Shame on us, we should have fixed this earlier Undecided
Code:
+	int64 LatestBlockTime = BlockLastSolved->GetBlockTime();
 	for (unsigned int i = 1; BlockReading && BlockReading->nHeight > 0; i++) {
 		if (PastBlocksMax > 0 && i > PastBlocksMax) { break; }
 		PastBlocksMass++;
@@ -894,8 +895,10 @@ unsigned int static GravityWell(const CBlockIndex* pindexLast, const CBlock *pbl
 		if (i == 1) { PastDifficultyAverage.SetCompact(BlockReading->nBits); }
 		else { PastDifficultyAverage = ((CBigNum().SetCompact(BlockReading->nBits) - PastDifficultyAveragePrev) / i) + PastDifficultyAveragePrev; }
 		PastDifficultyAveragePrev = PastDifficultyAverage;
-
-		PastRateActualSeconds = BlockLastSolved->GetBlockTime() - BlockReading->GetBlockTime();
+		if (LatestBlockTime < BlockReading->GetBlockTime()) {
+			LatestBlockTime = BlockReading->GetBlockTime();
+		}
+		PastRateActualSeconds = LatestBlockTime - BlockReading->GetBlockTime();
 		PastRateTargetSeconds = TargetBlocksSpacingSeconds * PastBlocksMass;
 		PastRateAdjustmentRatio = double(1);
 		if (PastRateActualSeconds < 0) { PastRateActualSeconds = 0; }
just quick look
int64 LatestBlockTime: shouldn't this be unsigned? Do you store negative values in BlockTime?

I wonder if an attacker can start in the past and move his way backwards in time until he appears in the future.
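The heart of the patch can be restated as a standalone sketch. This is a simplification for illustration (the real loop lives inside GravityWell and updates more state): block times are passed newest-first, and the running maximum stops a timestamp rolled back on the tip, or pushed forward on an earlier block, from producing a negative elapsed time.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Simplified model of the clamp in the patch: walk block times from the
// newest block backwards, carrying the latest timestamp seen so far, and
// measure elapsed time against that maximum rather than against the tip.
int64_t PastRateActualSeconds(const std::vector<int64_t>& timesNewestFirst) {
    int64_t latestBlockTime = timesNewestFirst[0];   // BlockLastSolved
    int64_t actualSeconds = 0;
    for (std::size_t i = 1; i < timesNewestFirst.size(); ++i) {
        if (latestBlockTime < timesNewestFirst[i]) {
            latestBlockTime = timesNewestFirst[i];   // the added clamp
        }
        actualSeconds = latestBlockTime - timesNewestFirst[i];
    }
    return std::max<int64_t>(actualSeconds, 0);      // existing floor at 0
}
```

Without the clamp, a block in the window stamped later than the tip would make tip-time minus block-time negative, hit the floor at 0, and trick the retarget into thinking blocks were found instantly.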
legendary
Activity: 996
Merit: 1013
April 01, 2014, 03:44:01 AM

What would be the potential drawbacks / side effects of this fix?


I'm wondering if there are any cases where a "legit" block may have an earlier timestamp than the latest block time.
For instance because of different clock settings and daylight saving time adjustments.

Have there been a lot of rejected blocks since the new fix?
full member
Activity: 145
Merit: 100
April 01, 2014, 03:42:45 AM
What iopq is saying is the sum of every block's difficulty, which should be very close to the total amount of work done in all but the short term.  (Basically the discrete version of an integral; not sure if there's a mathematical name for it or if you'd just call it a sum.)

That seems like a reasonable idea, although it may make it easier to make very small forks.  (You can mine one block slower than the true chain, which causes your difficulty to go up; then you only need to catch up to the true chain rather than surpass it to gain preference - something along those lines.  You could probably fix this by requiring a minimum difficulty at least as great as that of the last 4 blocks or so in order to override an equal/longer block.)
hero member
Activity: 658
Merit: 500
April 01, 2014, 03:30:58 AM
Shouldn't the coin then say that the chain with the most work "block * difficulty" is better than a chain with more blocks at a lower difficulty? As far as I understand every coin out there considers the highest blockchain to be the best. But wouldn't the highest sum of blocks * difficulty be more secure from time warp exploits?

On the contrary, it seems. The timewarp attacker spawns many more blocks than the defender, so if more blocks meant more length/height in the measure of which chain is the winner, the attacker would have even more of an advantage.

Maybe divide by the number of blocks instead of multiplying?

But likely that would lead instead to an attack by saving up a chain of few blocks that each took massive amounts of time/work to create.

So either way the current way of adding up the work is likely the best, it is just unfortunate that there is no record of how much work was actually done (how many hashes actually done) by hashers between hashes that got lucky (got enough work to meet their target and thus get into the chain and thus count at all toward total work).

-MarkM-

well he's mining at a difficulty several times lower so that's why he's generating more blocks

sum(block * the difficulty at the time of that block) = total chain
the exploit gives a lot of "cheap" blocks at which point the difficulty is low, so the total sum is low

if the height was calculated using the difficulty at the time the block was found, the real chain has fewer blocks but at a much higher difficulty
I postulate that hashpower * time = the height of the blockchain under my calculation, which means the real chain wins every time.
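The postulate can be checked with back-of-the-envelope arithmetic. This assumes the Bitcoin-style convention that a block at difficulty D takes about D * 2^32 hash attempts on average (an assumption, not something stated in the thread); the numbers are illustrative.

```cpp
// Back-of-the-envelope check of the postulate, assuming the Bitcoin-style
// convention that one block at difficulty D costs about D * 2^32 hashes
// on average. Expected blocks = hashrate * time / (D * 2^32); scoring each
// block as D makes the difficulty cancel, leaving hashrate * time / 2^32.
double ExpectedChainScore(double hashrate, double seconds, double difficulty) {
    double expectedBlocks =
        hashrate * seconds / (difficulty * 4294967296.0); // 2^32
    return expectedBlocks * difficulty;
}
```

So under a sum-of-difficulty score, the attacker's choice of a low difficulty buys more blocks but no more score, and the chain with more hashpower is expected to win.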
legendary
Activity: 2940
Merit: 1090
April 01, 2014, 03:26:06 AM
Shouldn't the coin then say that the chain with the most work "block * difficulty" is better than a chain with more blocks at a lower difficulty? As far as I understand every coin out there considers the highest blockchain to be the best. But wouldn't the highest sum of blocks * difficulty be more secure from time warp exploits?

On the contrary, it seems. The timewarp attacker spawns many more blocks than the defender, so if more blocks meant more length/height in the measure of which chain is the winner, the attacker would have even more of an advantage.

Maybe divide by the number of blocks instead of multiplying?

But likely that would lead instead to an attack by saving up a chain of few blocks that each took massive amounts of time/work to create.

So either way the current way of adding up the work is likely the best, it is just unfortunate that there is no record of how much work was actually done (how many hashes actually done) by hashers between hashes that got lucky (got enough work to meet their target and thus get into the chain and thus count at all toward total work).

-MarkM-
hero member
Activity: 658
Merit: 500
April 01, 2014, 03:20:06 AM
Shouldn't the coin then say that the chain with the most work "block * difficulty" is better than a chain with more blocks at a lower difficulty? As far as I understand every coin out there considers the highest blockchain to be the best. But wouldn't the highest sum of blocks * difficulty be more secure from time warp exploits?
legendary
Activity: 2940
Merit: 1090
April 01, 2014, 02:49:43 AM
There is no cumulation of difficulty, only of "work".

But, only chunks (hashes) of "work" that meet or beat the difficulty threshold that is their target get into the chain.

Low enough difficulty and every hash would get into the chain.

At high difficulty, hardly any hashes luck into enough of a "work" value to get into the chain, so most hashes aren't counted in the chain and thus aren't part of the accumulated total "work" that determines which chain is "longest".

-MarkM-
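The point about uncounted hashes can be quantified. Assuming the Bitcoin-style convention that difficulty D means about D * 2^32 hash attempts per block on average (an assumption for illustration, not client code):

```cpp
// Toy sketch (assumed Bitcoin-style convention): a block at difficulty D
// takes about D * 2^32 hash attempts on average, but only the one winning
// hash is recorded; the losing attempts leave no trace in the chain.
double ExpectedHashesPerBlock(double difficulty) {
    return difficulty * 4294967296.0; // 2^32
}

// The same number of hashes yields proportionally more blocks when aimed
// at a lower difficulty, so more of a low-difficulty miner's work "counts".
double ExpectedBlocks(double hashesDone, double difficulty) {
    return hashesDone / ExpectedHashesPerBlock(difficulty);
}
```

Halving the difficulty doubles the expected block count from the same hashes, which is exactly why the low-difficulty attacker gets so many more of his hashes recorded.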
legendary
Activity: 1078
Merit: 1002
100 satoshis -> ISO code
April 01, 2014, 02:46:55 AM
This puzzles me too.

Isn't cumulative difficulty normally the summed target difficulty? Does the TW make the target diff of each block appear higher than what was actually required in hashing power to mine it?
legendary
Activity: 2940
Merit: 1090
April 01, 2014, 02:34:33 AM
Because I am mining at a much lower difficulty than they are.

Trust me 1 GH is plenty.

Less difficulty just lets less work make more blocks, doesn't it? The amount of work is no different? Or is work being calculated as the amount more than the current difficulty or some such, instead of just the amount actually done?

I had not thought (or realised?) that the same amount of work becomes more work when it happens to be applied to a lower difficulty block; I thought work is what you actually do and difficulty is whether that is enough for a block.

Does the fact that lower work hash-results don't get onto the chain at all at higher difficulty make the difference? So that more of the attacker's hashes (and thus their work) get onto the chain than do the defender's hashes (and thus their work)?

(The defender more often than the attacker doing work that isn't enough to get a block into the chain thus never gets counted?)

(Which maybe could also be part of why faster block chains get more security sooner out of the same amount of hashpower?)

-MarkM-