It is basically the same problem that comes up when people attempt to divide a public key by 2, 4, 8, etc.
Every time we "divide", we cannot tell whether the resulting point's key lies in the lower half of the interval or whether it is offset by (groupOrder + 1)/2, because we don't know the parity of the private key.
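To make the ambiguity concrete, here is a minimal Python sketch at the scalar level (the toy keys are hypothetical; "dividing" a point by 2 corresponds to multiplying its unknown private key by the inverse of 2 mod n, so no actual EC arithmetic is needed to see the effect):

```python
# Parity ambiguity of "halving" a key, shown on scalars only.
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # secp256k1 order
half = pow(2, -1, n)  # inverse of 2 mod n, equals (n + 1) // 2

for k in (1000, 1001):  # hypothetical even and odd private keys
    k_half = (k * half) % n
    # even k: k_half == k // 2 (lower half); odd k: k_half == (k + n) // 2 (offset)
    print(k, "->", "lower half" if k_half <= n // 2 else "offset by (n + 1)/2")
```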
Regarding this specific problem, one limited approach I can think of is to subtract G from your initial pubkey and keep both points: the initial one and the offset one resulting from the subtraction.
Now one of these two is guaranteed to have an even private key, since subtracting G from a pubkey subtracts 0x1 from its private key.
Then you "divide" them both, and at least one of the resulting points will have its key within the desired interval.
Maybe you could find a similar workaround for your specific problem.
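A scalar-level sketch of this suggestion (my illustration, with a hypothetical toy key; subtracting G from a pubkey subtracts 1 from its private key):

```python
# One of {P, P - G} must have an even private key.
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # secp256k1 order

k = 0xDEADBEEF  # hypothetical private key of P
for name, key in (("P", k), ("P - G", (k - 1) % n)):
    print(name, "has even key:", key % 2 == 0)
```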
Yes, that approach works, but what are the consequences?
First, let's say the private key is 0x1. Subtracting G would then produce the point at infinity, so we would rather add G and then divide; otherwise we need a special condition check in the algorithm ("is P the point at infinity?"), making it slower.
But after division, we have two resulting points:
[1/2]P
[1/2](P + G)
and we don't know which of the two sits in the lower interval and which sits in the other one (points whose keys run from [1/2 mod n] = (n+1)/2 onward).
So we must search for both of them. Since we now have two keys to search for, any algorithm will require twice as much work at some stage, in an interval that is half the size (either the lower interval or the [1/2 + k] interval).
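A quick scalar-level check (again with hypothetical toy keys) confirms that exactly one of the two candidates lands in the lower interval, and which one it is depends on the parity we don't know:

```python
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # secp256k1 order
half = pow(2, -1, n)  # inverse of 2 mod n

for k in (0xCAFE, 0xCAFF):  # hypothetical even and odd private keys of P
    a = (k * half) % n        # key of [1/2]P
    b = ((k + 1) * half) % n  # key of [1/2](P + G)
    low = "[1/2]P" if a <= n // 2 else "[1/2](P + G)"
    print(hex(k), "-> candidate in the lower interval:", low)
```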
Let's say our algorithm has an average run-time of C * sqrt(N), where C is some constant factor and N is the size of the interval.
Let's simplify C to 1 (the end result is the same whether C is lower or greater than 1).
If we look for the private key in the original interval, the average run-time is then sqrt(N).
If we look for 2 private keys in a half-size interval, the average run-time is then 2 * sqrt(N/2) = sqrt(2) * sqrt(N) ≈ 1.41 * sqrt(N).
My conclusion (non-definitive and open for debate): the average run-time increases by about 41% every time we "divide the public key by 2" and attempt to work around the parity problem.
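Following the same arithmetic, repeating the trick d times should give 2^d candidate keys in an interval of size N/2^d, i.e. a factor of sqrt(2)^d over the original sqrt(N). A quick sanity check (under the same C = 1 simplification; N is an arbitrary example size):

```python
from math import sqrt

N = 2**40  # hypothetical interval size
for d in range(5):  # number of successive "divisions by 2"
    cost = 2**d * sqrt(N / 2**d)  # 2^d keys, interval of size N / 2^d
    print(d, round(cost / sqrt(N), 3))  # prints 1.0, 1.414, 2.0, 2.828, 4.0
```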
Now, the cited study uses 6-decimal-digit figures for the exponents. Following the same logic, I would think this means the predicted results of the analysis are only valid if the private key being searched for is a multiple of 1,000,000. Otherwise, the wild starting points (the exponents are used to compute the initial wild kangaroo positions) are not actually guaranteed to lie in the search interval, but rather in 1,000,000 different intervals.
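That caveat can also be checked at the scalar level (again a toy sketch, under my reading of the setup; e stands in for the 6-digit exponent): multiplying a key by the inverse of e mod n gives back the exact quotient only when e divides the key; otherwise the result is offset by (k mod e) * inv(e) mod n and lands in one of e different cosets.

```python
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # secp256k1 order
e = 1_000_000
inv_e = pow(e, -1, n)  # inverse of e mod n

for k in (5_000_000, 5_000_001):  # a multiple of e, and a non-multiple
    k_div = (k * inv_e) % n
    print(k, "-> exact quotient:", k_div == k // e)  # True only when e divides k
```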