
Topic: Could proof of Blockchain (PoBC) be the solution to ASIC industrial mining? (Read 4813 times)

legendary
Activity: 1624
Merit: 1005
I wish you all love and profitable investments!!!
Interesting article at Ethereum: https://blog.ethereum.org/2014/06/19/mining/

Also discusses using the blockchain as part of the mining process.  The biggest difference is that this implementation uses the header and one transaction from the block to generate the outer hash, rather than the entire block.  This makes validation easier while still requiring the entire blockchain to be available when hashing.

Here is a revision of my routine using this concept.  It still uses a three-hash process to give good variability in the indexes.

Code:
var inner_hash = sha256(block_header);
var chain_index = floor(chain_height * (inner_hash_last_32_bits / max_uint));
var chain_hash = sha256(blockchain[chain_index].header + inner_hash);
var tran_index = floor(blockchain[chain_index].tran_count * (chain_hash_last_32_bits / max_uint));
var final_hash = sha256(blockchain[chain_index].transaction[tran_index] + chain_hash);



Hmmm, very interesting concept. I like the idea!
sr. member
Activity: 401
Merit: 250
Interesting article at Ethereum: https://blog.ethereum.org/2014/06/19/mining/

Also discusses using the blockchain as part of the mining process.  The biggest difference is that this implementation uses the header and one transaction from the block to generate the outer hash, rather than the entire block.  This makes validation easier while still requiring the entire blockchain to be available when hashing.

Here is a revision of my routine using this concept.  It still uses a three-hash process to give good variability in the indexes.

Code:
var inner_hash = sha256(block_header);
var chain_index = floor(chain_height * (inner_hash_last_32_bits / max_uint));
var chain_hash = sha256(blockchain[chain_index].header + inner_hash);
var tran_index = floor(blockchain[chain_index].tran_count * (chain_hash_last_32_bits / max_uint));
var final_hash = sha256(blockchain[chain_index].transaction[tran_index] + chain_hash);
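
For anyone who wants to experiment with this, here is a minimal runnable sketch of the routine above in Python.  The Block type, the toy chain structure, and the exact index scaling are stand-ins I'm assuming for illustration; they aren't taken from any existing client.

Code:
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Dividing by 2**32 (rather than 2**32 - 1) keeps the computed index strictly below the length.
MAX_UINT = 2**32

class Block:
    def __init__(self, header: bytes, transactions):
        self.header = header              # serialized block header
        self.transactions = transactions  # list of raw serialized transactions

def last_32_bits(digest: bytes) -> int:
    return int.from_bytes(digest[-4:], "big")

def pobc_hash(block_header: bytes, blockchain) -> bytes:
    inner_hash = sha256(block_header)
    chain_index = (len(blockchain) * last_32_bits(inner_hash)) // MAX_UINT
    hist = blockchain[chain_index]
    chain_hash = sha256(hist.header + inner_hash)
    tran_index = (len(hist.transactions) * last_32_bits(chain_hash)) // MAX_UINT
    return sha256(hist.transactions[tran_index] + chain_hash)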
newbie
Activity: 5
Merit: 0
Thanks for the information.
sr. member
Activity: 401
Merit: 250
Judging by an article on CoinDesk today, there definitely appears to be a problem: the number of full nodes in the network is declining.
legendary
Activity: 1442
Merit: 1005
The bootstrapping and orphan block issues are very interesting.  Bootstrapping will be slower; I'm not sure there is a way around that.  Orphans, I think, could be offset by having a higher number of nodes.  If you only have a few hundred nodes then the difficulty will stay very low and orphans will be a common problem.  I can see this with some altcoins even on Scrypt, such as "SXC" and "WDC".  But what happens if you have 15,000 nodes (as Bitcoin does)?  My hope is that the number of hashers out there will drive up the difficulty enough to keep the orphan count down.  Of course, this depends on fast network propagation.  I'm not sure whether the Bitcoin software currently does any network-structure balancing to keep the paths between nodes as short as possible, but something like that may be needed to communicate new blocks out as quickly as possible and reduce the number of orphans being generated.
Keep researching; it's not impossible to find new optimizations on the technological side.

Also remember that if everyone has the same tools or invests the same sums, as is the case now, the same scenarios play out. Does your method change this, or does it just appear to solve a non-existent problem?
legendary
Activity: 990
Merit: 1108
If I understand ASIC miners correctly, it would be difficult to give them enough memory, or they would end up bandwidth-constrained.  Most ASIC miners use multiple chips.  For one chip to work efficiently you would need the entire blockchain available on the ASIC.  I believe the Bitcoin blockchain is around 15 GB and growing, so that is far more RAM than practical to build into one chip.

An ASIC for this could just store the first 10% of the whole blockchain, which, considering that early blocks were very small, probably takes much less than 1% of that 15 GB. Then it just keeps generating inner hashes until the index falls within the first 10%.

But I have to wonder: doesn't this whole scheme break down with an insane memory requirement for every single client that needs to verify proofs-of-work? I thought clients, especially smartphones, have to be able to verify with minimal resources.
sr. member
Activity: 401
Merit: 250
Second, there is no incentive for people to operate a full node with blockchain history.  
Third, specialized hardware (ASIC) is causing a runaway in difficulty and removing any possible RoI for small miners.
What if we tried a proof of blockchain (PoBC) model?  In order to generate a valid hash you need access to the entire blockchain or at least information from every block in the blockchain.
...
I’d like to see a hash generation done this way instead:
   sha256(sha256(block_header) + history_block_hash)
...
1. Use the hash of the selected block (smallest amount with only 32 bytes per block needed).
2. Use the header of the selected block with the current block inner hash appended and then generate a hash on this (80 bytes per block needed).
3. (my favorite) Take the entire selected block with the current block inner hash appended and then generate a hash on this (requires the entire blockchain to be available).

So, why would we want to do this?  

First, it should stop the ASIC miners in their tracks.

1. Store hashes of each block in memory; there is no need to keep the blockchain. You can even have a full node with thin miners accessing it to bootstrap and update. There are 297,838 blocks at 32 bytes (256 bits) per hash, which is roughly 9.5 MB of RAM... easily possible to store inside most ASIC miners.

2. Same as above.

3. This requires some extra latency plus some variant of option 1. You only need to request the block on demand, store it (less than 1 MB of RAM), and hash it.

You should also consider this:

ASICs will always exist for EVERYTHING and ANYTHING. Building a computer with only the minimal components required for this job (no fancy CPU cache, no RAM slots, no fancy chipset, no audio or peripherals) will be possible at a lower cost than generic brand computers. There will be companies making these, and people buying them, and the same thing will happen as it does now: these computers will be single-purpose (and hardly re-purposable or up-cyclable). It will just be more complex and less efficient.

As for efficiency, the same economic cost sunk into the security of the network will provide the same results no matter the method. But hashing slower than current methods, with extra steps involved, will cause clients to bootstrap more slowly (they need extra database seeks to obtain and confirm blocks) and will cause more orphan blocks (as each node needs to seek through the blockchain to confirm each block is correct).

Forcing more resources into a more wasteful and less efficient method of securing blockchains is just silly and won't have the results you want.

Options 1 and 2 I threw out there mostly to show why they wouldn't really work, for the exact reason you listed.

The bootstrapping and orphan block issues are very interesting.  Bootstrapping will be slower; I'm not sure there is a way around that.  Orphans, I think, could be offset by having a higher number of nodes.  If you only have a few hundred nodes then the difficulty will stay very low and orphans will be a common problem.  I can see this with some altcoins even on Scrypt, such as "SXC" and "WDC".  But what happens if you have 15,000 nodes (as Bitcoin does)?  My hope is that the number of hashers out there will drive up the difficulty enough to keep the orphan count down.  Of course, this depends on fast network propagation.  I'm not sure whether the Bitcoin software currently does any network-structure balancing to keep the paths between nodes as short as possible, but something like that may be needed to communicate new blocks out as quickly as possible and reduce the number of orphans being generated.
sr. member
Activity: 401
Merit: 250
1. If you require that hash(X) < Target, then one needs to iterate over parts of X in order to find a suitable result, thus proving that a certain amount of "work" has been done to find that solution and secure the transactions, which makes this a PoW coin.

2. There have been numerous discussions in this forum and on litecointalk.org about whether or not the hash function should be memory-hard and consequently deter the development of ASICs. Although I'm personally on the anti-ASIC side of the fence, I believe there is more than one argument to consider here, which is why I like your idea.

3. Allow me a brief digression here: I'm not buying the argument that a high hash rate makes the network any more secure. For example, I could develop a super machine that hashes at 100x the speed of all other miners in the network, and by the hash-rate-equals-security logic the network would be 100x more secure once I deploy it; however, if I really did this, I doubt anyone would consider a network with only one miner more secure.

4. So only a distributed network with lots of users and miners makes the coin secure, assuming no disruptive technology with a potential 100x (or 1,000,000x) speed-up can be easily developed. (Which is a stretch, I know.)

5. ASICs clearly work against this principle, as no one I know can buy an ASIC device tomorrow for mining Bitcoin or other coins and actually make an ROI. Just for the sake of example: everyone I know can buy a PC with a GPU in any number of local stores tomorrow and mine some coin X with a reasonable expectation of making an ROI. If I (and others) could buy an ASIC in a local store and use it to get my investment back, I would sit on the ASIC side of the fence. ASIC developers overprice their products, making ASICs unfriendly to miners.

6. Please note that mining pools also work against point #4: due to economies of scale, large pools are by definition better funded (and thus more reliable and secure), better developed, probably offer a better customer experience, and offer lower variance when mining. So they attract more miners, creating a situation where the top 5 pools by size control most of a coin's hash rate. Attacking, say, Litecoin would therefore probably require taking out the top 5 pools, which is a lot easier than developing the "100x" machine from point #3.

7. So I think your idea would be perfect if it could solve both of those problems. Someone on this thread correctly pointed out that adding a hash of some previous block to the header before hashing would not do the trick, because hashes of previous blocks could be precomputed. I would go further and say that adding an arbitrary memory requirement to the hash function would not work in the long term either, as there is no real reason ASIC developers could not bolt arbitrary amounts of memory (even DRAM or disk) onto the ASIC as well; still, it makes development of such machines more difficult. I believe the current term for such a coin is "ASIC resistant".

8. I propose (as you have) that the hash function only work efficiently when the miner has access to the whole blockchain. For example: sha256(block_chain_data(sha256(block_header)))

9. I propose that the hash function not work efficiently for forming pools (this may be impossible).

10. Thus miner variance should be kept small by other mechanisms. A small block time does not work, but making the whole network act as one giant p2p pool probably would. I was thinking along the lines of allowing all miners to submit blocks with lower than network difficulty. Such blocks would not be considered final until someone actually submits a block that meets the network difficulty. Then all miners would be paid proportionally to their contribution. This would open up problems with the coinbase transaction, which in a naive implementation could not be properly hashed, but that could be worked out: for example, the coinbase transaction would not be left to an arbitrary address generated by the miner but would instead be generated automatically to send coins to the contributors of the previous round. Since all clients are aware of who those contributors are, or rather what their addresses are, everyone should be able to verify that the coinbase transaction is correct at the time the block is mined.

And yes, this should be a new coin; Bitcoin will never accept a change of this magnitude. The question is: can we put together a team to develop it? Personally, I would join in. My suggestion would be that we spend a lot of time planning before we take on coding.

Name: BlockChainCoin, or BCC :)
Any thoughts? Anyone?

Wow, I take a few days off and suddenly the thread wakes up again.

There was some discussion about prestoring the history block hashes as a way of avoiding the need to keep the entire blockchain on hand.  That leads me to think there was some misunderstanding of my original "option 3" for computing the hash.  A better formulation of that option would be:

Code:
sha256(sha256(block_header)+sha256(history_block+sha256(block_header)))

Or, here is another way to look at it:

Code:
var inner_hash = sha256(block_header);
var index = floor(chain_height * (inner_hash_last_32_bits / max_uint));
var history_hash = sha256(blockchain[index] + inner_hash);
var final_hash = sha256(inner_hash + history_hash);

You can't keep a prehashed copy of the blockchain, because you would need the history block together with the inner hash from the new block's header in order to calculate a new history hash.
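
To make that concrete, here is a quick check in Python (arbitrary test bytes, not real chain data). The history hash commits to the inner hash of the candidate header, so it changes with every nonce, and a stored sha256 of the history block by itself buys you nothing:

Code:
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

history_block = b"\x01" * 285             # stand-in for a serialized history block
inner_a = sha256(b"header-with-nonce-1")  # two candidate headers differing only by nonce
inner_b = sha256(b"header-with-nonce-2")

history_hash_a = sha256(history_block + inner_a)
history_hash_b = sha256(history_block + inner_b)
assert history_hash_a != history_hash_b   # must be recomputed for every candidate header

final_a = sha256(inner_a + history_hash_a)
final_b = sha256(inner_b + history_hash_b)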

As to the points above:

1: Yes, it is still proof of work, just that it requires a lot more information on hand to do the work.

2-5: My problem with most altcoin development is that it plays cat and mouse by swapping in different algorithms instead of stepping back to ask what we can do to improve the overall network.

7-8: Discussed above on how to put the hash together.

9: I think it would still be possible to mine in a pool, but each miner in the pool would have to have the entire blockchain on hand.  The way I see this working is that the client (i.e., Bitcoin-QT or bitcoind) would expose an API that returns a block by height.  This API could be done over a network connection, but that would be awfully slow; better to do it via an interprocess communication mechanism.  The miner software would make an outbound call via this IPC to get the block.
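
As a rough illustration of the lookup the miner needs, here is a sketch against bitcoind's existing JSON-RPC interface (getblockhash and getblock). This is the slow network-connection route, so treat it as a stand-in for the faster IPC call described above; the URL and credentials are placeholders.

Code:
import json
from base64 import b64encode
from urllib.request import Request, urlopen

RPC_URL = "http://127.0.0.1:8332"                      # local bitcoind RPC endpoint
RPC_AUTH = b64encode(b"rpcuser:rpcpassword").decode()  # placeholder credentials

def rpc(method, *params):
    body = json.dumps({"jsonrpc": "1.0", "id": "pobc",
                       "method": method, "params": list(params)}).encode()
    req = Request(RPC_URL, data=body, headers={
        "Authorization": "Basic " + RPC_AUTH,
        "Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.loads(resp.read())["result"]

def history_block_raw(height):
    block_hash = rpc("getblockhash", height)  # look up the block hash at that height
    # Verbosity 0 (a plain false on older nodes) returns the raw serialized block as hex.
    return bytes.fromhex(rpc("getblock", block_hash, 0))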

10: Block time is one of the areas giving me heartburn.  I've seen arguments in favor of the Bitcoin 10-minute block time and I've seen arguments in favor of small block times.  To me this comes down to a weighting of how fast a block can propagate out to sufficient nodes and how fast a transaction can be confirmed by enough points in the network for point-of-sale transactions.  Needs much more discussion.

Now, as for development, I'm going to be backed up for a very long time, unfortunately.  So far, the farthest I've gotten is compiling the Bitcoin-QT client on Windows.  No small task in itself with all of the dependencies, but fortunately there are some good how-to guides out there.  Looking through the code itself brought up another issue I'll have to contend with.  I spent years working in C before moving to some other languages for a number of years and then switching to C# a few years back.  My day job is all C#.  Reading the C++ code in the Bitcoin client is possible but not intuitive for me.  Someone who works daily in C++ will be much quicker at finding the necessary bits and pieces.

I had originally said I'd prefer to build a clean code base rather than starting with the Bitcoin software.  I'm starting to lean the other direction now.  The Bitcoin node software has years of work behind it, which makes it a great starting point for something new.  I'd keep it a goal to segregate the changes well enough that, as the Bitcoin software evolves, this new fork can still pick up improvements with mostly clean merges.

Finally, on planning vs. coding, I agree entirely.  I've been floating a lot of ideas in this thread, and I'm sure there are plenty of lessons to learn from other efforts as well.  A full plan of action should be worked out before any coding starts.  The question is where to have this discussion.  This thread disappears into the noise within a couple of hours.  I'm not sure if there is a better place on this site to have a deeper, organized discussion, if there is a better site for this sort of planning, or if we should set up a completely separate site.
member
Activity: 196
Merit: 10
Is honeypenny already implementing this idea?
full member
Activity: 219
Merit: 100
What if we tried a proof of blockchain (PoBC) model?  In order to generate a valid hash you need access to the entire blockchain or at least information from every block in the blockchain.

Currently, in Bitcoin, a hash is generated as follows:
   sha256(sha256(block_header))

I’d like to see a hash generation done this way instead:
   sha256(sha256(block_header) + history_block_hash)

1. If you require that hash(X) < Target, then one needs to iterate over parts of X in order to find a suitable result, thus proving that a certain amount of "work" has been done to find that solution and secure the transactions, which makes this a PoW coin.

2. There have been numerous discussions in this forum and on litecointalk.org about whether or not the hash function should be memory-hard and consequently deter the development of ASICs. Although I'm personally on the anti-ASIC side of the fence, I believe there is more than one argument to consider here, which is why I like your idea.

3. Allow me a brief digression here: I'm not buying the argument that a high hash rate makes the network any more secure. For example, I could develop a super machine that hashes at 100x the speed of all other miners in the network, and by the hash-rate-equals-security logic the network would be 100x more secure once I deploy it; however, if I really did this, I doubt anyone would consider a network with only one miner more secure.

4. So only a distributed network with lots of users and miners makes the coin secure, assuming no disruptive technology with a potential 100x (or 1,000,000x) speed-up can be easily developed. (Which is a stretch, I know.)

5. ASICs clearly work against this principle, as no one I know can buy an ASIC device tomorrow for mining Bitcoin or other coins and actually make an ROI. Just for the sake of example: everyone I know can buy a PC with a GPU in any number of local stores tomorrow and mine some coin X with a reasonable expectation of making an ROI. If I (and others) could buy an ASIC in a local store and use it to get my investment back, I would sit on the ASIC side of the fence. ASIC developers overprice their products, making ASICs unfriendly to miners.

6. Please note that mining pools also work against point #4: due to economies of scale, large pools are by definition better funded (and thus more reliable and secure), better developed, probably offer a better customer experience, and offer lower variance when mining. So they attract more miners, creating a situation where the top 5 pools by size control most of a coin's hash rate. Attacking, say, Litecoin would therefore probably require taking out the top 5 pools, which is a lot easier than developing the "100x" machine from point #3.

7. So I think your idea would be perfect if it could solve both of those problems. Someone on this thread correctly pointed out that adding a hash of some previous block to the header before hashing would not do the trick, because hashes of previous blocks could be precomputed. I would go further and say that adding an arbitrary memory requirement to the hash function would not work in the long term either, as there is no real reason ASIC developers could not bolt arbitrary amounts of memory (even DRAM or disk) onto the ASIC as well; still, it makes development of such machines more difficult. I believe the current term for such a coin is "ASIC resistant".

8. I propose (as you have) that the hash function only work efficiently when the miner has access to the whole blockchain. For example: sha256(block_chain_data(sha256(block_header)))

9. I propose that the hash function not work efficiently for forming pools (this may be impossible).

10. Thus miner variance should be kept small by other mechanisms. A small block time does not work, but making the whole network act as one giant p2p pool probably would. I was thinking along the lines of allowing all miners to submit blocks with lower than network difficulty. Such blocks would not be considered final until someone actually submits a block that meets the network difficulty. Then all miners would be paid proportionally to their contribution. This would open up problems with the coinbase transaction, which in a naive implementation could not be properly hashed, but that could be worked out: for example, the coinbase transaction would not be left to an arbitrary address generated by the miner but would instead be generated automatically to send coins to the contributors of the previous round. Since all clients are aware of who those contributors are, or rather what their addresses are, everyone should be able to verify that the coinbase transaction is correct at the time the block is mined.
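
To illustrate the payout side of point 10, a toy sketch (the share bookkeeping and the addresses are assumptions, not a worked-out design): when the round-ending block is found, the coinbase is split in proportion to the sub-target blocks each contributor submitted.

Code:
def split_coinbase(block_reward, shares_by_address):
    # shares_by_address maps an address to the number of sub-target blocks
    # it submitted during the round.
    total = sum(shares_by_address.values())
    return {addr: block_reward * count / total
            for addr, count in shares_by_address.items()}

# Example round with three contributors:
# split_coinbase(25.0, {"addr_A": 6, "addr_B": 3, "addr_C": 1})
# -> {"addr_A": 15.0, "addr_B": 7.5, "addr_C": 2.5}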

And yes, this should be a new coin; Bitcoin will never accept a change of this magnitude. The question is: can we put together a team to develop it? Personally, I would join in. My suggestion would be that we spend a lot of time planning before we take on coding.

Name: BlockChainCoin, or BCC :)
Any thoughts? Anyone?
legendary
Activity: 1442
Merit: 1005
The crypto scene needs more lateral thinkers like this, i.e. people who can take a step back and see the bigger picture.

or needs more little shit mouth piece trolls ?
Someone has a case of the mads today. Here, fix it: https://www.youtube.com/watch?v=9uvIGrnA_3I
legendary
Activity: 1442
Merit: 1005
Second, there is no incentive for people to operate a full node with blockchain history.  
Third, specialized hardware (ASIC) is causing a runaway in difficulty and removing any possible RoI for small miners.
What if we tried a proof of blockchain (PoBC) model?  In order to generate a valid hash you need access to the entire blockchain or at least information from every block in the blockchain.
...
I’d like to see a hash generation done this way instead:
   sha256(sha256(block_header) + history_block_hash)
...
1. Use the hash of the selected block (smallest amount with only 32 bytes per block needed).
2. Use the header of the selected block with the current block inner hash appended and then generate a hash on this (80 bytes per block needed).
3. (my favorite) Take the entire selected block with the current block inner hash appended and then generate a hash on this (requires the entire blockchain to be available).

So, why would we want to do this?  

First, it should stop the ASIC miners in their tracks.

1. Store hashes of each block in memory; there is no need to keep the blockchain. You can even have a full node with thin miners accessing it to bootstrap and update. There are 297,838 blocks at 32 bytes (256 bits) per hash, which is roughly 9.5 MB of RAM... easily possible to store inside most ASIC miners.

2. Same as above.

3. This requires some extra latency plus some variant of option 1. You only need to request the block on demand, store it (less than 1 MB of RAM), and hash it.

You should also consider this:

ASICs will always exist for EVERYTHING and ANYTHING. Building a computer with only the minimal components required for this job (no fancy CPU cache, no RAM slots, no fancy chipset, no audio or peripherals) will be possible at a lower cost than generic brand computers. There will be companies making these, and people buying them, and the same thing will happen as it does now: these computers will be single-purpose (and hardly re-purposable or up-cyclable). It will just be more complex and less efficient.

As for efficiency, the same economic cost sunk into the security of the network will provide the same results no matter the method. But hashing slower than current methods, with extra steps involved, will cause clients to bootstrap more slowly (they need extra database seeks to obtain and confirm blocks) and will cause more orphan blocks (as each node needs to seek through the blockchain to confirm each block is correct).

Forcing more resources into a more wasteful and less efficient method of securing blockchains is just silly and won't have the results you want.
legendary
Activity: 1540
Merit: 1011
FUD Philanthropist™
The crypto scene needs more lateral thinkers like this, i.e. people who can take a step back and see the bigger picture.

or needs more little shit mouth piece trolls ?
sr. member
Activity: 401
Merit: 250
The crypto scene needs more lateral thinkers like this, i.e. people who can take a step back and see the bigger picture.

Thanks, I'm trying.  I just wish I had the spare cycles to work on the software side right now.  Ideally I'd prefer not to just clone Bitcoin-QT yet again, but that may be the most expedient way.  I've been doing a little studying of the Bitcoin wire protocol and even started work on a wire-traffic listener (in C#, if you can believe it), but my spare time is so limited I'm not sure how far I'll get.  My real hope is to inspire someone to take up some of these ideas and run with them.  In the long run, I don't care if my name is attached as long as something viable results from the effort.
member
Activity: 109
Merit: 35
The crypto scene needs more lateral thinkers like this, i.e. people who can take a step back and see the bigger picture.
full member
Activity: 182
Merit: 100
Well, if it ends up creating a viable crypto that actually thrives via CPU mining and can't be easily GPU- or ASIC-mined, that would be awesome.

I think I remember someone talking about a currency called trade units before but I could be wrong. You could call it "Barter."
sr. member
Activity: 401
Merit: 250
This is a pretty cool idea though I don't know if it would be technically feasible. Is there some absolute rule that ASICs can't just modify their onboard memory?

If I understand ASIC miners correctly, it would be difficult to give them enough memory, or they would end up bandwidth-constrained.  Most ASIC miners use multiple chips.  For one chip to work efficiently you would need the entire blockchain available on the ASIC.  I believe the Bitcoin blockchain is around 15 GB and growing, so that is far more RAM than practical to build into one chip.  An ASIC board could have a shared pool holding the blockchain that all chips use, but then you run into the bandwidth issue of all these chips trying to talk to a shared memory space.  I'm not even sure a GPU can be made efficient under these circumstances.  The card won't have enough RAM to hold the entire blockchain, so there will be a lot of traffic going up and down the PCIe bus, as every hash computation will require a different history block.

Unrelated: I have a name idea for my hypothetical currency, "Trade Units" or "Trade Unit Currency".  I'm sick to death of everything ending in "coin", and I see this as a measurable unit used to facilitate economic activity, or trade.
full member
Activity: 182
Merit: 100
This is a pretty cool idea though I don't know if it would be technically feasible. Is there some absolute rule that ASICs can't just modify their onboard memory?
sr. member
Activity: 401
Merit: 250
Just throwing more ideas in here for the heck of it. 

Relating to my earlier musing on the rate of decrease in the block reward, perhaps a more elegant solution is possible.  There has been talk about inflation vs. deflation of coins and what possible benefits and drawbacks each has.  What if the size of the block reward were tied to the transaction activity of the currency?  In the beginning, the coin will need a set rate of generation (block reward) to build up a basic supply.  Once a certain threshold is reached, the issue then becomes having sufficient currency to use in transactions.  If 1,000 units of ?coin are changing hands per day, then a smaller block reward is needed to keep sufficient currency in play than if 10,000 units of ?coin are changing hands.

Doing something like this would require a transaction fee to act as a disincentive for people to move coins around just to simulate activity and boost the block rewards.

Any adjustment of the block reward should also be limited to a maximum percentage up or down to avoid runaway inflation or deflation.  Let each adjustment aim for the block reward being a set percentage of the total transaction activity over the last 'x' blocks, but with the constraint that the adjustment cannot exceed a change of 'y' percent from the current block reward.  If 'x' is kept low enough, it should have a stabilizing effect on the overall amount of currency in circulation to account for lost wallets.  It will also help keep the exchange rate with fiat currencies more stable over time, as those tend to have a slow inflationary trend as well.

I'm not throwing out specific values for 'x' and 'y' as I think some serious modeling would need to be done to establish sustainable behavior.
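
For what it's worth, here is roughly what such a clamped adjustment could look like. The values standing in for 'x', 'y', and the target fraction are pure placeholders, exactly the numbers that would need serious modeling:

Code:
LOOKBACK_BLOCKS = 2016     # hypothetical 'x': window of recent blocks examined
MAX_ADJUST = 0.02          # hypothetical 'y': reward may move at most 2% per adjustment
TARGET_FRACTION = 0.001    # reward aims at 0.1% of recent transaction volume (assumed)

def next_block_reward(current_reward, tx_volume_last_x_blocks):
    # Aim the per-block reward at a fixed fraction of recent transaction
    # activity, but clamp the move to within MAX_ADJUST of the current reward.
    target = TARGET_FRACTION * tx_volume_last_x_blocks / LOOKBACK_BLOCKS
    low = current_reward * (1 - MAX_ADJUST)
    high = current_reward * (1 + MAX_ADJUST)
    return max(low, min(high, target))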

One guiding principle I've been keeping in mind while theorizing here is that the goal of a cryptocurrency should not be to make anyone rich.  It should have the goal of providing a stable, sustainable, efficient means of economic activity.
sr. member
Activity: 401
Merit: 250
I think I picked the wrong forum to post this on.  Too much junk-coin pump-and-dump traffic drowning out threads.  Just since last night this dropped onto the second page, which is nearly a death sentence.  Can anyone suggest a more development-oriented forum worth moving to?

Interesting to hear people saying just code it.  Not that easy.  I may have the ability but definitely not the capacity at this time.  Full time job and family see to that.  Besides, any successful coin is going to be developed by a team rather than an individual.  Implementing something like this is not trivial.  In particular, existing mining software would need considerable change to support history lookup.  This isn't just swapping in another algorithm.  The miner would need to use an outbound API to retrieve blockchain history blocks.  Either a wallet or the core node software would have to support incoming calls for this API.

Although the easiest route would be to just hack up the stock Bitcoin-QT client I'm really tired of seeing so many clones of the same software being developed.  Something needs to be built from the ground up.  I do like the plan floating in the Bitcoin development community to split the wallet function out from the core node software.  I operate a node on my PC but I have no interest in the wallet portion.  Having them separate removes some of the bloat.

Pools would be an interesting adaptation.  I haven't looked deep enough into the Stratum protocol to see if it will already work.  My gut says yes as all the miner software really needs is the header and a range of nonce values to work on.

Another benefit of this whole scheme came to mind last night: requiring access to the entire blockchain history makes malware miners difficult.  It is too much data to push onto someone's PC, and keeping the history updated would generate too much traffic.  If it were possible for this coin, I'm sure someone would try, but detecting it should be a lot easier.

Another idea to consider if this were to turn into a coin: block reward halving acts as an incentive for miners to leave a coin just before the halving event.  Rather than halving at set intervals, I'd suggest using a percentage decline over a long period.  If the drop were somewhere around 1%, it wouldn't be as big of an incentive for miners to jump.  As long as the new reward value is run through a floor function to avoid fractional values, the reward will eventually reach zero.  It all comes down to a function of the desired final coin count, the time per block, and the time to reach the final count.
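
Here is a toy version of that schedule (the interval, rate, and starting reward are placeholders, not proposals). Because the result is floored to whole base units after each step, the reward does eventually reach zero:

Code:
ADJUST_INTERVAL = 10_000       # hypothetical number of blocks between reductions
DECLINE_RATE = 0.01            # roughly 1% reduction per step
INITIAL_REWARD = 50_0000_0000  # starting reward in integer base units (assumed)

def reward_at_height(height):
    reward = INITIAL_REWARD
    for _ in range(height // ADJUST_INTERVAL):
        reward = int(reward * (1 - DECLINE_RATE))  # floor keeps it a whole number of units
        if reward == 0:
            break
    return reward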