
Topic: concurrent validation engine for bitcoin-core (Read 306 times)

jr. member
Activity: 39
Merit: 25
January 29, 2019, 01:07:13 PM
#9
For anyone interested in manycore architectures, take a look at this site: http://bjump.org/manycore/
jr. member
Activity: 39
Merit: 25
There are a couple of reasons why I see the validation process being done linearly. First of all, LevelDB (the database for the UTXO set) does not support multi-threading. And there are static variables defined inside `validation.h` keeping track of the state of the blockchain. Maybe I'm wrong.
The state of the blockchain must be updated synchronously and in order of blocks. Otherwise you will run into concurrent update issues which will affect consensus.

That's not a requirement to reach consensus. One can evaluate asynchronously and then merge the results to take the final decision.
This is certainly possible. The question is whether it is needed. As domob pointed out, it will probably have no effect with the CPUs currently available. I reckon manycore hardware can change the game, to the point where verifying the input coins takes less time than validating a block and fetching the UTXOs.
If you validate blocks out of order, then a block may contain a transaction which spends an output that is in a block that you have not yet validated.

The merge process will take care of that. I have a fairly good understanding of how the validation process takes place. The same goes for child-pays-for-parent.
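To sketch what I mean (a toy illustration of my own, not actual Bitcoin Core code), the merge step applies the per-block results in height order, so a spend of an output from an earlier, separately validated block is only checked once that block's outputs have been merged in:

```cpp
// Toy illustration: per-block results produced asynchronously are merged
// strictly in height order, so spends of outputs from earlier blocks are
// resolved against the merged UTXO view at this point.
#include <algorithm>
#include <cstdint>
#include <set>
#include <string>
#include <tuple>
#include <vector>

struct OutPoint {
    std::string txid;
    uint32_t vout;
    bool operator<(const OutPoint& o) const {
        return std::tie(txid, vout) < std::tie(o.txid, o.vout);
    }
};

struct BlockResult {
    int height;
    std::vector<OutPoint> created;  // net outputs the block adds to the UTXO set
    std::vector<OutPoint> spent;    // net outputs the block's inputs consume
    bool scripts_ok;                // signature/script checks done asynchronously
};

// `created`/`spent` are the block's net effect (in-block spends already
// cancelled during the per-block pass). A real implementation would also have
// to roll back on failure; this sketch just rejects.
bool MergeInOrder(std::vector<BlockResult> results, std::set<OutPoint>& utxo) {
    std::sort(results.begin(), results.end(),
              [](const BlockResult& a, const BlockResult& b) { return a.height < b.height; });
    for (const BlockResult& r : results) {
        if (!r.scripts_ok) return false;
        for (const OutPoint& in : r.spent) {
            if (utxo.erase(in) == 0) return false;  // missing or already-spent input
        }
        for (const OutPoint& out : r.created) utxo.insert(out);
    }
    return true;
}
```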
staff
Activity: 3458
Merit: 6793
Just writing some code
There are a couple of reasons why I see the validation process being done linearly. First of all, LevelDB (the database for the UTXO set) does not support multi-threading. And there are static variables defined inside `validation.h` keeping track of the state of the blockchain. Maybe I'm wrong.
The state of the blockchain must be updated synchronously and in order of blocks. Otherwise you will run into concurrent update issues which will affect consensus.

That's not a requirement to reach consensus. One can evaluate asynchronously and then merge the results to take the final decision.
This is certainly possible. The question is whether it is needed. As domob pointed out, it will probably have no effect with the CPUs currently available. I reckon manycore hardware can change the game, to the point where verifying the input coins takes less time than validating a block and fetching the UTXOs.
If you validate blocks out of order, then a block may contain a transaction which spends an output that is in a block that you have not yet validated. And if you are processing the transactions in a block out of order, then you may have a transaction which spends an output of a transaction that is found earlier in the block which you may not have validated yet.
jr. member
Activity: 39
Merit: 25
There are a couple of reasons why I see the validation process being done linearly. First of all, LevelDB (the database for the UTXO set) does not support multi-threading. And there are static variables defined inside `validation.h` keeping track of the state of the blockchain. Maybe I'm wrong.
The state of the blockchain must be updated synchronously and in order of blocks. Otherwise you will run into concurrent update issues which will affect consensus.

That's not a requirement to reach consensus. One can evaluate asynchronously and then merge the results to take the final decision.
This is certainly possible. The question is whether it is needed. As domob pointed out, it will probably have no effect with the CPUs currently available. I reckon manycore hardware can change the game, to the point where verifying the input coins takes less time than validating a block and fetching the UTXOs.
staff
Activity: 3458
Merit: 6793
Just writing some code
There are a couple of reasons why I see the validation process being done linearly. First of all, LevelDB (the database for the UTXO set) does not support multi-threading. And there are static variables defined inside `validation.h` keeping track of the state of the blockchain. Maybe I'm wrong.
The state of the blockchain must be updated synchronously and in order of blocks. Otherwise you will run into concurrent update issues which will affect consensus.
jr. member
Activity: 39
Merit: 25
Multiple cores are currently used at least for signature verification, which is the most compute-intensive process anyway (that's why more than one core is active during sync, as ETFbitcoin noticed).

I don't know how much additional speed could be extracted by making the coin lookup multi-core, but I imagine that this would be harder to do than signature verification because you potentially need to lock data structures. Also, if LevelDB does not already support multiple cores, it would be a bigger (and riskier) effort to replace the entire database with another one.

So I believe that this project is likely not very useful, at least not relative to the effort involved. If you still want to go forward, I suggest that you provide actual numbers on how much further the speed could be improved (in addition to the already-parallelised signature checking).

I see. That could kill this idea. Anyway, at least I tried to talk about it.
jr. member
Activity: 39
Merit: 25
It's true that Bitcoin Core only uses the CPU, but AFAIK it has multi-core/multi-threading support, as all cores of the processor on my PC show high load when I re-scan the whole blockchain.

Based on my knowledge, your idea sounds good, but I don't see why you would need funding, unless you don't have a decent PC or can't spare the time since you couldn't make a living from it.

There are a couple of reasons why I see the validation process being done linearly. First of all, LevelDB (the database for the UTXO set) does not support multi-threading. And there are static variables defined inside `validation.h` keeping track of the state of the blockchain. Maybe I'm wrong.
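As a toy example of what I mean about shared state (my own illustration, not the actual code in `validation.h`), a chain state kept in globals behind one lock forces block updates to happen one at a time, no matter how many cores are doing the other work:

```cpp
// Toy illustration only: when the chain state lives in shared globals behind a
// single lock, every block's UTXO update has to take that lock, so the updates
// serialize even if other validation work runs on many threads.
#include <mutex>
#include <set>
#include <string>

std::mutex g_chainstate_mutex;     // stands in for a global validation lock
std::set<std::string> g_utxo_set;  // stands in for the shared UTXO view
int g_best_height = 0;             // stands in for the current tip

void ConnectBlockSerialized(int height,
                            const std::set<std::string>& spends,
                            const std::set<std::string>& creates) {
    std::lock_guard<std::mutex> lock(g_chainstate_mutex);  // one block at a time
    for (const auto& s : spends) g_utxo_set.erase(s);
    for (const auto& c : creates) g_utxo_set.insert(c);
    g_best_height = height;
}
```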

Well, I'm new to Bitcoin and don't have much skin in the game. I'm just interested in the project and want to figure out how the economics of development work out.
I can start working toward a proof-of-concept; I want to see if anybody is interested in spending money on it.
legendary
Activity: 1135
Merit: 1166
Multiple cores are currently used at least for signature verification, which is the most compute-intensive process anyway (that's why more than one core is active during sync, as ETFbitcoin noticed).

I don't know how much additional speed could be extracted by making the coin lookup multi-core, but I imagine that this would be harder to do than signature verification because you potentially need to lock data structures. Also, if LevelDB does not already support multiple cores, it would be a bigger (and riskier) effort to replace the entire database with another one.

So I believe that this project is likely not very useful, at least not relative to the effort involved. If you still want to go forward, I suggest that you provide actual numbers on how much further the speed could be improved (in addition to the already-parallelised signature checking).
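To illustrate what I mean by the parallelised signature checking, here is a rough sketch of the pattern (not Bitcoin Core's actual CCheckQueue, just the general idea of batching independent script checks across worker threads):

```cpp
// Sketch: split the independent signature/script checks of a block into
// batches and run the batches on separate threads; the block is accepted only
// if every check passes.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <future>
#include <vector>

using ScriptCheck = std::function<bool()>;  // one input's signature/script check

bool VerifyChecksInParallel(const std::vector<ScriptCheck>& checks, unsigned workers) {
    const unsigned n = std::max(1u, workers);
    const std::size_t per_batch = (checks.size() + n - 1) / n;
    std::vector<std::future<bool>> batches;
    for (std::size_t start = 0; start < checks.size(); start += per_batch) {
        const std::size_t end = std::min(checks.size(), start + per_batch);
        batches.push_back(std::async(std::launch::async, [&checks, start, end] {
            for (std::size_t i = start; i < end; ++i) {
                if (!checks[i]()) return false;  // any failing check fails its batch
            }
            return true;
        }));
    }
    bool ok = true;
    for (auto& f : batches) ok &= f.get();  // the block passes only if every batch does
    return ok;
}
```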
jr. member
Activity: 39
Merit: 25
Initial goal: speeding up Bitcoin's initial validation time based on the number of available cores.

Assuming bandwidth is not an issue for initial validation, I have a project in mind to do this. In the future it would also lower one of the scaling hurdles, namely the processing time needed for validation.

To achieve higher performance we need to expand the number of cores.
GPUs provide more cores; I think at the moment Nvidia offers about 50 compute units per card, and they're cheap right now because of all these mining rigs.
One could achieve even more performance with an SoC designed specifically for search and cryptography operations. https://www.sifive.com/

At the moment bitcoin-core validation takes place in a linear fashion, within a single core.

Here's what I'm thinking.

1. Implementing a new Bitcoin-specific concurrent functional search engine

For validation, the UTXO database is required, which is the list of unspent coins. The search engine only needs an integer for indexing, and stores binary data for the txid, the public key script and the vout.
A simple sketch of it would be: the indexed data is divided into multiple files, each associated with a range of the integer index, which allows each piece to work independently.
Fetches can be done in bulk.
Write procedures are done periodically as a rewrite of the indexed data.
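Here is a minimal code sketch of that layout (the names and types are placeholders I made up for illustration):

```cpp
// Sketch of the shard layout: records keyed by an integer index, partitioned
// into fixed-size ranges; each range would be one file on disk, rewritten
// periodically, and different ranges can be read independently.
#include <cstddef>
#include <cstdint>
#include <map>
#include <optional>
#include <string>
#include <vector>

struct UtxoRecord {
    std::string txid;    // stored as raw bytes in a real index
    uint32_t vout;
    std::string script;  // the output's locking script
};

class ShardedUtxoIndex {
public:
    explicit ShardedUtxoIndex(uint64_t range_size) : range_size_(range_size) {}

    void Insert(uint64_t key, UtxoRecord rec) {
        shards_[key / range_size_][key] = std::move(rec);
    }

    // Bulk fetch: in a file-backed version the keys would first be grouped by
    // shard so each file is read once, possibly by its own thread.
    std::vector<std::optional<UtxoRecord>> FetchBulk(const std::vector<uint64_t>& keys) const {
        std::vector<std::optional<UtxoRecord>> out(keys.size());
        for (std::size_t i = 0; i < keys.size(); ++i) {
            const auto shard = shards_.find(keys[i] / range_size_);
            if (shard == shards_.end()) continue;
            const auto it = shard->second.find(keys[i]);
            if (it != shard->second.end()) out[i] = it->second;
        }
        return out;
    }

private:
    uint64_t range_size_;
    // shard id -> (key -> record); one shard per file in the description above
    std::map<uint64_t, std::map<uint64_t, UtxoRecord>> shards_;
};
```

Because each integer range is self-contained, each shard can be read, rewritten, or owned by its own thread without coordinating with the others.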

2. A queue/merge-based managing core

I believe the best way to manage validation is to do each task in a functional manner, putting tasks into a queue and merging the results.
For example, when the managing core receives a block, it holds the state of each transaction and digests the block into its input coins. It then sends lookup tasks to the search engine, validates the coins against their spending scripts, and merges the results.
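A rough sketch of that pattern (again with placeholder names of my own, not real Bitcoin Core interfaces):

```cpp
// Sketch of the queue/merge idea: the managing core turns a block into
// per-transaction tasks, workers drain the queue (each task fetches the input
// coins and checks the spending scripts), and the verdicts are merged.
#include <algorithm>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct TxTask {
    std::function<bool()> fetch_and_verify;  // look up input coins, check scripts
};

bool ValidateBlockWithQueue(std::vector<TxTask> tasks, unsigned num_workers) {
    std::queue<TxTask> queue;
    for (auto& t : tasks) queue.push(std::move(t));

    std::mutex m;
    bool all_ok = true;

    auto worker = [&] {
        for (;;) {
            TxTask task;
            {
                std::lock_guard<std::mutex> lock(m);
                if (queue.empty()) return;
                task = std::move(queue.front());
                queue.pop();
            }
            const bool ok = task.fetch_and_verify();
            std::lock_guard<std::mutex> lock(m);
            if (!ok) all_ok = false;  // merge: one bad transaction fails the block
        }
    };

    std::vector<std::thread> threads;
    const unsigned n = std::max(1u, num_workers);
    for (unsigned i = 0; i < n; ++i) threads.emplace_back(worker);
    for (auto& t : threads) t.join();
    return all_ok;
}
```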

3. Passive block validation

This means a block's validation goes into the queue without waiting for the previous block to finish. By validating the proof-of-work before queueing the task, DoS attacks won't be an issue.
In this form the validation processes run asynchronously, and the final decision is taken by merging the results.
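A minimal sketch of that flow (the `check_pow` and `validate` callables are placeholders for the real checks):

```cpp
// Sketch of "passive" block validation: the cheap proof-of-work check runs up
// front (so invalid junk can't flood the queue), the expensive per-block work
// is launched asynchronously, and the final decision merges all the results.
#include <functional>
#include <future>
#include <vector>

struct Block { /* stand-in for a deserialized block */ };

bool ValidateChainPassively(const std::vector<Block>& blocks,
                            const std::function<bool(const Block&)>& check_pow,
                            const std::function<bool(const Block&)>& validate) {
    std::vector<std::future<bool>> results;
    for (const Block& b : blocks) {
        if (!check_pow(b)) return false;  // reject before queueing anything
        results.push_back(std::async(std::launch::async, validate, std::cref(b)));
    }
    // Merge: the chain is accepted only if every asynchronous result is good;
    // applying the UTXO changes would still have to happen in height order.
    bool ok = true;
    for (auto& r : results) ok &= r.get();
    return ok;
}
```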


Let me know if you have any questions.
I'm looking for a way to fund this project, as I don't have enough capital to do it myself.