Topic: blockchain

legendary
Activity: 2128
Merit: 1073
August 19, 2012, 12:37:14 PM
#14
Sorry for the newb question: would you mind explaining how to defrag a single folder?
You know, it is hard for me to give a guarantee. There are so many different Windows extensions and so many file systems in Linux, etc.

But the simplest thing to do is (on Windows):

1) Use Windows Explorer to copy files to another physical disk (not another partition on the same disk)
2) Delete the original files
3) Copy back from the backup disk to the original place
4) Keep the copy made in (1) as a backup.

On unmodified Windows the above procedure will greatly reduce the number of fragments, frequently down to the minimum (one per file). But the various real-time ("online") antivirus tools can completely screw this up, so at least exclude the Bitcoin directory from real-time virus protection. On-demand, scan-only virus protection can stay.

If you really don't have a separate physical drive, then at least zip the files, delete the originals, unzip, and keep the zip as the backup.
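
If you'd rather script that copy-out/copy-back than drag files around in Explorer, here is a minimal sketch in Python. The paths are assumptions (the default %APPDATA%\Bitcoin location and a hypothetical D: backup disk); adjust them to your setup, and only run it while bitcoin-qt/bitcoind is shut down:

Code:
import os
import shutil

# Hypothetical paths -- adjust to your own setup.  The backup directory must be
# on a DIFFERENT physical disk, or copying back won't defragment anything.
BITCOIN_DIR = os.path.join(os.environ["APPDATA"], "Bitcoin")
BACKUP_DIR = r"D:\bitcoin-backup"   # must not exist yet

# 1) Copy the data directory to the other physical disk.
shutil.copytree(BITCOIN_DIR, BACKUP_DIR)

# 2) Delete the originals (the client must NOT be running).
shutil.rmtree(BITCOIN_DIR)

# 3) Copy back; the freshly written files land in far fewer fragments.
shutil.copytree(BACKUP_DIR, BITCOIN_DIR)

# 4) Leave BACKUP_DIR where it is, as your backup copy.
print("Done; keep %s as a backup." % BACKUP_DIR)

Step (2) is the scary one, so double-check that step (1) actually finished before letting it run.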
legendary
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
August 19, 2012, 12:13:13 PM
#13
Some defragmentation programs allow that.
sr. member
Activity: 254
Merit: 250
August 19, 2012, 12:06:57 PM
#12
That depends on your drive type, though: in the long term you'll extend hard-disk lifetime through reduced disk seeks, at the cost of extra disk stress during defragmentation.
I just wanted to make 100% clear that I wasn't recommending defragmentation of the whole disk. Defragment the Bitcoin directory only! You can even skip defragmenting the "Bitcoin/database" subdirectory, because BerkeleyDB is smart enough to preallocate its log files in 10MB chunks.

The other cheap way of reducing fragmentation is running with "-printtoconsole". Then the debug output scrolls off your screen instead of being appended to debug.log alongside, and mutually fragmenting with, the blkNNNN.dat files.

Sorry for the newb question: would you mind explaining how to defrag a single folder?
legendary
Activity: 1027
Merit: 1005
August 19, 2012, 08:52:06 AM
#11
Good info, thank you! A touch over my head, but I'm still new to this whole Bitcoin idea, so... I do think it will need to be optimized in the future, otherwise it will require a week's worth of downloading on specialized hardware just to use it, and that may kill Bitcoin, since the general public is not going to put up with that cost just to use a different currency.
legendary
Activity: 2128
Merit: 1073
August 18, 2012, 09:27:40 PM
#10
I hope they work on optimizing the standard client soon, because it's going to get unmanageable very fast given the rate of transactions.
This isn't going to happen soon. Mike Hearn is at least acknowledging the problem:

Re: storing scripts in the index. Yes, eliminating the blkN.dat files and putting all data into the key/value store may be a way to further reduce seeks in future.

Gavin is flat out against using any well known database techniques:

blah blah blah blah
Gentle reminder to the other bitcoin developers: it is generally best not to feed trolls.  Use the ignore button.

So yeah, use one of:

1) software RAM-drives
2) hardware RAM-drives
3) write-behind battery-backed RAID controllers
4) proper SAN filers

EDIT:

5) or even an improper SAN filer: just a separate box running an iSCSI target over a GigE attachment.

to store those vital files in your servers.
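
Whichever of those you pick, the client doesn't need to know anything special about it; you just point it at the fast volume with -datadir. A minimal sketch, where the R:\bitcoin-data mount point is purely an assumption:

Code:
import subprocess

# Hypothetical mount point of the RAM drive / RAID volume / SAN LUN.
DATADIR = r"R:\bitcoin-data"

# -datadir tells bitcoind (or bitcoin-qt) where to keep blkNNNN.dat, the
# index and the BerkeleyDB logs; bitcoind is assumed to be on the PATH.
subprocess.check_call(["bitcoind", "-datadir=" + DATADIR])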
sr. member
Activity: 240
Merit: 250
August 18, 2012, 09:18:10 PM
#9
I hope they work on optimizing the standard client soon, because it's going to get unmanageable very fast given the rate of transactions.  I'm currently testing out running the client in a RAM drive ( http://www.ltr-data.se/opencode.html/#ImDisk ).  That way I can "cheat" when it comes to all the writes and fragmentation.  Whenever I need to power down, I can just save an image of the RAM drive to my SSD and restore it when I start back up.  That way I don't have to deal with destroying my SSD or the fragmentation and speed issues of my HDD.
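
For anyone who wants to script that save/restore cycle instead of (or in addition to) relying on ImDisk's own image saving, a minimal sketch could look like the following; the R:\Bitcoin RAM-drive path and the C:\bitcoin-image location on the SSD are assumptions:

Code:
import os
import shutil
import subprocess

RAM_DATADIR = r"R:\Bitcoin"            # data directory on the RAM drive (hypothetical)
PERSISTENT_COPY = r"C:\bitcoin-image"  # durable copy on the SSD (hypothetical)

# Restore the last saved state onto the (empty) RAM drive.
if os.path.isdir(PERSISTENT_COPY) and not os.path.isdir(RAM_DATADIR):
    shutil.copytree(PERSISTENT_COPY, RAM_DATADIR)

# Run the client against the RAM drive; all the heavy writes stay in RAM.
# check_call blocks until the client exits.
subprocess.check_call(["bitcoin-qt", "-datadir=" + RAM_DATADIR])

# After a clean shutdown, save the state back to the SSD in one sequential pass.
if os.path.isdir(PERSISTENT_COPY):
    shutil.rmtree(PERSISTENT_COPY)
shutil.copytree(RAM_DATADIR, PERSISTENT_COPY)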
legendary
Activity: 2128
Merit: 1073
August 18, 2012, 08:58:20 PM
#8
That depends on your drive type, though: in the long term you'll extend hard-disk lifetime through reduced disk seeks, at the cost of extra disk stress during defragmentation.
I just wanted to make 100% clear that I wasn't recommending defragmentation of the whole disk. Defragment the Bitcoin directory only! You can even skip defragmenting the "Bitcoin/database" subdirectory, because BerkeleyDB is smart enough to preallocate its log files in 10MB chunks.

The other cheap way of reducing fragmentation is running with "-printtoconsole". Then the debug output scrolls off your screen instead of being appended to debug.log alongside, and mutually fragmenting with, the blkNNNN.dat files.
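
If you don't want the output to just scroll away, you can combine "-printtoconsole" with a redirect to a log file on a different physical disk, so the log and the blkNNNN.dat files never interleave on the same platter. A minimal sketch, with the log path being an assumption:

Code:
import subprocess

# Hypothetical log location on a DIFFERENT physical disk than the Bitcoin datadir.
LOG_PATH = r"D:\logs\bitcoin-debug.log"

with open(LOG_PATH, "a") as log:
    # -printtoconsole sends the debug output to stdout instead of appending
    # to debug.log inside the data directory; we capture it elsewhere.
    subprocess.check_call(["bitcoind", "-printtoconsole"],
                          stdout=log, stderr=subprocess.STDOUT)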
newbie
Activity: 49
Merit: 0
August 18, 2012, 08:48:54 PM
#7
That depends on your drive type, though: in the long term you'll extend hard-disk lifetime through reduced disk seeks, at the cost of extra disk stress during defragmentation. Yes, it doesn't apply to SSDs or other flash storage media.
legendary
Activity: 2128
Merit: 1073
August 18, 2012, 08:45:01 PM
#6
I don't understand how defragging will help the download speed... or anything else, really, for that matter. Care to explain?
During the download every transaction in every block gets verified against its source transactions. The layout of the data (separate files for the index of transactions and for the actual transaction scripts) causes an immense number of disk seeks. Seeking into fragmented files is doubly slow, because the OS first has to seek to the MFT (Windows) or the indirect index blocks (Linux) before it can seek to the data itself.

So the block download process is disk-bound, not network-bound.
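
To make the seek pattern concrete, here is an illustrative sketch (not the actual client code) of what verifying one block costs in disk terms; the in-memory dict stands in for the transaction index the client keeps in blkindex.dat:

Code:
import os

# Stand-in for the transaction index (blkindex.dat, BerkeleyDB in the real
# client): txid -> (block file number, byte offset of the source transaction).
tx_index = {}

def verify_block(block, datadir):
    for tx in block["transactions"]:
        for prev_txid, prev_vout in tx["inputs"]:
            # One lookup into the index (its own set of seeks on disk)...
            file_no, offset = tx_index[prev_txid]
            # ...then a random seek into a separate blkNNNN.dat to read the
            # source transaction's script.  On a fragmented file the OS first
            # has to consult the MFT / indirect blocks to locate the extent.
            path = os.path.join(datadir, "blk%04d.dat" % file_no)
            with open(path, "rb") as f:
                f.seek(offset)
                prev_tx_bytes = f.read(4096)
            # ...script verification against prev_tx_bytes happens here.

Multiply that by every input of every transaction in the whole chain and seek latency dominates long before network bandwidth does.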

I wouldn't actually recommend interrupting the initial download to defragment. Just defragment the Bitcoin directory right before starting bitcoin{d,-qt} again after it has sat idle for, say, a weekend.

This is a pick-your-poison kind of choice: on rotational disks it is slow.
If you try to run it off an SSD, then bitcoin{d,-qt} exhibits pessimal behavior for SSD media: very high write amplification on the blkNNNN.dat files and lots of writes with no reads whatsoever on the BerkeleyDB write-ahead log files. If you run it on an MLC SSD, the initial blockchain download alone puts at least a couple of months' worth of wear on the MLC flash.

People tried to run bitcoind on the ultra-cheap Dell nettops with Linux and an 8GB MLC disk drive. Quite often the 8GB SSD would fail before even completing the initial blockchain download. That was back when Satoshi's bitcoin client still used wxWidgets rather than Qt, but the underlying data storage hasn't meaningfully changed since. The only optimization so far was to keep just the two most recent BerkeleyDB logs instead of the higher, variable number kept from start to finish.
legendary
Activity: 1027
Merit: 1005
August 18, 2012, 08:20:11 PM
#5
I don't understand how defragging will help the download speed... or anything else, really, for that matter. Care to explain?
newbie
Activity: 58
Merit: 0
August 18, 2012, 09:52:50 AM
#4
Yes, the blocks are getting bigger.
It seems the problem will be solved by having only miners and power users keep the full block chain, while everyone else runs lightweight clients. What will happen to miners in 10 years we don't know yet - some say bitcoins will be so popular that miners will be able to buy some really expensive hardware with all the transaction fees ;)
legendary
Activity: 2128
Merit: 1073
August 18, 2012, 09:48:32 AM
#3
In addition to the above (bigger blocks) there is the issue of fragmentation on disk. The blkNNNN.dat files end up pretty much maximally fragmented (as fragmented as an append-only file can get), and they also mutually fragment with the simultaneously written debug.log.

Just defragment the bitcoin directory after the chain download finishes for a significant speedup. And keep defragmenting it regularly.
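
One way to keep up with the "keep defragmenting it regularly" part is to script a per-folder defragmenter and run it from Task Scheduler while the client is shut down. The sketch below shells out to Sysinternals Contig; that Contig is installed and on the PATH, and that its -s switch recurses into subdirectories, are assumptions here - check contig.exe's own help output on your version first:

Code:
import os
import subprocess

# Hypothetical default location of the Bitcoin data directory on Windows.
BITCOIN_DIR = os.path.join(os.environ["APPDATA"], "Bitcoin")

# Defragment only the files under the Bitcoin directory, not the whole disk.
# Assumes Sysinternals contig.exe is on the PATH and handles the wildcard and
# the -s (recurse) switch as documented -- verify with "contig /?" first.
subprocess.check_call(["contig", "-s", os.path.join(BITCOIN_DIR, "*")])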
legendary
Activity: 1022
Merit: 1000
August 18, 2012, 09:43:37 AM
#2
When downloading the blockchain with the Bitcoin client for the first time, why does it seem to slow down as it gets closer to 100%? At first it was downloading what looked like thousands of blocks a second, and now I have 400 left and it has taken minutes to drop below that. I'm on a 50MB connection too, btw.

More and more transactions, so the blocks are getting bigger - even more so since SatoshiDice.
legendary
Activity: 1027
Merit: 1005
August 18, 2012, 09:36:27 AM
#1
When downloading the blockchain with the Bitcoin client for the first time, why does it seem to slow down as it gets closer to 100%? At first it was downloading what looked like thousands of blocks a second, and now I have 400 left and it has taken minutes to drop below that. I'm on a 50MB connection too, btw.

On a side note... the total downloaded size is over 3GB right now. Imagine how big this thing will be in 10 years...