How much space does the DB require?
Guess I'll just have to test it on the VPS this weekend. My test VM with 16 GB of storage ran out of disk space after ~800k blocks.
Also, I can't find a way to specify the MySQL host in the configuration file. Does it absolutely need a local MySQL server to function? I'd prefer using a remote MySQL database that could handle all the queries more easily.
Can I change the host address in comm.py without breaking it?

conn = pymysql.connect(db=CONFIG["database"]["dbname"], host='127.0.0.1', port=3306, user=CONFIG["database"]["dbuser"], passwd=CONFIG["database"]["dbpassword"])
Changed the host and it works with a remote server.
Lol... with abe it was a little harder to get this working...
Lookin' forward to a completely loaded DB \(^_ ^)/
The size of the database depends greatly on the number of transactions in the block chain.
Using HoboNickels as an example:
~2.3 Million blocks - ~6.5 million TX_IN and ~6.5 million TX_OUT
Size of the active database: 6.7 GB
I will add an option to set the host address for the database in the next commit. Personally, I have never had need for it so I never thought to put it in.
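A minimal sketch of what that option could look like, assuming the existing CONFIG dictionary structure shown in comm.py above; the `dbhost` and `dbport` key names are assumptions, not the project's actual config schema:

```python
# Hypothetical CONFIG as loaded from the configuration file.
# "dbhost"/"dbport" are assumed names for the new options.
CONFIG = {
    "database": {
        "dbname": "explorer",
        "dbuser": "explorer",
        "dbpassword": "secret",
        "dbhost": "192.0.2.10",  # remote MySQL server
    }
}

def connection_kwargs(config):
    """Build pymysql.connect() arguments, falling back to a local
    MySQL server when no host is configured (current behaviour)."""
    db = config["database"]
    return {
        "db": db["dbname"],
        "host": db.get("dbhost", "127.0.0.1"),
        "port": int(db.get("dbport", 3306)),
        "user": db["dbuser"],
        "passwd": db["dbpassword"],
    }

# conn = pymysql.connect(**connection_kwargs(CONFIG))
```

With the `.get()` fallbacks, existing config files without the new keys keep working against a local server unchanged.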
This is happening on my server every few weeks. We make a getblock call to the altcoin daemon for each block. After a few blocks the bitcoind RPC hangs (I have tried waiting for 30 minutes, but it doesn't help; I have to kill the daemon and restart it). I'm also using a node to make RPC calls.
This is why there are two timers in the database loader. The main loop timer is set at 5 minutes, while individual RPC calls are set at 10 seconds. Since adding the timers I have not had to reset the loader or the daemons. If the loader gets interrupted or times out, the next time it is called it re-parses the last 5 blocks, including transactions; this helps guarantee information integrity if the loader is interrupted in the middle of parsing blocks/transactions. I have not noticed the coin daemons' RPC completely locking up to where they will not answer subsequent calls; they appear only to ignore/hang on one call.
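A rough sketch of that recovery logic, under stated assumptions: `rpc_call` is a stand-in for the loader's JSON-RPC client (assumed to raise on a timed-out call), and the function names are illustrative, not the loader's actual code:

```python
REPARSE_WINDOW = 5   # blocks re-parsed after an interrupted run
RPC_TIMEOUT = 10     # seconds allowed per individual RPC call
                     # (enforced inside the hypothetical rpc_call)

def resume_height(last_parsed_height):
    """Height to restart from: back up REPARSE_WINDOW blocks so an
    interrupted run cannot leave half-parsed data in the database."""
    return max(0, last_parsed_height - REPARSE_WINDOW)

def load_blocks(rpc_call, last_parsed_height, tip_height):
    """Re-parse the safety window, then continue to the chain tip.
    A hung daemon only stalls one call for RPC_TIMEOUT seconds; the
    next loader cycle retries from the safety window."""
    for height in range(resume_height(last_parsed_height), tip_height + 1):
        block_hash = rpc_call("getblockhash", height)
        block = rpc_call("getblock", block_hash)
        # ... parse the block and its transactions into the database ...
```

The point of the window is that re-parsing a block is idempotent, so overwriting the last few blocks is cheap insurance against a mid-block interruption.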
Things like the rich list and largest TX... are those processed after all the blocks are loaded, or when invoking stats.py? I imagine it will take some time to process on 1.7M blocks.
The stats module is called when the database loader has completed its cycle. However, the stats module can also be run independently, though the information will only be accurate up to the point the database is loaded. Largest TX is populated on the fly as the loader parses transactions, and again will only be accurate up to the point where the database is loaded.
Added - Running the stats.py module while the loader is running might interrupt the loader due to the way Python handles module imports. If one really wants to run the stats module while the loader is running, I would suggest making a copy of comm.py named something like comm2.py and changing the import in stats.py to import the copy instead of comm.py.