
Topic: [ANNOUNCE] Abe 0.7: Open Source Block Explorer Knockoff - page 32. (Read 220986 times)

legendary
Activity: 2576
Merit: 2267
1RichyTrEwPYjZSeAYxeiFBNnKC9UjC5k
I'll have to have a closer look. I'm thinking I'd like something like this, but more for event-based notification than for aggregate/historical data.

There may be some way to improve the speed of the inserts too. I'll take a look at that.
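For example (not Abe's loader code; the table and column names are made up for illustration), batching rows with the DB-API executemany() and committing once per batch instead of once per row can make a big difference:
Code:
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txout (tx_id INTEGER, pos INTEGER, value INTEGER)")
rows = [(i, 0, 5000000000) for i in range(100000)]

start = time.time()
conn.executemany("INSERT INTO txout (tx_id, pos, value) VALUES (?, ?, ?)", rows)
conn.commit()  # one commit for the whole batch instead of one per row
print("batched insert: %.2fs" % (time.time() - start))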
hero member
Activity: 481
Merit: 529
Is most of the slowness with importing the blockchain in the SQL inserts?
I think so, but I don't have profile data.
legendary
Activity: 2576
Merit: 2267
1RichyTrEwPYjZSeAYxeiFBNnKC9UjC5k
Is most of the slowness with importing the blockchain in the SQL inserts?
hero member
Activity: 481
Merit: 529
A bug has caused http://abe.john-edwin-tobey.org to lose track of the bitcoins outstanding as of Block 230560.  Details are in TODO.txt.

This affects the experimental no-statistics branch and possibly the master branch.  It appears to depend on multiple concurrent loading processes.  For best results, use the master branch and avoid concurrent loading until this is fixed.  To avoid concurrent loading, either use a single process (not FastCGI) or have all but one process use an empty datadir list ("datadir []" in abe.conf).  When I set up FastCGI, I use datadir=[] and create a separate loader job that runs Abe continuously in a loop with the live datadir(s) and --no-serve.
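For reference, a minimal sketch of such a loader job, assuming the documented "python -m Abe.abe" invocation (the config file name is a placeholder):
Code:
# The FastCGI workers run with "datadir []"; this loop is the only writer.
import subprocess
import time

while True:
    # --no-serve makes Abe catch up the database and exit instead of serving pages.
    subprocess.call(["python", "-m", "Abe.abe",
                     "--config", "abe-loader.conf",  # placeholder: conf with the live datadir(s)
                     "--no-serve"])
    time.sleep(60)  # wait a minute before polling for new blocks again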

Sorry for any inconvenience!
hero member
Activity: 481
Merit: 529
@salfter

Thanks for the education.  It would be pretty easy to add JSON and JSONP support for API functions.  I'll add it to the to-do list and create an issue on GitHub.  Patches are welcome.
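To sketch what that might look like (the function and parameter names below are hypothetical, not Abe's actual handler API):
Code:
import json

def jsonp_response(value, callback=None):
    # Wrap a plaintext API result in JSON; with a callback parameter, wrap it
    # again as a function call so a <script src=...> tag can consume it (JSONP).
    body = json.dumps({"result": value})
    if callback:
        return "%s(%s)" % (callback, body), "application/javascript"
    return body, "application/json"

print(jsonp_response("37.52104559", callback="showBalance")[0])
# prints: showBalance({"result": "37.52104559"})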
hero member
Activity: 651
Merit: 501
My PGP Key: 92C7689C
I have a P2Pool monitor webapp:

http://alfter.us/p2pool.html

I'd like to have it display the balance of the default payout address for the P2Pool instance.  In my case, I'm mining Litecoin, so I had considered hitting up the Litecoin block explorer for this information: subtract the result of the getsentbyaddress query from the result of the getreceivedbyaddress query and you're done.

Actually getting that information into your browser with just JavaScript is troublesome, though.  Obtaining data from another domain is usually blocked by the browser.  The existing queries against P2Pool are handled by YQL, but the Litecoin block explorer has a robots.txt which blocks YQL.  jQuery would facilitate direct queries, but only if the data source supports JSONP, and most of the Abe API calls return plaintext, not JSON or JSONP.

I'm getting my own copy of Abe up and running against the Litecoin blockchain so that I can at least aim YQL queries at it (it'll take nearly any kind of data and format it as either XML or JSON), but are there any plans to add/improve JSON support in the Abe API?  Is there some other approach to what I'm trying to do that would work?  The more functionality I can keep in the client, the better.  (I could've knocked together some PHP to do what I want, but that seems a bit like cheating.)
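For comparison, the received-minus-sent computation done server-side in Python 2 against an Abe instance's blockexplorer-style /q/ API looks roughly like this; the host and address are placeholders, and the URL layout should be checked against the actual install:
Code:
import urllib2
from decimal import Decimal

ABE = "http://abe.example.com"        # placeholder: your own Abe instance
ADDR = "LitecoinPayoutAddressHere"    # placeholder: the P2Pool payout address

def q(call):
    # Abe's /q/ calls return a plain decimal string in the response body.
    return Decimal(urllib2.urlopen("%s/q/%s/%s" % (ABE, call, ADDR)).read().strip())

print(q("getreceivedbyaddress") - q("getsentbyaddress"))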
hero member
Activity: 481
Merit: 529
John,

Does the current master branch work with bitcoind 0.8.1 if the -txindex=1 setting is used?
What about the no-statistics branch?
Can I leave bitcoind 0.8.1 running while using Abe?

In theory, both branches work with and without txindex.  You only need txindex if you want to load data over RPC from bitcoind.  Benefits of RPC loading are (1) you see unconfirmed transactions, and (2) large new blocks show up faster because most of the transactions are already in (provided that you frequently force a catch-up by loading a page or running in --no-serve mode).  See the comments about "default-loader" in abe.conf or Issue #17 about setting up RPC.

You might even manage to run in RPC mode without txindex, given an already loaded database, but this situation is vulnerable to a condition where bitcoind processes an output and its spending input before Abe loads the output.  So I specify txindex.

Note that I mistakenly documented a need to run "bitcoind -rescan" the first time with txindex in effect; that should be "bitcoind -reindex" instead.

In practice, I sometimes see the following error [EDIT: fixed] with bitcoind 0.8.1 when a blockfile other than the current one contains trailing NULs (zero bytes).
Code:
MerkleRootMismatch: Block header Merkle root does not match its transactions. block hash=14508459b221041eab257d2baaa7459775ba748246c8403609eb708f0e57e74b
I work around it by trimming the NULs from the file (using /usr/bin/truncate in Linux and the blkfile_offset value from the error trace) and restarting.  I would like to fix this in the code.
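The same workaround in Python, for reference (back up the blockfile first; the path and offset below are examples only and should come from Abe's error trace):
Code:
path = "/home/user/.bitcoin/blocks/blk00072.dat"  # example path only
blkfile_offset = 123456789                        # value reported in Abe's error trace

# Cut the file at the reported offset, removing the trailing NULs.
with open(path, "r+b") as f:
    f.truncate(blkfile_offset)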

In theory, reading the blockfiles while bitcoind runs is unsafe due to race conditions and the use of an undocumented feature.  In practice, I do not recall this ever causing a problem until Bitcoin 0.8, and the problems related to running simultaneously with 0.8.x are fixed.  Abe checks each block's Merkle root so as not to load a corrupt or incomplete block, so at worst, recovering from an error should only require resetting the datadir row to the start of the current file (or passing --rescan to Abe for a slower but easier fix).
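For context, the Merkle-root check amounts to hashing the transaction hashes pairwise with double SHA-256, duplicating the last hash on odd-length levels, until one hash remains, and comparing that with the block header.  A minimal illustration (not Abe's own code):
Code:
import hashlib

def dsha256(data):
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes):
    level = list(tx_hashes)           # 32-byte hashes in block order
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])   # duplicate the last hash on odd levels
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]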

Also, do you know any way to check that -txindex=1 is indeed active and was used for the last reindex? My ~/.bitcoin/blocks is currently 8.6GiB of which 837MiB is index. ~/.bitcoin/chainstate is 208MiB.

I am sure there is some file that will be a certain size, but I don't know it.  My blocks, index, and chainstate directories are similar to yours in size, and I have a confirmed working txindex.  The way I know is by finding a transaction all of whose outputs are spent and fetching it with "bitcoind getrawtransaction".  If it says "transaction not in index", that's bad; if it gives you a hex string, that's good.
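A scripted version of that check (bitcoind must be on PATH, and the txid is a placeholder you must replace with one whose outputs are all spent):
Code:
import subprocess

txid = "replace-with-a-txid-whose-outputs-are-all-spent"  # placeholder

try:
    raw = subprocess.check_output(["bitcoind", "getrawtransaction", txid])
    print("got %d hex characters: txindex appears to be working" % len(raw.strip()))
except subprocess.CalledProcessError:
    print("lookup failed: txindex is probably off (or the txid is wrong)")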

Thanks for your work, I just donated ฿0.2.

Thank you for your support.
newbie
Activity: 13
Merit: 0
John,

Does the current master branch work with bitcoind 0.8.1 if the -txindex=1 setting is used?
What about the no-statistics branch?
Can I leave bitcoind 0.8.1 running while using Abe?

Also, do you know any way to check that -txindex=1 is indeed active and was used for the last reindex? My ~/.bitcoin/blocks is currently 8.6GiB of which 837MiB is index. ~/.bitcoin/chainstate is 208MiB.

Thanks for your work, I just donated ฿0.2.
sr. member
Activity: 426
Merit: 250
Absolutely awesome!
hero member
Activity: 481
Merit: 529
The experimental branch (no-statistics) now imports UNCONFIRMED (MEMPOOL) transactions (as well as blocks) over RPC from a running bitcoind, provided that:

  • Bitcoind is new enough to support getrawmempool, getrawtransaction, getblock, and getblockhash (tested 0.8.x; 0.7.x may work)
  • If using bitcoind 0.8.x, -txindex is in effect (see here)
  • The datadir.chain_id column is populated: for BTC, this means executing the following SQL: UPDATE datadir SET chain_id = 1 WHERE chain_id IS NULL

My demonstration server satisfies these requirements and includes unconfirmed transactions in search results.

This feature ought to be easily portable to the master branch, in case anyone wants it.  However, I'd feel more comfortable with a configuration option to tell Abe whether or not to use RPC.  By default, it tries RPC and falls back to blockfile scans if that fails.  Also, the new RPC code could use testing and improvements.
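For anyone porting it, a rough sketch of the RPC round-trips involved (this is not Abe's loader code; the port and credentials are placeholders from bitcoin.conf):
Code:
import base64
import json
import urllib2

RPC_URL = "http://127.0.0.1:8332"
AUTH = base64.b64encode("rpcuser:rpcpassword")  # placeholders from bitcoin.conf

def rpc(method, *params):
    req = urllib2.Request(RPC_URL,
                          json.dumps({"method": method, "params": list(params), "id": 1}),
                          {"Authorization": "Basic " + AUTH,
                           "Content-Type": "application/json"})
    return json.loads(urllib2.urlopen(req).read())["result"]

# Mempool transactions appear here before they are in any block.
for txid in rpc("getrawmempool"):
    raw_hex = rpc("getrawtransaction", txid)  # -txindex needed on 0.8.x for non-mempool lookups
    # ... deserialize raw_hex and import it as an unconfirmed transaction ...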
sr. member
Activity: 426
Merit: 250
I would pledge up to 2 BTC (for full support).

Anyone else?

4 BTC (at today's prices)
donator
Activity: 543
Merit: 500
I would pledge up to 2 BTC (for full support).

Anyone else?
hero member
Activity: 481
Merit: 529
Any news on multisig support? Are you working on it or waiting for donations? Smiley
i would contribute to a multisig bounty.
Like a zero-fee transaction, eventually it'll get in.  Probably.  Smiley  In the absence of a business model, my priorities are driven by my personal needs, which lately are about space efficiency, bitcoind 0.8.x compatibility, and integrating contributions from other developers.  Donations do encourage me to work, at roughly $40/hour for features that I want to add anyway, up to $100 for site-specific things that don't improve Abe for most users.  (I'd quote a price in BTC, but BTC/USD is too volatile, as I have ranted in other threads, for reasons that led me to create Abe in the first place.)

I'd like to get clear what "multisig support" means.  Now that I have a full database, I can search for new transaction types.  There have been a few hundred transaction outputs that use OP_CHECKMULTISIG as specified in BIP 11 M-of-N Standard Transactions.  Blockchain.info calls these "escrow" outputs.  Examples: 1 2.  Given an address that redeemed an M-of-N output, Blockchain.info lists the transaction, as in this example.  But if the address did not sign it, its page does not list the transaction: example.  Mapping from addresses back to these transactions will take some extra work.

There have also been a few hundred outputs as in BIP 16 Pay to Script Hash (P2SH).  These have receiving addresses that start with "3" as per BIP 13.  Example: 1.  These output scripts are opaque, but the redeeming input reveals one or more addresses.  As far as I know, Blockchain.info does not correlate an address so revealed with the original P2SH transaction.  Perhaps it would be useful to do so, and this would require some extra work.

There have also been a few thousand regular pubkey outputs, as in this example, that Abe cannot parse because the public key is shorter than usual.  This has nothing to do with multisig and is really a bug in Abe, but it would be nice to fix.  There are also a thousand or so outputs of miscellaneous types unrecognized by Abe.  They may all be nonstandard, but it would be nice to make sense of any that are frequent or demonstrably redeemable.

ThomasV sent me this:

Hi,

it would be nice if Abe could support multisig addresses. For this I think you only need to update deserialize.py.

I did it in Electrum, so you can lookup Electrum's deserialize.py file:
https://github.com/spesmilo/electrum/blob/master/lib/deserialize.py
(you don't want to use that file as is, it probably has other differences)

see at the end, the function called get_address_from_input_script.

It is not directly usable: Abe would have to pass the script to the function where currently it passes pubkey_hash to another function.  But this would be very easy and would add at least some value regarding P2SH.

A full implementation of M-of-N, P2SH, and mapping from address back to M-of-N transaction would take me about 20 hours, so figure 11 BTC at current rates.
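As a point of reference for the P2SH part, the BIP 13 address derivation itself is short: the address is the Base58Check encoding of HASH160(redeem script) with version byte 0x05, which is why these addresses start with "3".  A standalone Python 2 illustration, not Abe code:
Code:
import hashlib

B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def hash160(data):
    # HASH160 = RIPEMD-160 of SHA-256
    return hashlib.new("ripemd160", hashlib.sha256(data).digest()).digest()

def base58check(version, payload):
    data = version + payload
    checksum = hashlib.sha256(hashlib.sha256(data).digest()).digest()[:4]
    n = int((data + checksum).encode("hex"), 16)
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = B58[r] + out
    pad = len(data + checksum) - len((data + checksum).lstrip("\x00"))
    return "1" * pad + out   # leading zero bytes become leading "1" characters

def p2sh_address(redeem_script):
    # redeem_script is the raw serialized script, e.g. an M-of-N CHECKMULTISIG script
    return base58check("\x05", hash160(redeem_script))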
donator
Activity: 543
Merit: 500
sr. member
Activity: 426
Merit: 250
Any news on multisig support? Are you working on it or waiting for donations? Smiley
i would contribute to a multisig bounty.
donator
Activity: 543
Merit: 500
Any news on multisig support? Are you working on it or waiting for donations? Smiley
hero member
Activity: 481
Merit: 529
FYI, http://abe.john-edwin-tobey.org is current with the BTC chain using the no-statistics branch and the HOMEPAGE template variable to make chain/Bitcoin the default view.  It uses the new abe_loader script (in tools/) to stay up-to-date, and the FCGI process has read-only SQL permissions.

Problems?  Please report them here or by email to [email protected].
hero member
Activity: 481
Merit: 529
The next Namecoin block, 99502, has a size of 52606 bytes. In order to allow 99502 (and subsequent blocks) to be imported into the Abe database, I dropped the txout_detail view (because PostgreSQL does not allow changing the column types in tables used by views), removed the length restriction from the column txout_scriptpubkey in the table txout, and recreated the txout_detail view.

I'm sure this solution is dangerous if someone manages to stuff much larger blocks into a block chain, but I do not know a reasonable limit for this column yet.

Thanks for the report.  It's too bad that junk got into the Namecoin blockchain.  Do you have the actual script length?  http://explorer.dot-bit.org/tx/1474553 shows the transaction size as 32767, and the long script would be most of that, but that explorer may have truncated it.  You can verify this with something like:

Code:
SELECT MAX(LENGTH(txout_scriptPubKey)) FROM txout JOIN tx USING (tx_id) WHERE tx_hash = '0bb558f73a543f2631acbd8c5614d3ed2171eb710a586b4485b8303d4a4a0b61';

I'll consider simply increasing the column width, but that has the side effect of making MySQL create the column as type TEXT rather than VARCHAR (unless using binary-type=binary).  This may not be so bad.  The column does not take part in any joins, and the vanilla app never searches it.  Anyone?
newbie
Activity: 58
Merit: 0
My Abe installation was stuck on Namecoin block 99501 (2013-03-10 13:15:28). Attempting to rescan the block chain yielded the following error:

Code:
block 326226 already in chain 3
commit
Exception at 463166665
Failed to catch up {'blkfile_number': 1, 'dirname': '/home/notawake/.namecoin', 'chain_id': None, 'id': Decimal('2'), 'blkfile_offset': 463114051}
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2428, in catch_up
    store.catch_up_dir(dircfg)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2493, in catch_up_dir
    store.import_blkdat(dircfg, ds, blkfile['name'])
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2626, in import_blkdat
    store.import_block(b, chain_ids = chain_ids)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 1662, in import_block
    tx['tx_id'] = store.import_and_commit_tx(tx, pos == 0)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2147, in import_and_commit_tx
    tx_id = store.import_tx(tx, is_coinbase)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2091, in import_tx
    store.binin(txout['scriptPubKey']), pubkey_id))
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 464, in sql
    store.cursor.execute(cached, params)
DataError: value too long for type character varying(20000)

The next Namecoin block, 99502, has a size of 52606 bytes. In order to allow 99502 (and subsequent blocks) to be imported into the Abe database, I dropped the txout_detail view (because PostgreSQL does not allow changing the column types in tables used by views), removed the length restriction from the column txout_scriptpubkey in the table txout, and recreated the txout_detail view.

I'm sure this solution is dangerous if someone manages to stuff much larger blocks into a block chain, but I do not know a reasonable limit for this column yet.
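For anyone hitting the same limit, the same procedure sketched through psycopg2 (the txout_detail view definition must be restored from Abe's schema and is omitted here):
Code:
import psycopg2

conn = psycopg2.connect("dbname=abe")  # adjust connection parameters for your setup
cur = conn.cursor()
cur.execute("DROP VIEW txout_detail")  # PostgreSQL will not alter a column that a view depends on
cur.execute("ALTER TABLE txout ALTER COLUMN txout_scriptpubkey TYPE VARCHAR")  # drop the 20000-char limit
# cur.execute("CREATE VIEW txout_detail AS ...")  # recreate from Abe's schema definition
conn.commit()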
hero member
Activity: 481
Merit: 529
Is there a list somewhere of all of the publicly-available servers/websites running Abe?  When blockchain.info was down, I went looking for one, but couldn't find any that were still running.
The wiki page links to http://abe.bitcoinstats.org:2750/, which appears current.