
Topic: C# Node - page 2. (Read 4688 times)

donator
Activity: 1218
Merit: 1079
Gerald Davis
July 18, 2013, 04:58:57 PM
#27
The blockchain hasn't finished growing, and it's already pretty large to be storing in a relational data store.  I'd be tempted to just store it in the file system.  Most people are running a journaling file system these days, so making a backup before doing any work might be sufficient.

For a standard node you are right: there is likely very little use in storing the full blocks in a database.  For efficiency, full nodes generally just validate each block, store the header, and use the block to update the UTXO set; in essence they use full blocks only to build the UTXO set.  Full nodes normally never need the historical blockchain except to build the UTXO set in a trustless manner.  For most nodes a flat file is more than sufficient, which is why the reference client does just that.
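To make that concrete, here is a toy sketch of building a UTXO set from full blocks; the Tx/TxInput types and the string-keyed dictionary are illustrative only, not BitSharp's actual structures:

    using System.Collections.Generic;

    class TxInput { public string PrevTxHash; public int PrevOutputIndex; }

    class Tx
    {
        public string Hash;
        public List<TxInput> Inputs = new List<TxInput>();
        public List<byte[]> Outputs = new List<byte[]>();
    }

    class UtxoBuilder
    {
        // Maps "txHash:outputIndex" to the raw unspent output.
        private readonly Dictionary<string, byte[]> utxo = new Dictionary<string, byte[]>();

        public void ApplyBlock(IEnumerable<Tx> transactions)
        {
            foreach (var tx in transactions)
            {
                // Each input spends (removes) a previously unspent output.
                // (A real implementation would special-case the coinbase input.)
                foreach (var input in tx.Inputs)
                    utxo.Remove(input.PrevTxHash + ":" + input.PrevOutputIndex);

                // Each output becomes spendable until a later input consumes it.
                for (int i = 0; i < tx.Outputs.Count; i++)
                    utxo[tx.Hash + ":" + i] = tx.Outputs[i];
            }
        }
    }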

However, I think it IS useful as a development platform to parse and store the blockchain in a database.  This is useful for building higher-level tools for analysis.  I imagine that is how sites like blockexplorer and blockchain.info work.


newbie
Activity: 5
Merit: 0
July 18, 2013, 04:46:26 PM
#26
Thought about the blockchain size issue again: if this is for sure Windows-only, you may want to look at the built-in ISAM database in Windows known as the Extensible Storage Engine (ESE).  It's the engine behind Active Directory and Exchange Server, it provides ACID data storage up to terabytes, and it comes with Windows.

More info here:
http://msdn.microsoft.com/en-us/library/gg269259(v=exchg.10).aspx

And since this is a system API exposed through a native DLL, here is a CodePlex project wrapping the DLL in managed code.
http://managedesent.codeplex.com/documentation
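For a flavor of the managed wrapper, here is a minimal sketch using ManagedEsent's PersistentDictionary, which persists to an ESE database on disk; the directory name and the base64 encoding of the block bytes are illustrative (check the docs for which value types it supports):

    using System;
    using Microsoft.Isam.Esent.Collections.Generic;

    class EsentExample
    {
        static void Main()
        {
            // Creates (or reopens) an ESE database in the "BlockStore" directory.
            using (var blocks = new PersistentDictionary<string, string>("BlockStore"))
            {
                blocks["block-0"] = Convert.ToBase64String(new byte[] { 0x01, 0x02 });

                string raw;
                if (blocks.TryGetValue("block-0", out raw))
                {
                    Console.WriteLine(Convert.FromBase64String(raw).Length);
                }
            }
        }
    }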
newbie
Activity: 5
Merit: 0
July 17, 2013, 11:32:21 PM
#25
You may want to use the repository pattern to abstract data access.
http://msdn.microsoft.com/en-us/library/ff649690.aspx

Make it testable with inversion of control and dependency injection.
http://msdn.microsoft.com/en-us/library/aa973811.aspx
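A minimal sketch of how the two suggestions fit together; IBlockRepository, FileBlockRepository, and BlockValidator are illustrative names, not anything from BitSharp:

    using System.IO;

    // The repository abstracts where blocks live (file system, database, memory).
    public interface IBlockRepository
    {
        byte[] GetBlock(string blockHash);
        void StoreBlock(string blockHash, byte[] rawBlock);
    }

    // One concrete back-end: raw blocks as files named by hash.
    public class FileBlockRepository : IBlockRepository
    {
        private readonly string baseDirectory;

        public FileBlockRepository(string baseDirectory)
        {
            this.baseDirectory = baseDirectory;
        }

        public byte[] GetBlock(string blockHash)
        {
            return File.ReadAllBytes(Path.Combine(baseDirectory, blockHash));
        }

        public void StoreBlock(string blockHash, byte[] rawBlock)
        {
            File.WriteAllBytes(Path.Combine(baseDirectory, blockHash), rawBlock);
        }
    }

    // Consumers take the abstraction via the constructor, so a unit test can
    // inject an in-memory fake instead of touching the disk.
    public class BlockValidator
    {
        private readonly IBlockRepository blocks;

        public BlockValidator(IBlockRepository blocks)
        {
            this.blocks = blocks;
        }
    }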

The blockchain hasn't finished growing, and it's already pretty large to be storing in a relational data store.  I'd be tempted to just store it in the file system.  Most people are running a journaling file system these days, so making a backup before doing any work might be sufficient.

With it in the file system, you can then use async file access to limit how much memory you need to do your work.
http://msdn.microsoft.com/en-us/library/jj155757.aspx
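A minimal sketch of that approach with .NET 4.5 async/await; the path handling and buffer size are illustrative:

    using System.IO;
    using System.Threading.Tasks;

    static class AsyncBlockReader
    {
        public static async Task<byte[]> ReadBlockAsync(string path)
        {
            // useAsync: true requests true asynchronous I/O from the OS.
            using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read,
                                               FileShare.Read, 4096, useAsync: true))
            {
                var buffer = new byte[stream.Length];
                int read = 0;
                while (read < buffer.Length)
                {
                    // Await keeps the thread free while the disk works.
                    read += await stream.ReadAsync(buffer, read, buffer.Length - read);
                }
                return buffer;
            }
        }
    }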

In that case your data layer would actually be helper functions for the common actions you need to take on the file.

Other data you could store in a simple data store, since the capacity needs would be much smaller.  Unless you are using Mono to target other platforms you'll be limited to Windows anyway, so stick with the Windows stack.  If SQL Compact is not your choice, you might try RavenDB.
http://ravendb.net/
donator
Activity: 1218
Merit: 1079
Gerald Davis
July 17, 2013, 03:15:48 PM
#24
Got an out-of-memory exception at around block 216,057.  I had all three caches set to 100; however, the system has 16GB of memory, roughly 12GB free.  It looks like BitSharp.Client was using 958.6MB of memory at the time of the exception.

Quote
BlockDataCache: DataCache encountered fatal exception in StorageWorker: "Exception of type 'System.OutOfMemoryException' was thrown."

at System.IO.MemoryStream.set_Capacity(Int32 value)
   at System.IO.MemoryStream.EnsureCapacity(Int32 value)
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.BinaryWriter.Write(Byte[] buffer)
   at...

I was able to restart and it resumed without issue.  Strangely, it is running as a 32-bit process, so that may have had something to do with it.  I need to check the configuration settings, as it should be building as Any CPU.

Yeah, I've had a rough time running it in 32-bit as well. I do my best not to hold onto any objects for too long, but it seems that the GC struggles to keep up with the number of objects going in and out of memory. I'm not a GC expert at all though, so I can't really say for sure what exactly is going on. It seems that I'm stressing things with the way I'm using ImmutableHashSet.

The issue wasn't so much the 32-bit process running out of memory; it's that I can't figure out why it was loading as a 32-bit process at all.  The architecture was "AnyCPU".

The System.Data.SQLite assembly is "AnyCPU" (PE=PE32 & 32BITREQ=0) with no native code (ILONLY=1).

Quote
corflags System.Data.SQLite.dll
Microsoft (R) .NET Framework CorFlags Conversion Tool.  Version  4.0.30319.17929

Copyright (c) Microsoft Corporation.  All rights reserved.

Version   : v4.0.30319
CLR Header: 2.5
PE        : PE32
CorFlags  : 0x9
ILONLY    : 1
32BITREQ  : 0
32BITPREF : 0
Signed    : 1

For some reason System.Data.SQLite is referencing the x86 rather than the x64 version of SQLite.Interop.dll, forcing the entire process to run as 32-bit.  Forcing x64 as the architecture results in a "BadImageFormatException" on System.Data.SQLite, so once again, for some unknown reason, it is loading the assembly as 32-bit.  I'm not sure why, as the point of splitting System.Data.SQLite.dll (the managed AnyCPU wrapper) from SQLite.Interop.dll (the native DLL in both x86 and x64 versions) is to allow both x86 and x64 projects from the same reference, right?

Anyway, I bypassed the issue by just using the mixed-mode x64 assembly.
http://system.data.sqlite.org/index.html/doc/trunk/www/downloads.wiki

Still not sure exactly why it didn't work.
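For what it's worth, my understanding is that the split distribution expects the native DLLs to sit in architecture-named subfolders next to the application, which System.Data.SQLite probes at load time when its native-library pre-loading is enabled; roughly this layout (names illustrative):

    MyApp\bin\
        MyApp.exe
        System.Data.SQLite.dll       (managed, AnyCPU)
        x86\SQLite.Interop.dll       (native, 32-bit)
        x64\SQLite.Interop.dll       (native, 64-bit)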

Quote
I'd really like to have this optimized for being able to quickly access the data, even if it does end up requiring 64-bit mode to run well.
Agreed.  For my purposes x86 compatibility is a non-issue, and as time goes on the data will only get larger.  32-bit general-purpose clients make sense where the client is really only interested in the UTXO set, but for a general-purpose parser for building blockchain-related tools I see little value in trying to make x86 work.

Quote
I'd like it to be really simple to program against the node for things like querying data. At the moment I'm trying to figure out how I'm going to store historical transaction data in a manner that will allow for that kind of querying access. I want the end result to really be able to scale, though. I'd like to make this as high-throughput as I can... lots of work ahead. Smiley

Well, a nice start so far. On my system it synced the blockchain from the genesis block significantly faster than bitcoind did.
member
Activity: 72
Merit: 10
July 16, 2013, 07:18:30 PM
#23
Got an out-of-memory exception at around block 216,057.  I had all three caches set to 100; however, the system has 16GB of memory, roughly 12GB free.  It looks like BitSharp.Client was using 958.6MB of memory at the time of the exception.

Quote
BlockDataCache: DataCache encountered fatal exception in StorageWorker: "Exception of type 'System.OutOfMemoryException' was thrown."

at System.IO.MemoryStream.set_Capacity(Int32 value)
   at System.IO.MemoryStream.EnsureCapacity(Int32 value)
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.BinaryWriter.Write(Byte[] buffer)
   at...

I was able to restart and it resumed without issue.  Strangely, it is running as a 32-bit process, so that may have had something to do with it.  I need to check the configuration settings, as it should be building as Any CPU.

Yeah, I've had a rough time running it in 32-bit as well. I do my best not to hold onto any objects for too long, but it seems that the GC struggles to keep up with the number of objects going in and out of memory. I'm not a GC expert at all though, so I can't really say for sure what exactly is going on. It seems that I'm stressing things with the way I'm using ImmutableHashSet.

I'd really like to have this optimized for being able to quickly access the data, even if it does end up requiring 64-bit mode to run well. I'd like it to be really simple to program against the node for things like querying data. At the moment I'm trying to figure out how I'm going to store historical transaction data in a manner that will allow for that kind of querying access. I want the end result to really be able to scale, though. I'd like to make this as high-throughput as I can... lots of work ahead. Smiley
donator
Activity: 1218
Merit: 1079
Gerald Davis
July 16, 2013, 07:00:10 PM
#22
Got an out-of-memory exception at around block 216,057.  I had all three caches set to 100; however, the system has 16GB of memory, roughly 12GB free.  It looks like BitSharp.Client was using 958.6MB of memory at the time of the exception.

Quote
BlockDataCache: DataCache encountered fatal exception in StorageWorker: "Exception of type 'System.OutOfMemoryException' was thrown."

at System.IO.MemoryStream.set_Capacity(Int32 value)
   at System.IO.MemoryStream.EnsureCapacity(Int32 value)
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.BinaryWriter.Write(Byte[] buffer)
   at...

I was able to restart and it resumed without issue.  Strangely, it is running as a 32-bit process, so that may have had something to do with it.  I need to check the configuration settings, as it should be building as Any CPU.
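A side note on that stack trace: MemoryStream grows by doubling its internal buffer, and each doubling needs one contiguous allocation, so a fragmented 32-bit address space can throw OutOfMemoryException long before 2GB is actually in use. If the final size is known up front, pre-sizing the stream avoids the doubling entirely; a small illustrative sketch (blockSizeHint is a hypothetical parameter):

    using System.IO;

    static class BlockSerializer
    {
        public static byte[] Serialize(byte[] rawBlock, int blockSizeHint)
        {
            // Pre-sizing means the buffer never has to double-and-copy
            // into a new contiguous region mid-write.
            using (var stream = new MemoryStream(blockSizeHint))
            using (var writer = new BinaryWriter(stream))
            {
                writer.Write(rawBlock);
                return stream.ToArray();
            }
        }
    }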
member
Activity: 72
Merit: 10
July 15, 2013, 06:21:11 PM
#21
I finally got FireBird working. Here's one of the more interesting things I had to do to read a byte array from the database. Smiley

using System;
using System.Data.Common;
using System.Linq;

public static class DataReaderExtensions
{
    // Reads a binary value that FireBird hands back through a CHAR column.
    public static byte[] GetCharBytes(this DbDataReader reader, int i)
    {
        var value = reader.GetString(i);

        //TODO for the love of...
        // Some values round-trip through a GUID's 36-character string form;
        // recover the original 16 bytes.
        Guid guid;
        if (value.Length == 36 && Guid.TryParse(value, out guid))
        {
            return guid.ToByteArray();
        }
        else
        {
            //TODO make sure this won't actually mangle anything, see Guid above
            // Narrowing each char to a byte is only safe while the column
            // stays in a single-byte character set.
            return value.Select(x => (byte)x).ToArray();
        }
    }
}
member
Activity: 72
Merit: 10
July 15, 2013, 05:06:14 PM
#20
Oh, you'll absolutely be able to plug in a different SQL or whatever back-end. For unit tests I have a "back-end" that just stores everything in memory in a dictionary, for example. I just want a decent out-of-the-box solution.

I have FireBird close to working with binary columns; I'm running into a snag now where it's not accepting nulls. I have to say I'm pretty shocked that FireBird doesn't have proper binary column support.

SQL Express with the LocalDB option?  It's what I usually use for rapid prototyping.  SQL Server Compact Edition is another option.

It is possible (although a pain in the ass) to compile Google's LevelDB (the same DB used by bitcoind) as a Visual Studio project, so if you want something lighter-weight that is an option.



SQL Express was actually my first option; unfortunately, the size limit means I can't fit the full blockchain in it.

I'm actually super close to figuring out the semantics I need to use to get FireBird working. Even if I don't, I think I'll probably end up going with it anyway and just store all my numbers as strings, I guess.
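(For reference: Firebird's usual idiom for raw bytes is a CHAR/VARCHAR column declared with CHARACTER SET OCTETS, which bypasses character-set translation entirely. A rough sketch using the FirebirdSql.Data.FirebirdClient provider; the connection string and table are illustrative:)

    using FirebirdSql.Data.FirebirdClient;

    class OctetsExample
    {
        static void Main()
        {
            // Illustrative connection string; adjust server/paths/credentials.
            using (var conn = new FbConnection(
                "database=localhost:blocks.fdb;user=sysdba;password=masterkey"))
            {
                conn.Open();

                // OCTETS means the column stores raw bytes, no translation.
                using (var create = new FbCommand(
                    "CREATE TABLE tx_hashes (hash CHAR(32) CHARACTER SET OCTETS)", conn))
                {
                    create.ExecuteNonQuery();
                }

                using (var insert = new FbCommand(
                    "INSERT INTO tx_hashes (hash) VALUES (@hash)", conn))
                {
                    insert.Parameters.AddWithValue("@hash", new byte[32]);
                    insert.ExecuteNonQuery();
                }
            }
        }
    }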
donator
Activity: 1218
Merit: 1079
Gerald Davis
July 15, 2013, 05:03:02 PM
#19
Not sure if anyone else encountered this problem, but using git the solution file (and other files in the root directory, as well as four of the project files) were not cloned.  Not sure why; it may have been my setup, but downloading the zip from GitHub, extracting it, and making it a new repo worked.
member
Activity: 72
Merit: 10
July 15, 2013, 04:57:59 PM
#18
Oh, you'll absolutely be able to plug in a different SQL or whatever back-end. For unit tests I have a "back-end" that just stores everything in memory in a dictionary, for example. I just want a decent out-of-the-box solution.

I have FireBird close to working with binary columns; I'm running into a snag now where it's not accepting nulls. I have to say I'm pretty shocked that FireBird doesn't have proper binary column support.
Well, for the majority of C# developers, "plug in a different SQL or whatever back-end" means "change the connection string". I have yet to meet a C# developer who didn't already have MS-SQL or Oracle either installed on the same machine or immediately available by just giving the program the name of the server. In fact, one of the more common problems in working with C# developers is that they have so many different backends installed (or at least connectable with no prompting) that this becomes a source of confusion and seemingly irreproducible results.

We certainly have very different development backgrounds.

It's more of a personal principle. I like all development environments that I create to be runnable with zero configuration, or as close to that as humanly possible. Sensible defaults out of the box to get you started, and then you can tweak the environment from there.
legendary
Activity: 2128
Merit: 1073
July 15, 2013, 04:54:47 PM
#17
Oh, you'll absolutely be able to plug in a different SQL or whatever back-end. For unit tests I have a "back-end" that just stores everything in memory in a dictionary, for example. I just want a decent out-of-the-box solution.

I have FireBird close to working with binary columns; I'm running into a snag now where it's not accepting nulls. I have to say I'm pretty shocked that FireBird doesn't have proper binary column support.
Well, for the majority of C# developers, "plug in a different SQL or whatever back-end" means "change the connection string". I have yet to meet a C# developer who didn't already have MS-SQL or Oracle either installed on the same machine or immediately available by just giving the program the name of the server. In fact, one of the more common problems in working with C# developers is that they have so many different backends installed (or at least connectable with no prompting) that this becomes a source of confusion and seemingly irreproducible results.

We certainly have very different development backgrounds.
donator
Activity: 1218
Merit: 1079
Gerald Davis
July 15, 2013, 04:49:44 PM
#16
Oh, you'll absolutely be able to plug in a different SQL or whatever back-end. For unit tests I have a "back-end" that just stores everything in memory in a dictionary, for example. I just want a decent out-of-the-box solution.

I have FireBird close to working with binary columns; I'm running into a snag now where it's not accepting nulls. I have to say I'm pretty shocked that FireBird doesn't have proper binary column support.

SQL Express with the LocalDB option?  It's what I usually use for rapid prototyping.  SQL Server Compact Edition is another option.

It is possible (although a pain in the ass) to compile Google's LevelDB (the same DB used by bitcoind) as a Visual Studio project, so if you want something lighter-weight that is an option.

member
Activity: 72
Merit: 10
July 15, 2013, 04:43:51 PM
#15
Oh, you'll absolutely be able to plug in a different SQL or whatever back-end. For unit tests I have a "back-end" that just stores everything in memory in a dictionary, for example. I just want a decent out-of-the-box solution.

I have FireBird close to working with binary columns; I'm running into a snag now where it's not accepting nulls. I have to say I'm pretty shocked that FireBird doesn't have proper binary column support.
legendary
Activity: 2128
Merit: 1073
July 15, 2013, 04:40:11 PM
#14
For development purposes, I want the solution to be ready to run as-is with no installation required.
Thanks for the concise and honest answer.

In my experience, people and organizations who choose to develop in C# do so primarily because of the tremendous choice of data-storage backends. By excluding one of the strongest features of C# you'll also exclude the majority of the C# developers who would otherwise be very interested in your project.

Hi, are there any other developers out there interested in or working on a Bitcoin C# node? I've been working on one for a couple of months and I think I'm starting to see some promising results. I'd love to connect with anyone else interested in this work.
member
Activity: 72
Merit: 10
July 15, 2013, 04:20:31 PM
#13
For development purposes, I want the solution to be ready to run as-is with no installation required.
legendary
Activity: 2128
Merit: 1073
July 15, 2013, 04:17:41 PM
#12
Anyone have any recommendations on a good embedded database I can use from C#? SQLite is turning out to be awful for concurrent writes, and Firebird does not seem to support binary data properly.
I'm curious why you insist on an embedded database. Why not use one of the many database abstraction layers available on Windows, or simply C# LINQ?
member
Activity: 72
Merit: 10
July 15, 2013, 02:46:46 PM
#11
Anyone have any recommendations on a good embedded database I can use from C#? SQLite is turning out to be awful for concurrent writes, and Firebird does not seem to support binary data properly.
member
Activity: 72
Merit: 10
July 12, 2013, 07:45:03 PM
#10
I'm using SQLite for the database and BouncyCastle for the ECDSA verification; for SHA-256 and RIPEMD-160 I'm using the built-in .NET libraries.

The ECDSA verification is awful right now: it takes a tenth of a second just to verify a single signature, so I have verification disabled for the moment because of the speed. I haven't had any luck with OpenSSL yet to try it out.
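For reference, a minimal BouncyCastle secp256k1 verification sketch (illustrative only, not the project's actual code); it takes the message hash, an encoded public key, and the signature's r and s values:

    using Org.BouncyCastle.Asn1.Sec;
    using Org.BouncyCastle.Crypto.Parameters;
    using Org.BouncyCastle.Crypto.Signers;
    using Org.BouncyCastle.Math;

    static class SigCheck
    {
        public static bool Verify(byte[] hash, byte[] encodedPubKey, BigInteger r, BigInteger s)
        {
            // Bitcoin uses the secp256k1 curve.
            var curve = SecNamedCurves.GetByName("secp256k1");
            var domain = new ECDomainParameters(curve.Curve, curve.G, curve.N, curve.H);

            // Decode the (compressed or uncompressed) public key point.
            var point = curve.Curve.DecodePoint(encodedPubKey);

            var signer = new ECDsaSigner();
            signer.Init(false, new ECPublicKeyParameters(point, domain));
            return signer.VerifySignature(hash, r, s);
        }
    }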

I don't have much in the way of overall code documentation yet; I'll be working on that. Everything is put together in broad strokes right now so that I have something usable to work with. There is a ton of design and clean-up work to do. Smiley

Anyway, thanks for checking it out!
donator
Activity: 1218
Merit: 1079
Gerald Davis
July 12, 2013, 07:33:19 PM
#9
Very nice. I will have to grab a copy and play around with it this weekend.  I know the answer is probably "read the code", but what are you using for a datastore/database, and what libraries are you using for the crypto functions (native framework? OpenSSL?)?
member
Activity: 72
Merit: 10
July 12, 2013, 07:25:45 PM
#8
Here's an initial screenshot; this UI is just there to support the development work. I've included a simple block navigator to try out some very basic data visualization. I'm only showing the UTXO changes on screen, but at each step along the way I have the entire UTXO set calculated and immediately available for hand-off. It's able to navigate through close to a hundred blocks per second at today's UTXO size on my i5 hardware.
