Topic: [ANN][Datacoin] Datacoin blockchain start announcement (Minor code upd + logo) - page 33

newbie
Activity: 7
Merit: 0
Seriously? Is anybody in the so-called "datacoin community" interested in Datacoin development, or is everybody floating in the comfort zone of predicting the future of cryptocurrencies and philosophy in general...? Or maybe nobody has seen my previous posts? So let me rephrase:
A new Datacoin High Performance client, incorporating the security, stability, and performance improvements of Primecoin-HP12, is available for download and testing at http://github.com/g1g0/datacoin-hp !!!
Should I f*ckin' pay somebody for compiling and running the code? YES - 50 DTC will go to the first person who posts on this sluggish thread his/her SERIOUS impressions of running the Datacoin port of HP12 ...
Patience, I do see your post and will test it tomorrow on my local network. I appreciate your effort, mate.
newbie
Activity: 30
Merit: 0
Seriously? Is anybody in the so-called "datacoin community" interested in Datacoin development, or is everybody floating in the comfort zone of predicting the future of cryptocurrencies and philosophy in general...? Or maybe nobody has seen my previous posts? So let me rephrase:
A new Datacoin High Performance client, incorporating the security, stability, and performance improvements of Primecoin-HP12, is available for download and testing at http://github.com/g1g0/datacoin-hp !!!
Should I f*ckin' pay somebody for compiling and running the code? YES - 50 DTC will go to the first person who posts on this sluggish thread his/her SERIOUS impressions of running the Datacoin port of HP12 ...
legendary
Activity: 2646
Merit: 2793
Shitcoin Minimalist
newbie
Activity: 30
Merit: 0
When it comes to mining, the hp12-based datacoind seems to outperform the old one by ... 4% (based on 60 "chainsperday" samples taken every minute - is this a reliable measure?).
Well - better than nothing, I guess  Roll Eyes
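For anyone who wants to reproduce that comparison, a rough sampling sketch in Python (it assumes datacoind answers getmininginfo on the command line and prints a JSON object with a "chainsperday" field, as the numbers above suggest):

import json, statistics, subprocess, time

SAMPLES = 60        # one sample per minute, as in the comparison above
INTERVAL = 60       # seconds between samples

readings = []
for _ in range(SAMPLES):
    # Assumes datacoind behaves like its bitcoind/primecoind ancestors and
    # prints the getmininginfo result as JSON on stdout.
    out = subprocess.run(["datacoind", "getmininginfo"],
                         capture_output=True, text=True, check=True).stdout
    readings.append(float(json.loads(out)["chainsperday"]))
    time.sleep(INTERVAL)

print("mean chains/day:", statistics.mean(readings))
print("stdev          :", statistics.pstdev(readings))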

g
newbie
Activity: 30
Merit: 0
Hi!

Any chance that someone has time to work on the core code? Smiley
Not much work to do on the core. If someone has some free time, the HP12 optimizations could be ported to the hp branch of the client.
This would provide some performance improvement over the previous client.


I ported the hp12 changes to datacoin-hp. I diffed primecoin-0.1.2-hp11 against primecoin-0.1.2-hp12 and primecoin-0.1.2-hp11 against datacoin-hp. Based on these two diffs I selected the files that could be patched automatically; the remaining part I patched manually. I compiled and ran it on Linux x86_64 only. Please test it ON A NEW WALLET (or back up first) and let me know how it works.
https://github.com/g1g0/datacoin-hp.git
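For context, a minimal sketch of that kind of selection step in Python (an illustration of the idea only, not the actual procedure used for datacoin-hp; the directory names are taken from the description above):

import filecmp, os

def changed_files(base, other):
    """Relative paths whose contents differ (or exist on only one side) between two trees."""
    diffs = set()
    def walk(cmp, prefix=""):
        for name in cmp.diff_files + cmp.left_only + cmp.right_only:
            diffs.add(os.path.join(prefix, name))
        for name, sub in cmp.subdirs.items():
            walk(sub, os.path.join(prefix, name))
    walk(filecmp.dircmp(base, other))
    return diffs

hp11, hp12, dtc = "primecoin-0.1.2-hp11", "primecoin-0.1.2-hp12", "datacoin-hp"
upstream = changed_files(hp11, hp12)   # what hp12 changed upstream
local    = changed_files(hp11, dtc)    # what Datacoin already changed

print("auto-patchable (only hp12 touched them):", sorted(upstream - local))
print("manual merge (both sides touched them): ", sorted(upstream & local))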
Donations welcome Smiley

DTC: DQWeXamzqqtinrFaiZmbFwoa6Si2uZiBTQ
XPM: AcDgPBpNPMgYKaQYpQEP2njCMLbDBQiahh
BTC: 1PzBiHQEDZ2awULgmhGMSP4RNQ7q2oKEcx
sr. member
Activity: 246
Merit: 250
The future of this coin:

-A system that is in metastable equilibrium and that is controlled by an operation based on data relating to the condition of its metastable equilibrium is prone to failure due to metastability in its electronics (FIBER OPTICS).

-Datacoin is the beginning and dawn of technology that can be used to limit failure in such a system via asynchronous circuits.

For those interested in this conceptual application of datacoin, further explanation:

-Within an isolated system in thermal equilibrium maintained in a fiber optic environment, glass can prevent processes or parts from reaching the same equilibrium. The temperature or the spatial distribution of temperature can be changed by changing the state of the materials within the system (hydrogen).

-Left for a period of time as an adiabatically isolated vessel with rigid walls, the system containing the thermally heterogeneous distribution of hydrogen, under the influence of a steady gravitational field along its tall dimension, will settle to a state of uniform temperature. What's important to understand here are thermodynamic equilibrium concepts. From thermodynamic equilibrium to non-equilibrium, phase changes increase the energy levels of the system, making it a system in metastable equilibrium. ITS OPERATION FOUNDED ON DATA. (reliability of data...)

-In such a system, hydrogen would be heated within a separate environment contained using fiber optic technology principles (or it could be mixed with a heated material that it doesn't react adversely with, whatever is best) and then injected into the system through its harness. The harness would be a delivery system as well as a physical anchor for the system. The harness would thus connect any other object to the system, by connecting to the habitable point of the star.

-The heat increase from the hydrogen mix raises the temperature in the system until thermal equilibrium is achieved, based on the desired and programmed outcome (datacoin)  Smiley

If this has interested you, see the relationship of Radiative Equilibrium and Dynamic Equilibrium with a star.
full member
Activity: 214
Merit: 100
Looking to buy 50,000 Datacoin off exchange. PM me if interested.
Will use escrow
hero member
Activity: 637
Merit: 500
I think 'prime' proof-of-work is interesting, but that it will ultimately result in an economic failure for data storage.

I'd like to propose a gradual (but requiring a hard fork) upgrade to 'proof of data', which would just be to find a SHA-256 hash of the required difficulty *over the entire blockchain*. This basically means your hash rate is proportional to how fast you can read the blockchain off disk (or you keep it in memory).
This approach to Proof of Storage (or Proof of Data) could be interesting; I am not sure I get your idea. Do you mean hashing data chunks until you find a hash with the target difficulty?

What I wonder is whether we're thinking of the same thing, and whether by the time I get done optimizing proof-of-data it ends up being very similar to your concept.. Or is there some paper on proof-of-resource you can link me to?
Also wondering what @super3 means by "new proof of resource algorithms"; I've never heard this term before.

About doing a new coin or hard-forking DTC: first of all, consider the fact that the original Datacoin developer _seems_ to have disappeared, so whoever proposes a hard fork (and the community accepts it, which I am sure it will) would also be in charge of maintaining the original code base (this could be a pain). Also, there are some issues with Datacoin when blocks are full of data (it should be tested more thoroughly).

One drawback of creating a new coin is that most of them have low credibility, and getting exposure is harder every day. Also, with a new coin you would lose the current Datacoin community; although small, the community is focused on the new features.

What I would do is:
- Create a new fork of a well maintained chain (Bitcoin, Litecoin, ...).
- Add new features.
- Propose that Datacoin holders invest in the new coin through a Proof-of-Burn (essentially you destroy Datacoins and get NewCoins); this way you give some value to Datacoin and at the same time you get the part of the community interested in your coin. After the PoB, the NewCoins are distributed among the addresses that burned DTC, with a premine for the total of DTC burned (or whatever ratio is fair) which is distributed to the new holders (see the sketch after this post).
- Set up some marketing strategy, services, pools, whatever ...
- Launch the newcoin.

Anyway, the most important part is the new features, and that's where the details are missing at the moment. Proof of Storage is definitely possible, but I have yet to see a working proof of concept, super3?
BTW, I would love to get involved in this in some way.

My 2 mBTC

EDIT: I assumed everybody knows what Proof-of-Burn is, which may not be true. You send DTC to an unspendable address (a valid address with an unknown private key); see the Counterparty thread here at BTCtalk for an explanation. I believe this is one of the fairest distribution systems. Also, Datacoin would recover since it has an infinite supply.
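To make the Proof-of-Burn distribution above concrete, a minimal Python sketch (the 1:1 burn-to-premine ratio and the address strings are assumptions for illustration, not part of any actual client):

def allocate_newcoins(burns, ratio=1.0):
    """burns: mapping of address -> total DTC provably sent to the unspendable burn address.
    Returns the NewCoin premine allocation per address at the chosen ratio."""
    return {addr: amount * ratio for addr, amount in burns.items()}

# Illustrative only: three addresses that burned DTC.
burns = {"DBurnAddr1...": 1500.0, "DBurnAddr2...": 250.0, "DBurnAddr3...": 42.5}
allocation = allocate_newcoins(burns)
print(allocation)
print("premine total =", sum(allocation.values()))   # equals the DTC destroyed at ratio 1.0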
sr. member
Activity: 350
Merit: 250
DTC unofficial team
Yea.. unfortunately the dev hasn't shown up, so that is the cause of the price dwindling... If someone with good tech skills can help the dev or possibly take over until he is back, that'd be great.

I think 'prime' proof-of-work is interesting, but that it will ultimately result in an economic failure for data storage.

I'd like to propose a gradual (but requiring a hard fork) upgrade to 'proof of data', which would just be to find a SHA-256 hash of the required difficulty *over the entire blockchain*. This basically means your hash rate is proportional to how fast you can read the blockchain off disk (or you keep it in memory).

It might have to be implemented as something that uses a pseudo-random number generator, so that each block only requires a deterministic but pseudorandom sequence of blocks (a subset of the blockchain), so that you cover the entire blockchain over a few hundred blocks.

Thoughts anyone? Good idea? Bad idea? Something to do first in a new coin?
No, I would implement the proof of resource algorithms that are coming out this year. There are way more efficient ways to do this.
I have a concept for proof of storage, but I haven't had time to write a whitepaper for it.

What I wonder is whether we're thinking of the same thing, and whether by the time I get done optimizing proof-of-data it ends up being very similar to your concept.. Or is there some paper on proof-of-resource you can link me to?

The other question still stands.. Once we have it, do we do a new coin, or hard-fork upgrades to Datacoin?

We should have some kind of hard fork. I hope it will be implemented as the next stage of Datacoin's evolution.
sr. member
Activity: 271
Merit: 254
Yea.. unfortunately the dev hasn't shown up, so that is the cause of the price dwindling... If someone with good tech skills can help the dev or possibly take over until he is back, that'd be great.

I think 'prime' proof-of-work is interesting, but that it will ultimately result in an economic failure for data storage.

I'd like to propose a gradual (but requiring a hard fork) upgrade to 'proof of data', which would just be to find a SHA-256 hash of the required difficulty *over the entire blockchain*. This basically means your hash rate is proportional to how fast you can read the blockchain off disk (or you keep it in memory).

It might have to be implemented as something that uses a pseudo-random number generator, so that each block only requires a deterministic but pseudorandom sequence of blocks (a subset of the blockchain), so that you cover the entire blockchain over a few hundred blocks.

Thoughts anyone? Good idea? Bad idea? Something to do first in a new coin?
No, I would implement the proof of resource algorithms that are coming out this year. There are way more efficient ways to do this.
I have a concept for proof of storage, but I haven't had time to write a whitepaper for it.

What I wonder is whether we're thinking of the same thing, and whether by the time I get done optimizing proof-of-data it ends up being very similar to your concept.. Or is there some paper on proof-of-resource you can link me to?

The other question still stands.. Once we have it, do we do a new coin, or hard-fork upgrades to Datacoin?
legendary
Activity: 1094
Merit: 1006
Yea.. unfortunately the dev hasn't shown up, so that is the cause of the price dwindling... If someone with good tech skills can help the dev or possibly take over until he is back, that'd be great.

I think 'prime' proof-of-work is interesting, but that it will ultimately result in an economic failure for data storage.

I'd like to propose a gradual (but requiring a hard fork) upgrade to 'proof of data', which would just be to find a SHA-256 hash of the required difficulty *over the entire blockchain*. This basically means your hash rate is proportional to how fast you can read the blockchain off disk (or you keep it in memory).

It might have to be implemented as something that uses a pseudo-random number generator, so that each block only requires a deterministic but pseudorandom sequence of blocks (a subset of the blockchain), so that you cover the entire blockchain over a few hundred blocks.

Thoughts anyone? Good idea? Bad idea? Something to do first in a new coin?
No, I would implement the proof of resource algorithms that are coming out this year. There are way more efficient ways to do this.
I have a concept for proof of storage, but I haven't had time to write a whitepaper for it.
sr. member
Activity: 350
Merit: 250
DTC unofficial team
Yea.. unfortunately the dev hasn't shown up, so that is the cause of the price dwindling... If someone with good tech skills can help the dev or possibly take over until he is back, that'd be great.

I think 'prime' proof-of-work is interesting, but that it will ultimately result in an economic failure for data storage.

I'd like to propose a gradual (but requiring a hard fork) upgrade to 'proof of data', which would just be to find a SHA-256 hash of the required difficulty *over the entire blockchain*. This basically means your hash rate is proportional to how fast you can read the blockchain off disk (or you keep it in memory).

It might have to be implemented as something that uses a pseudo-random number generator, so that each block only requires a deterministic but pseudorandom sequence of blocks (a subset of the blockchain), so that you cover the entire blockchain over a few hundred blocks.

Thoughts anyone? Good idea? Bad idea? Something to do first in a new coin?

It seems to me that SSD drives will win the game, and there will be only a few 'data rates' (like hash rates): regular HDDs, fast HDDs, RAIDs, SSDs, SSD clusters.
btw - check your PM
sr. member
Activity: 271
Merit: 254
Yea.. unfortunately the dev hasn't shown up, so that is the cause of the price dwindling... If someone with good tech skills can help the dev or possibly take over until he is back, that'd be great.

I think 'prime' proof-of-work is interesting, but that it will ultimately result in an economic failure for data storage.

I'd like to propose a gradual (but requiring a hard fork) upgrade to 'proof of data', which would just be to find a SHA-256 hash of the required difficulty *over the entire blockchain*. This basically means your hash rate is proportional to how fast you can read the blockchain off disk (or you keep it in memory).

It might have to be implemented as something that uses a pseudo-random number generator, so that each block only requires a deterministic but pseudorandom sequence of blocks (a subset of the blockchain), so that you cover the entire blockchain over a few hundred blocks.

Thoughts anyone? Good idea? Bad idea? Something to do first in a new coin?
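For illustration, a minimal Python sketch of that scheme: seed a PRNG from the previous block hash, select a deterministic but pseudorandom subset of stored blocks, and grind a nonce until the SHA-256 over that subset meets the target (the function names, the 64-block sample size, and the target check are assumptions, not a spec):

import hashlib, random

def sample_indices(prev_hash, chain_length, k=64):
    """Deterministic but pseudorandom subset of block heights - identical for every miner."""
    rng = random.Random(prev_hash)      # seeding with the hash makes the pick reproducible
    return sorted(rng.sample(range(chain_length), k))

def proof_of_data(prev_hash, blocks, target, max_tries=1_000_000):
    """Search for a nonce such that sha256(prev_hash || sampled blocks || nonce) < target."""
    subset = sample_indices(prev_hash, len(blocks), k=min(64, len(blocks)))
    base = hashlib.sha256()
    base.update(prev_hash)
    for i in subset:                    # reading these blocks off disk is the real cost
        base.update(blocks[i])
    for nonce in range(max_tries):
        h = base.copy()
        h.update(nonce.to_bytes(8, "little"))
        digest = h.digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None                         # no proof found within max_tries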
sr. member
Activity: 271
Merit: 254
I'm actually using Datacoin right now to prototype my decentralized storage engine. I'm going to be creating some sort of fork of Datacoin for that. What do you think would be the best way to integrate the existing Datacoin community?
Great! Of course you can fork the original repo and create your own version of the Datacoin core and/or the Perl explorer. It will be the best way of sharing ideas with the community, I think. Also, maybe you should write some updates on the forum.

I'm going to be working on putting Datacoin in the https://bitbucket.org/dahozer/catcoin catbox, and once it all works, you should be able to build a CatcoinRelease blockchain-compatible client, OR build a Datacoin blockchain-compatible client from the same source code repository.

I also just registered #datacoin-dev on irc.freenode.net, so if you are interested in developing Datacoin, as well as Datacoin derivatives, I will probably be a lot easier to find on IRC than on the forums.
legendary
Activity: 1094
Merit: 1006
Is anyone willing to sell me a large amount of Datacoin at market?
sr. member
Activity: 308
Merit: 250
Riecoin and Huntercoin to rule all!
Registration for DatacoinFoundation.org is now complete. A website redesign is on its way for Datacoin.info. More news on the way.
legendary
Activity: 1094
Merit: 1006

I'm actually using Datacoin right now to prototype my decentralized storage engine. I'm going to be creating some sort of fork of Datacoin for that. What do you think would be the best way to integrate the existing Datacoin community?

I guess the latest data in the most recent blocks were some of your tests  Roll Eyes

When you say fork, do you mean a fork that will use the same blockchain or a different blockchain?
Are you going to use online services for storage? I saw several transactions with data like the following lately:


[{"filehash": "8b4bf712241860b10c4b5307ccb73f5680c3c04ae1f06a5212508ce93f0a4efb", "filename": "8b4bf71_Bitcoin_Series_24__The_Mega-Master_Blockchain_List_Ledra_Capital.pdf", "version": "0.2", "filesize": 515251, "datetime": "1394679316", "uploads": [{"url": "http://rghost.net/53020838", "host_name": "rghost"}, {"url": "http://multiupload.nl/WDCAL8C9AM", "host_name": "multiupload"}, {"url": "http://gfile.ru/a4fjp", "host_name": "gfile_ru"}]}]


What about magnet links or other P2P references? This would make the data more resilient.

This is starting to get interesting.
Good luck!
Ha ha. I forgot you can't hide from the blockchain. Yes, that is some metadata from my decentralized storage engine web nodes.
Yes, that information can be added as well, and will be in the future.
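As an aside, a minimal Python sketch of how a client might check a downloaded file against a metadata record like the one quoted above (the field names come from that example; treating the 64-character filehash as a SHA-256 digest, and the routine itself, are assumptions):

import hashlib, json

def verify_record(record_json, local_path):
    """Compare a local copy of the file against the filehash/filesize in the record."""
    record = json.loads(record_json)[0]          # the example wraps the record in a list
    data = open(local_path, "rb").read()
    ok_size = len(data) == int(record["filesize"])
    ok_hash = hashlib.sha256(data).hexdigest() == record["filehash"]
    return ok_size and ok_hash

# Usage (illustrative): record_json holds the blockchain data shown above,
# local_path points at the file fetched from one of the listed mirrors.
# print(verify_record(record_json, "downloaded.pdf"))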
sr. member
Activity: 308
Merit: 250
Riecoin and Huntercoin to rule all!
I'm actually using Datacoin right now to prototype my decentralized storage engine. I'm going to be creating some sort of fork of Datacoin for that. What do you think would be the best way to integrate the existing Datacoin community?

Excellent news! More datacoin news to come in less than a month!
hero member
Activity: 637
Merit: 500

I'm actually using Datacoin right now to prototype my decentralized storage engine. I'm going to be creating some sort of fork of Datacoin for that. What do you think would be the best way to integrate the existing Datacoin community?

I guess the latest data in the most recent blocks were some of your tests  Roll Eyes

When you say fork, do you mean a fork that will use the same blockchain or a different blockchain?
Are you going to use online services for storage? I saw several transactions with data like the following lately:


[{"filehash": "8b4bf712241860b10c4b5307ccb73f5680c3c04ae1f06a5212508ce93f0a4efb", "filename": "8b4bf71_Bitcoin_Series_24__The_Mega-Master_Blockchain_List_Ledra_Capital.pdf", "version": "0.2", "filesize": 515251, "datetime": "1394679316", "uploads": [{"url": "http://rghost.net/53020838", "host_name": "rghost"}, {"url": "http://multiupload.nl/WDCAL8C9AM", "host_name": "multiupload"}, {"url": "http://gfile.ru/a4fjp", "host_name": "gfile_ru"}]}]


What about magnet links or other P2P references? This would make the data more resilient.

This is starting to get interesting.
Good luck!
sr. member
Activity: 350
Merit: 250
DTC unofficial team
I'm actually using Datacoin right now to prototype my decentralized storage engine. I'm going to be creating some sort of fork of Datacoin for that. What do you think would be the best way to integrate the existing Datacoin community?
Great! Of course you can fork the original repo and create your own version of the Datacoin core and/or the Perl explorer. It will be the best way of sharing ideas with the community, I think. Also, maybe you should write some updates on the forum.