
Topic: [ANN][BURST] Burst | Efficient HDD Mining | New 1.2.3 Fork block 92000 - page 940. (Read 2170895 times)

sr. member
Activity: 350
Merit: 250
It uses the current baseTarget for the calculations, so it's not 100% accurate.




So with 24 TB, how much can you earn in 24 hours?

Roughly $50/day at current rates.
sr. member
Activity: 280
Merit: 250
I am thinking about a future where PB farms are common. Let's say it's 1000 PB; would internet speed have to increase?

You're still fine with 1 Mbit. Maybe even less.
member
Activity: 64
Merit: 10
Yes you're mining

Thank you.

It's all kind of hard to understand for a newcomer, so here are some questions. I know they were probably answered before, but in a 400-page topic it's really hard to find anything...

How can I know how much disk space the miner is using?
I saw other people posting screenshots of files generated on their hard disks. Are those generated once a block is found? Or should I be getting them as well?
hero member
Activity: 1036
Merit: 531
I'm not quite sure whether I set the miner up correctly... I'm getting

Code:
C:\Users\xxxxx\Desktop\pocminer_v1>C:\Windows\System32\java.exe -Xmx750m -cp pocminer.jar;lib/*;lib/akka/*;lib/jetty/* pocminer.POCMiner mine http://127.0.0.1:8125
Added key:xxxxxxx -> 1582xxxxxxxxx
[default-akka.actor.default-dispatcher-2] INFO org.eclipse.jetty.util.log - Logging initialized @2593ms
{"baseTarget":"7005503","height":"10003","generationSignature":"8c1c89770ae0b36ac43555dae7a86bd0e2c7e033272af8f24f79da4b45ec0e8b"}
{"baseTarget":"7005503","height":"10003","generationSignature":"8c1c89770ae0b36ac43555dae7a86bd0e2c7e033272af8f24f79da4b45ec0e8b"}

Could someone please tell me if it's mining all right?

Yes you're mining
member
Activity: 64
Merit: 10
I'm not quite sure whether I set the miner up correctly... I'm getting

Code:
C:\Users\xxxxx\Desktop\pocminer_v1>C:\Windows\System32\java.exe -Xmx750m -cp pocminer.jar;lib/*;lib/akka/*;lib/jetty/* pocminer.POCMiner mine http://127.0.0.1:8125
Added key:xxxxxxx -> 1582xxxxxxxxx
[default-akka.actor.default-dispatcher-2] INFO org.eclipse.jetty.util.log - Logging initialized @2593ms
{"baseTarget":"7005503","height":"10003","generationSignature":"8c1c89770ae0b36ac43555dae7a86bd0e2c7e033272af8f24f79da4b45ec0e8b"}
{"baseTarget":"7005503","height":"10003","generationSignature":"8c1c89770ae0b36ac43555dae7a86bd0e2c7e033272af8f24f79da4b45ec0e8b"}

Could someone please tell me if it's mining all right?
hero member
Activity: 1036
Merit: 531
https://bchain.info/BURST/tools/calculator

Is this calculator reliable?

It uses the current baseTarget for the calculations, so it's not 100% accurate.

Quote
Normally not overlapping
Not broken plots

Can a bad connection cause a higher deadline too?

No, a bad connection will delay new blocks reaching your node (if you mine solo) and reduce your chances, but it has no effect on the deadline.

How high are your deadlines? How many GB?

9 TB (4 + 4 + 1). My deadlines vary a lot: sometimes 6 digits, sometimes 4, and sometimes more (I'm on a pool).
hero member
Activity: 868
Merit: 1000
It uses the current baseTarget for the calculations, so it's not 100% accurate.




So with 24 TB, how much can you earn in 24 hours?

You need 24-48 days to plot those first. Good luck, lol.

What if he "copies and pastes"? That is where the magic happens.
sr. member
Activity: 355
Merit: 250
Is it possible to set up a p2pool to help decentralize this coin?
hero member
Activity: 868
Merit: 1000
No, a bad connection will delay new blocks reaching your node (if you mine solo) and reduce your chances, but it has no effect on the deadline.

How high are your deadlines? How many GB?

Does the internet connection matter with, for example, 100 TB?

You do not need a faster internet connection with more TB. 1 Mbit is plenty.

With 100 TB your deadlines should be around 4,000 seconds, with only a few higher.



I am thinking about a future where PB farms are common. Let's say it's 1000 PB; would internet speed have to increase?

Bitcoin holds the record for the highest computing power on Earth.

I wonder if Burst will hold the record for the most disk space.

Imagine if, in the future, Burst could make use of those plots to store data.

It could possibly store data for the entire world.
legendary
Activity: 3248
Merit: 1070
It uses the current baseTarget for the calculations, so it's not 100% accurate.




So with 24 TB, how much can you earn in 24 hours?

You need 24-48 days to plot those first. Good luck, lol.

Also, a 3 TB drive is about 2,720 GB after formatting, so you end up with about 21,760 GB, not 24 TB.
sr. member
Activity: 280
Merit: 250
No, a bad connection will delay new blocks reaching your node (if you mine solo) and reduce your chances, but it has no effect on the deadline.

How high are your deadlines? How many GB?

Does the internet connection matter with, for example, 100 TB?

You do not need a faster internet connection with more TB. 1 Mbit is plenty.

With 100 TB your deadlines should be around 4,000 seconds, with only a few higher.

Block 10000!!
legendary
Activity: 1059
Merit: 1000
hero member
Activity: 868
Merit: 1000
No, a bad connection will delay new blocks reaching your node (if you mine solo) and reduce your chances, but it has no effect on the deadline.

How high are your deadlines? How many GB?

Does the internet connection matter with, for example, 100 TB? Or even 1000 TB? I'm not sure whether internet usage increases with space.
sr. member
Activity: 280
Merit: 250
So what can I do with 24 TB per day?

~3 Blocks/day. Happy mining!
legendary
Activity: 1059
Merit: 1000
It uses the current baseTarget for the calculations, so it's not 100% accurate.




So with 24 TB, how much can you earn in 24 hours?
sr. member
Activity: 280
Merit: 250
https://bchain.info/BURST/tools/calculator

Is this calculator reliable?

It uses the current baseTarget for the calculations, so it's not 100% accurate.

Quote
Normally not overlapping
Not broken plots

Can a bad connection cause a higher deadline too?

No, a bad connection will delay new blocks reaching your node (if you mine solo) and reduce your chances, but it has no effect on the deadline.

How high are your deadlines? How many GB?
hero member
Activity: 1036
Merit: 531
But the merge process (merge by dcct) takes much, much longer.

You started with a low stagger size of 1024. Merge has to read lots and lots of small chunks to do the merging, which takes a long time. Try to start with a higher stagger size if possible.

Thanks for your answer, dcct.

Normally not overlapping
Not broken plots

Can a bad connection cause a higher deadline too?
sr. member
Activity: 458
Merit: 250
beast at work
Hey guys, is there still that program that calculates your plot size before starting them?


number of nonces * 256 / 1,000,000 = size in GB
PlotSize (GB) = NumberOfNonces / 4096
NumberOfNonces = PlotSize (GB) * 4096

Your formula gives a rough estimate; the one I posted is more... exact (file size on disk).

1**********************9_60000000_40960_40960 - 10 GB (your version) - 10.485 GB (mine)
1 GB is 1024*1024 kB. Your formula is wrong. Just check any plotfile.

Are you sure?


Of course I'm sure. 10,485,760 kB = 10,240 MB = 10 GB. It's basic stuff, bro!

What are you talking about, bro? You realize that file size is not the same as size on disk, and it makes a HUGE difference when you plan your plot file, right, bro?

Let me give you a hint... 100 GB on paper equals about 107 GB on disk, so I'd strongly advise calculating file size with the formula I wrote above, because it's not theoretical.
newbie
Activity: 16
Merit: 0
I'd like to share a little tool I wrote to increase the stagger size of already finished plots. No more 8191 limit!

It's simple C code, Linux only. It's short, so check it and compile it yourself.

https://bchain.info/merge.c

It does not hold everything in memory but seeks on disk to get the scoops.

Version for Windows: https://www.dropbox.com/s/g4xs48y0505lu0u/merge.exe?dl=0
Has anyone tested it? Is it working fine?

It's working as expected.

Any UAT ?

There is a memory limit of 2000 MB per process on Windows. For the maximum stagger size, merge.exe should be used.
legendary
Activity: 1059
Merit: 1000