
Topic: List of all Bitcoin addresses ever used - currently UNavailable on temp location - page 6. (Read 3605 times)

newbie
Activity: 6
Merit: 15
First of all, great project!



(...)
Quote
The longer the list, the longer it will take to sort one additional line.
At some point a database might beat raw text sorting, but for now I'm good with this :)
Using a database will not solve this problem. There are some things a DB can do to make sorting go from O(n²) to O(n²/n), but this is still exponential growth.

You make the argument that your input size is sufficiently small such that having exponential complexity is okay, and you may have a point.
Going with these two versions:
(...)
Since I got no response to my question above, I'll go with 2 versions:
  • All addresses ever used, without duplicates, in order of first appearance.
  • All addresses ever used, without duplicates, sorted.
The first file feels nostalgic; the second will be very convenient for matching addresses against a list of your own.

I don't see how sorting would be exponential for any of these lists.

All addresses ever used, without duplicates, sorted.
  • We already have a list with all the addresses ever used sorted by address (length n).
  • We have a list of (potentially) new addresses (length k).
  • We sort the list of new items in O(k log k).
  • We check for duplicates in the new addresses in O(k).
  • We then read the big list line by line while simultaneously running through the list of new addresses and comparing the values in O(n + k). In this case we can directly write the new file to disk line by line; only the list of new addresses is kept in memory.

Resulting in O(n + k log k + 2k). In this particular case one might even argue that n > k log k + 2k, and therefore O(2n) = O(n). However, it's late here and I don't like to argue.

You only need enough RAM to hold the new addresses, and enough disk space to keep both the new and old version on disk at the same time.

The 'All addresses ever used, without duplicates, in order of first appearance' list could be created in pretty much the same way.

I'll see if I can whip some code together.
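Roughly what I have in mind (a rough sketch with GNU coreutils; file names are placeholders, and it assumes the master list is byte-sorted, i.e. LC_ALL=C):
Code:
# 1. Sort and dedupe the day's batch: O(k log k).
LC_ALL=C sort -u new_addresses.txt > new_sorted.txt

# 2. Extract the genuinely new addresses (not yet in the sorted master): O(n + k).
LC_ALL=C comm -13 addresses_sorted.txt new_sorted.txt > truly_new.txt

# 3. Append them to the first-appearance list, in the order they appeared in
#    the batch; only the batch is held in memory.
awk 'NR==FNR {new[$0]; next} ($0 in new) && !seen[$0]++' truly_new.txt new_addresses.txt \
  >> addresses_in_order_of_first_appearance.txt

# 4. Streaming merge into the sorted master: O(n + k), written straight to disk.
LC_ALL=C sort -m -u addresses_sorted.txt new_sorted.txt > addresses_sorted.tmp
mv addresses_sorted.tmp addresses_sorted.txt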


File hosting
Have you considered releasing the big files as torrents with a webseed? This would allow downloaders to still download from your server and then (hopefully) continue to seed for a while, taking some strain off your server.

You might even release it in an RSS feed, so that some contributors could automatically add it to their torrent clients and start downloading at e.g. max 1 Mb/s while uploading at >1 Mb/s. This will quickly allow the files to spread over the peers and further move downloads away from your server.
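For illustration, creating such a torrent could look like this (a hypothetical example, assuming mktorrent is installed; the tracker URL is a placeholder):
Code:
# -w adds an HTTP webseed, so clients can fall back to the server when no peers are online.
mktorrent -a udp://tracker.example.org:1337/announce \
          -w http://blockdata.loyce.club/alladdresses/addresses_sorted.txt.gz \
          -o addresses_sorted.torrent \
          addresses_sorted.txt.gz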


legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
daily updates also need to be posted there, if possible
This VPS is currently downloading other data from Blockchair, which only allows one connection at a time. I expect this to take another month (at 100 kB/s); after that I can enable daily updates (txt-files with unique addresses for that day) again.

I haven't decided yet how and where to do regular updates to the 20 GB files (this is quite resource intensive).
member
Activity: 310
Merit: 34
Due to another VPS that decided to run off with my prepayment (Lol: for 2 weeks), this data is currently unavailable. I'm not sure yet where to move; if it takes too long, I'll upload the data elsewhere (but in that case without regular backups).

Update:
I've uploaded the latest version to a temporary location: blockdata.loyce.club/alladdresses/.

daily updates also need to be posted there, if possible,
Thanks
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
Due to another VPS that decided to run off with my prepayment (Lol: for 2 weeks), this data is currently unavailable. I'm not sure yet where to move; if it takes too long, I'll upload the data elsewhere (but in that case without regular backups).

Update:
I've uploaded the latest version to a temporary location: blockdata.loyce.club/alladdresses/.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
I am not sure what level of access you have to the AWS account sponsoring your site.
Just root access to loyce.club, but addresses.loyce.club and alladdresses.loyce.club aren't hosted at AWS. This month so far, they've passed 1 TB of traffic, so it was a good call not to use AWS (that would have cost $90).

Quote
However, it is possible to set up a storage bucket so that anyone can access it, provided the requestor's IP address is among the IP addresses of the same region the files are stored in.
That seems like overkill for this.

Quote
Using a database will not solve this problem. There are some things a DB can do to make sorting go from O(n²) to O(n²/n), but this is still exponential growth.
For a database it would only mean checking and adding 750k addresses per day, instead of sorting the entire dataset again. I expect sort to be faster too when the majority of the ("old") data is already sorted, but I haven't tested for speed differences.
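For what it's worth, a daily update could look something like this with e.g. sqlite3 (just a sketch; table and file names are made up). The PRIMARY KEY index does the duplicate check, so only the ~750k new rows get processed:
Code:
sqlite3 addresses.db <<'EOF'
CREATE TABLE IF NOT EXISTS addrs (addr TEXT PRIMARY KEY);
CREATE TEMP TABLE batch (addr TEXT);
.import new_addresses.txt batch
INSERT OR IGNORE INTO addrs SELECT addr FROM batch;
EOF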

Quote
AWS is very reliable.
I have never experienced any downtime with AWS, unlike all VPS providers I've ever used. Those "external projects" don't have much priority to me; if it's down, I don't lose scraping data.

Quote
This works out to approximately a 24-minute download. I measured a download speed of ~125 Mbps using a colab instance.
It's doing the biweekly data update, which probably slowed it down too.
copper member
Activity: 1610
Merit: 1898
Amazon Prime Member #7
I had used AWS as an example because I believed you used it for some of your other projects.
Correct, loyce.club runs on AWS (sponsored).

Quote
Yes, transferring data to the internet is very expensive. You can use a CDN (content delivery network) to reduce costs a little bit. 5 TB of data is a lot.
I highly doubt I'd find a cheaper deal :D I hope not to use the full 5 TB though; I expect some overselling and don't want to push it to the limit.
I am not sure what level of access you have to the AWS account sponsoring your site. However, it is possible to set up a storage bucket so that anyone can access it, provided the requestor's IP address is among the IP addresses of the same region the files are stored in. See this Stack Overflow discussion. You can also set up the storage bucket such that the requestor pays for egress traffic.
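For reference, the requester-pays setting can be switched on with a single AWS CLI call (the bucket name is a placeholder):
Code:
aws s3api put-bucket-request-payment \
    --bucket alladdresses-example \
    --request-payment-configuration Payer=Requester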


Quote
The longer the list, the longer it will take to sort one additional line.
At some point a database might beat raw text sorting, but for now I'm good with this :)
Using a database will not solve this problem. There are some things a DB can do to make sorting go from O(n²) to O(n²/n), but this is still exponential growth.

You make the argument that your input size is sufficiently small such that having exponential complexity is okay, and you may have a point.



I was under the impression that traffic out of the AWS network (for AWS) will count as egress traffic, and will be billed accordingly.
AWS charges $0.09/GB, and especially since this one is sponsored, I don't want to abuse it. I love how stable the server is though; it has never been down.
AWS is very reliable. I would not expect much downtime when using AWS or other major cloud providers. Egress traffic is very expensive though.

Downloads are fast, I've seen 20-100 MB/s. Enjoy :)

This works out to approximately a 24-minute download. I measured a download speed of ~125 Mbps using a Colab instance.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
I'm glad to see this service is being used too:
[image]

I'd love to hear feedback (because I'm curious): what are you guys using this for?
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
It took a while, and the new VPS got a lot slower by now, but I've enabled updates again:
Updates
Sorting a list that doesn't fit in the server's RAM is slow. Therefore I only update both large files (addresses_sorted.txt.gz and addresses_in_order_of_first_appearance.txt.gz) twice a month (on the 6th and 21st; updates take more than a day). Check the file date here to see how old it is. If an update fails, please post here.
In between updates, I create daily updates: alladdresses.loyce.club/daily_updates/. These txt-files contain unique addresses (for that day) in order of appearance.
I won't keep older snapshots.
Downloads are fast, I've seen 20-100 MB/s. Enjoy :)

My latest count: 764,534,424 Bitcoin addresses have been used.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
There's a problem though. There are:
756,494,121 addresses according to addresses_in_order_of_first_appearance.txt.gz
756,524,407 addresses according to addresses_sorted.txt.gz
Obviously, these numbers should be the same. I haven't scheduled automated updates yet; I first want to recreate this data from scratch to see which number is correct.
After recreating this data, I now have 757,437,766 unique addresses (don't click this link unless you want to download 18 GB).
My next step would be to add a few days of data, and count addresses again. Next, I'll recreate all data "from scratch", and see if I end up with the same numbers. I don't know why there's a difference, and I don't like loose ends in my data.
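One way to pin down the difference could be comm (a sketch; it assumes both lists are byte-sorted the same way, e.g. LC_ALL=C):
Code:
zcat addresses_in_order_of_first_appearance.txt.gz | LC_ALL=C sort -u > a.txt
zcat addresses_sorted.txt.gz > b.txt
LC_ALL=C comm -3 a.txt b.txt > diff.txt   # addresses that appear in only one file
wc -l diff.txt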
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
I had used AWS as an example because I believed you used it for some of your other projects.
Correct, loyce.club runs on AWS (sponsored).

I meant on a VPS with another cloud provider with unmetered traffic, such as Hetzner.
I'm not using anything with "unmetered" traffic.



Still working on restoring all data from scratch. I'm curious to see if it matches either of the two existing files.
I don't really get the focus on data traffic though, right after I got a good deal on a new VPS. I'm good for now :)

I was under the impression that traffic out of the AWS network (for AWS) will count as egress traffic, and will be billed accordingly.
AWS charges $0.09/GB, and especially since this one is sponsored, I don't want to abuse it. I love how stable the server is though; it has never been down.
copper member
Activity: 1610
Merit: 1898
Amazon Prime Member #7
~snip

If you have the network capacity then it's better to just serve it locally (except AWS bills your upload traffic too :( )
Your local ISP might not like it very much if you are uploading that much data.

Sorry, when I said locally, I meant on a VPS with another cloud provider with unmetered traffic, such as Hetzner.

I guess I have been doing too much of my work on the cloud to tell the difference anymore.
Ahh, gotcha.

I was under the impression that traffic out of the AWS network (for AWS) will count as egress traffic, and will be billed accordingly. Migrating your data from AWS to GCS will incur a charge from AWS for the full size of your data. There might be ways around this; I'm not sure.
legendary
Activity: 1568
Merit: 6660
bitcoincleanup.com / bitmixlist.org
~snip

If you have the network capacity then it's better to just serve it locally (except AWS bills your upload traffic too :( )
Your local ISP might not like it very much if you are uploading that much data.

Sorry, when I said locally, I meant on a VPS with another cloud provider with unmetered traffic, such as Hetzner.

I guess I have been doing too much of my work on the cloud to tell the difference anymore.
Vod
legendary
Activity: 3668
Merit: 3010
Licking my boob since 1970
Your local ISP might not like it very much if you are uploading that much data.

Quickseller, most ISPs have a download bottleneck - not upload.

So few people upload more than they download that most ISPs don't even restrict uploads. 

What ISP does LoyceV use that does not like uploading?
copper member
Activity: 1610
Merit: 1898
Amazon Prime Member #7
As an FYI, you generally will not want to host files on a server. You will probably want to host files in a storage bucket that can be accessed by a server.
Amazon charges $0.09 per GB of outgoing data; that's ridiculous for this purpose (my current 5 TB bandwidth limit would cost $450 per month when maxed out). And Amazon wants my credit card instead of Bitcoin.
I had used AWS as an example because I believed you used it for some of your other projects.

Yes, transferring data to the internet is very expensive. You can use a CDN (content delivery network) to reduce costs a little bit. 5 TB of data is a lot.

Quote
Separately, sorting lists is not scalable, period.
Actually, sort performs quite well. I've tested:
10M lines: 10 seconds (fits in RAM)
50M lines: 63 seconds (starts using temporary files)
250M lines: 381 seconds (using 2 GB RAM and temporary files)
So a 5 times larger file takes 6 times longer to sort. I'd say scalability is quite good.
I think you are proving my point. The more input you have, the more time it takes to process one additional input.

To put it another way, it takes 1 unit of time to sort a list with a length of 2, it takes 1 + a units of time to sort a list with a length of 3, it takes 1 + a + b units of time to sort a list with a length of 4, and so on. The longer the list, the longer it will take to sort one additional line.

As an FYI, you generally will not want to host files on a server. You will probably want to host files in a storage bucket that can be accessed by a server.

If you want to update a file that takes a lot of resources, you can create a VM, execute a script that updates the file and uploads it to an S3 bucket (on AWS). You would then be able to access that file using another VM that takes fewer resources.

That may save on local resources, but you will be paying a lot of money per month if people download several hundred gigabytes each month, particularly if the files are large, like the files hosted in the OP.

If you have the network capacity then it's better to just serve it locally (except AWS bills your upload traffic too :( )
Your local ISP might not like it very much if you are uploading that much data.
legendary
Activity: 1568
Merit: 6660
bitcoincleanup.com / bitmixlist.org
As an FYI, you generally will not want to host files on a server. You will probably want to host files in a storage bucket that can be accessed by a server.

If you want to update a file that takes a lot of resources, you can create a VM, execute a script that updates the file and uploads it to an S3 bucket (on AWS). You would then be able to access that file using another VM that takes fewer resources.

That may save on local resources, but you will be paying a lot of money per month if people download several hundred gigabytes each month, particularly if the files are large, like the files hosted in the OP.

If you have the network capacity then it's better to just serve it locally (except AWS bills your upload traffic too :( )
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
Thanks for the update; the last .gz you had, I think, was from September.
Correct (August 6 and September 2).

As an FYI, you generally will not want to host files on a server. You will probably want to host files in a storage bucket that can be accessed by a server.
Amazon charges $0.09 per GB of outgoing data; that's ridiculous for this purpose (my current 5 TB bandwidth limit would cost $450 per month when maxed out). And Amazon wants my credit card instead of Bitcoin.

Quote
If you want to update a file that takes a lot of resources, you can create a VM, execute a script that updates the file, and uploads it to a S3 (on AWS) bucket. You would then be able to access that file using another VM that takes fewer resources.
Still, that's quite excessive for just 2 files that are barely used.

Quote
Separately, sorting lists is not scalable, period.
Actually, sort performs quite well. I've tested:
10M lines: 10 seconds (fits in RAM)
50M lines: 63 seconds (starts using temporary files)
250M lines: 381 seconds (using 2 GB RAM and temporary files)
So a 5 times larger file takes 6 times longer to sort. I'd say scalability is quite good.

It just takes a while because it uses temporary disk storage. Given enough RAM, it can utilize multiple cores.
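For reference, an invocation along these lines (sizes and paths are illustrative, not necessarily the exact command used):
Code:
# -S caps the RAM buffer, -T puts the spill files on a chosen disk,
# --parallel spreads the work over cores; LC_ALL=C forces cheap byte comparison.
LC_ALL=C sort -u -S 2G -T /mnt/tmp --parallel=4 \
    addresses_in_order_of_first_appearance.txt > addresses_sorted.txt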

Quote
There are some things you can do to increase the speed, such as keeping the list in RAM or cutting the number of times the entire list is reviewed, but you ultimately cannot avoid the cost of sorting a very large unordered list.
The 256 GB RAM server idea would cost a few dollars per hour, so I'll make do with less.
copper member
Activity: 1610
Merit: 1898
Amazon Prime Member #7
Some results: the awk-thing uses just over 1 GB of memory for 10 million addresses. So for 1.5 billion addresses, a 256 GB server should be enough. At AWS, that would cost a few dollars per hour.
As an FYI, you generally will not want to host files on a server. You will probably want to host files in a storage bucket that can be accessed by a server.

If you want to update a file that takes a lot of resources, you can create a VM, execute a script that updates the file and uploads it to an S3 bucket (on AWS). You would then be able to access that file using another VM that takes fewer resources.
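For illustration, that pipeline could be as simple as this (the bucket and script names are invented):
Code:
./rebuild_address_lists.sh                    # heavy sort/dedupe work on a big, short-lived VM
aws s3 cp addresses_sorted.txt.gz s3://alladdresses-example/ --acl public-read
# ...then terminate the VM; a small instance (or the bucket itself) serves the downloads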

Separately, sorting lists is not scalable, period. There are some things you can do to increase the speed, such as keeping the list in RAM or cutting the number of times the entire list is reviewed, but you ultimately cannot avoid the cost of sorting a very large unordered list.
newbie
Activity: 12
Merit: 3
Just yesterday, I got a good deal on a new VPS (more memory, more disk, more CPU and more bandwidth). It's dedicated to only this project (and I have no idea how reliable it's going to be). I've updated the OP.

There's a problem though. There are:
756,494,121 addresses according to addresses_in_order_of_first_appearance.txt.gz
756,524,407 addresses according to addresses_sorted.txt.gz
Obviously, these numbers should be the same. I haven't scheduled automated updates yet; I first want to recreate this data from scratch to see which number is correct.

Thanks for the update; the last .gz you had, I think, was from September.

legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
Just yesterday, I got a good deal on a new VPS (more memory, more disk, more CPU and more bandwidth). It's dedicated to only this project (and I have no idea how reliable it's going to be). I've updated the OP.

There's a problem though. There are:
756,494,121 addresses according to addresses_in_order_of_first_appearance.txt.gz
756,524,407 addresses according to addresses_sorted.txt.gz
Obviously, these numbers should be the same. I haven't scheduled automated updates yet; I first want to recreate this data from scratch to see which number is correct.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
Are you downloading Blockchair dumps at the slow rate?
Yes. But 100 kB/s isn't a problem anymore: the initial download took a long time, but for daily updates it doesn't take that long.

Quote
I just contacted Blockchair for an API key, which enables people to download at the fast rate, and a support rep told me they cost $500/month.
I thought they'd offer it for free for certain users, but this makes sense from a business point of view.

Quote
If network bandwidth is a problem I'm able to host this on my hardware if you like.
Just this month I'm at 264 GB for this project, and 174 GB for all Bitcoin addresses with a balance. That means this full list is only downloaded a few times per month, but the funded addy list is downloaded a few times per day.
I'm more in need of more disk space for sorting this data, but I haven't decided yet where to host it. 100 GB of disk space isn't enough.