
Topic: Cheapest V100 x4 or P100 x4 Server. I pay $100s (1-2%) if I go with your config. (Read 327 times)

full member
Activity: 309
Merit: 118
I suppose you have already considered this, but due to economies of scale, it MIGHT sometimes be more cost-effective to use gaming GPUs for your scientific application. I remember someone comparing these against a more professional setup for some medical-imaging software.

The fun part is, you may get them from mining rigs. With OpenCL and CUDA, the sky is the limit.

Unfortunately it does depend on your particular application, so you would have to test them yourself. But don't disregard those gaming GPUs, especially the newer ones that seem aimed more at general scientific compute than at 3D graphics; and those older mining rigs people sometimes get rid of for cheap could give you a surprise...

I guess the fun part is splitting your workload into smaller data chunks to actually make things work...

I've considered it, but the problem is that training these models requires at least 16GB cards. Jukebox, for example, only works with a 16GB card, at least for the 5B model. And older cards like the K80 are much slower than the modern ones.
Is there another setup comparable to a 4x V100 but at a fraction of the cost? I'd prefer not to pay $10,000s if possible; $50,000 is too much.
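To put some numbers on that 16GB requirement, here's a rough back-of-the-envelope estimate. It assumes fp16 weights (2 bytes per parameter), which is an assumption on my part, not Jukebox's exact memory layout, and it counts only the weights; gradients, optimizer state, and activations add more on top during training.

```python
# Rough VRAM estimate for holding model weights alone.
# Assumes fp16 (2 bytes per parameter) -- an assumption, not any
# specific framework's exact layout. Training needs far more than this.

def weights_gib(num_params, bytes_per_param=2):
    """Approximate GiB needed just to store the weights."""
    return num_params * bytes_per_param / 2**30

# Jukebox's largest prior is ~5 billion parameters.
print(f"5B params @ fp16: {weights_gib(5e9):.1f} GiB")
```

That works out to roughly 9.3 GiB for the weights alone, which is why the 5B model already crowds out anything smaller than a 16GB card once activations are counted.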
legendary
Activity: 2030
Merit: 1569
CLEAN non GPL infringing code made in Rust lang
I suppose you have already considered this, but due to economies of scale, it MIGHT sometimes be more cost-effective to use gaming GPUs for your scientific application. I remember someone comparing these against a more professional setup for some medical-imaging software.

The fun part is, you may get them from mining rigs. With OpenCL and CUDA, the sky is the limit.

Unfortunately it does depend on your particular application, so you would have to test them yourself. But don't disregard those gaming GPUs, especially the newer ones that seem aimed more at general scientific compute than at 3D graphics; and those older mining rigs people sometimes get rid of for cheap could give you a surprise...

I guess the fun part is splitting your workload into smaller data chunks to actually make things work...
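The chunking idea above can be sketched in a few lines. This is a toy illustration of the general pattern (process in card-sized pieces, then combine partial results), the same idea behind micro-batching or gradient accumulation on smaller-VRAM cards; the names and sizes are illustrative, not from any particular framework.

```python
# Toy sketch: split a workload into chunks that fit a smaller card,
# process each chunk, then combine the partial results. Illustrative
# names and sizes only -- not any specific framework's API.

def chunked(items, chunk_size):
    """Yield successive fixed-size slices of a work list."""
    for i in range(0, len(items), chunk_size):
        yield items[i:i + chunk_size]

# Pretend each sample needs 1 GB and the card only fits 4 GB at once.
samples = list(range(10))
partials = [sum(chunk) for chunk in chunked(samples, 4)]
total = sum(partials)
print(total)  # same answer as processing everything in one go: 45
```

Whether your real workload decomposes this cleanly is exactly the "you would have to test it yourself" part: some models parallelize over data easily, others need the whole thing resident at once.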
full member
Activity: 309
Merit: 118
Hello There,

I am looking for suggestions. They have to work exactly for what I am looking for, but at the cheapest price. If I go with your configuration, you'll get 1-2% of the server cost, paid to you in Bitcoin.
Servers can run up to $10,000s, so that could be $100s+.

The server must contain at least 4 Tesla V100 GPU cards, 128GB-256GB+ RAM, a 1TB NVMe drive, and a 2-4TB HDD.
Alternatively, suggest a cheaper GPU build that can handle the following applications: OpenAI Jukebox neural networks, StyleGAN2/ImageNet, and other machine learning workloads.

I would like not to pay too much if possible. I currently pay $400/week to rent a V100 server and am looking for a way to save money on long-term usage.
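For anyone suggesting configs, here's the break-even math I'm working from, using the $400/week rental figure and the listing prices in this thread. It ignores resale value, power, and hosting costs, so treat it as a rough floor, not a full TCO comparison.

```python
# Break-even: buying a server outright vs. continuing to rent at $400/week.
# Ignores power, hosting, and resale value -- a rough floor only.

RENT_PER_WEEK = 400

def breakeven_weeks(server_price):
    """Weeks of rental that equal the purchase price."""
    return server_price / RENT_PER_WEEK

for price in (7_999, 24_254):  # the P100 and V100 listings quoted here
    print(f"${price:,}: ~{breakeven_weeks(price):.0f} weeks of rent")
```

So the $7,999 P100 box pays for itself in about 20 weeks of rent, and the $24,254 V100 box in about 61 weeks, assuming I keep the workload running that long.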

I found this: https://www.mydigitaldiscount.com/supermicro-superserver-1029gq-txrt-1u-dual-processor-gpu-server-with-4x-nvidia-tesla-p100-sxm2-gpus-installed/ $7,999 for 4x P100

$24,254: https://www.ebay.com/itm/DELL-R740-256GB-RAM-6x-NVIDIA-Tesla-V100-GPU-Rendering-Server-for-V-Ray/293415919263?_trkparms=aid%3D1110002%26algo%3DSPLICE.SOI%26ao%3D1%26asc%3D225086%26meid%3Dc31852f6d43f4ec7ad30c35cb0619267%26pid%3D100047%26rk%3D6%26rkt%3D12%26sd%3D303373001241%26itm%3D293415919263%26pmt%3D1%26noa%3D1%26pg%3D2047675%26algv%3DSellersOtherItemsV2%26brand%3DDell&_trksid=p2047675.c100047.m2108

Thoughts:

How can some of them be so cheap compared to the actual parts? A V100 GPU is $8,400+; six alone would be over $50,000.

https://www.bhphotovideo.com/c/product/1368043-REG/nvidia_nvidia_900_2h400_0000_000_tesla_p100_16gb_cowos.html
A P100 GPU is $6,000 each, so 4 of them are $24,000. How can that server above be $7,999? Is it a scam?

Is it better to keep renting until the new cards release in September and then wait for prices to drop? I heard the Nvidia 3000 series and the A100 GPU are coming out then.
Still, suggest a server anyway.

