Max number of GPUs?
Does anyone know if there is a hard limit in Windows XP Pro x64 for GPU enumeration? I.e., what's the absolute and/or theoretical maximum number of GPUs that the operating system will support if the motherboard hardware exists?

I am thinking of assembling one computer with six GTX 295 cards in it, for a total of 12 GPUs. Will Windows XP Pro x64 or Windows Vista support that many GPUs?

I am aware of the many issues associated with building such a machine; right now I just need to know if Windows will enumerate that many...
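In case it's useful to anyone else, this is the minimal CUDA check I'd run on the assembled box to see how many devices the runtime actually enumerates. It's just a sketch against the standard runtime API; nothing in it is specific to this configuration:

[code]
// enumerate_gpus.cu -- ask the CUDA runtime how many devices it sees
// and print each one's name. Build with: nvcc enumerate_gpus.cu
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("CUDA devices enumerated: %d\n", count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("  device %d: %s\n", i, prop.name);
    }
    return 0;
}
[/code]

Each GTX 295 should show up as two entries here, so the real question is whether this reports 12 on such a box.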

Jason

#1
Posted 02/23/2009 09:22 PM   
Odds of it working are approximately zero because of system BIOS issues. Eight GPUs are officially supported with R180 drivers in Linux, but I don't know what the max is for Windows.

#2
Posted 02/23/2009 09:42 PM   
[quote name='tmurray' post='509675' date='Feb 23 2009, 03:42 PM']Odds of it working are approximately zero because of system BIOS issues. Eight GPUs are officially supported with R180 drivers in Linux, but I don't know what the max is for Windows.[/quote]

I have 8 GPUs working in Windows, but I'm concerned that that's the maximum number.

#3
Posted 02/24/2009 12:08 AM   
[quote name='Atlas Folder' post='509669' date='Feb 23 2009, 03:22 PM']I am thinking of assembling one computer with 6 GTX295 cards in it for a total of 12 GPUs. Will Windows XP Pro x64 or Windows Vista support that many GPUs?[/quote]

Out of sick curiosity, how do you plan to get 6 cards in one computer?

#4
Posted 02/24/2009 12:51 AM   
[quote name='seibert' post='509729' date='Feb 23 2009, 06:51 PM']Out of sick curiosity, how do you plan to get 6 cards in one computer?[/quote]

[url="http://www.asus.com/products.aspx?l1=3&l2=179&l3=815&l4=0&model=2697&modelmenu=1"]This[/url] plus [url="http://www2.multithread.co.uk/mtcshop/images/linitx.com/products/PCI_Express_X16_Flexi_Riser_Card_-_7cm_Cable_main.jpg"]these[/url] and a custom machined riser card holder. Ideally the whole shebang would be water cooled and overclocked to nearly 22 TeraFLOPS in one (big) box.

That would get in the neighborhood of 108,000 ppd in Folding@Home from the one machine.

Jason

#5
Posted 02/24/2009 03:49 AM   
The official answer, as far as I know, is that right now we support (that is, with every driver release we have, these configurations are tested) eight on Linux and I'm not 100% sure (at least four) on WinXP. Past eight and you've entered crazy configuration land (population: you), but I really do think you'll hit SBIOS limitations before you hit driver problems. I know we have a test machine set up for many-GPU experiments and with 12 GPUs we ran into numerous system BIOS issues that we had to fix before we could even boot to a console.

So unless you can try it first and make sure the BIOS works before you buy anything, I would not risk it. Even then, you are in spooky driver territory, and I won't promise that it works.

#6
Posted 02/24/2009 06:40 AM   
[quote name='tmurray' post='509803' date='Feb 24 2009, 12:10 PM']The official answer, as far as I know, is that right now we support (that is, with every driver release we have, these configurations are tested) eight on Linux and I'm not 100% sure (at least four) on WinXP. Past eight and you've entered crazy configuration land (population: you), but I really do think you'll hit SBIOS limitations before you hit driver problems. I know we have a test machine set up for many-GPU experiments and with 12 GPUs we ran into numerous system BIOS issues that we had to fix before we could even boot to a console.

So unless you can try it first and make sure the BIOS works before you buy anything, I would not risk it. Even then, you are in spooky driver territory, and I won't promise that it works.[/quote]

This is where virtualization could come in handy. Xen (and Hyper-V too, I guess) already supports VT-d hardware.

Many vendors are moving to virtualized compute centers. I know Intel is backing virtualization because there is no need to create or port multi-threaded apps to leverage multi-core hardware, but being under the virtualization umbrella can benefit CUDA as well.

This would alleviate driver issues, but main BIOS (not the BIOS inside the virtual PC) issues might still remain.
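To illustrate, here is a minimal sketch of what the Xen side might look like, assuming a dom0 kernel with pciback; the guest name and PCI address are hypothetical:

[code]
# /etc/xen/cuda-guest.cfg -- sketch of PCI passthrough for one GPU.
# Assumes dom0 has hidden the device from its own drivers, e.g. by
# booting with pciback.hide=(01:00.0) (exact syntax varies by setup).
name   = "cuda-guest"
memory = 4096
vcpus  = 4
# Hand the device at bus:slot.func 01:00.0 to the guest (needs VT-d).
pci    = [ '01:00.0' ]
[/code]

Whether the NVIDIA driver inside the guest is happy with a passed-through GPU is another question entirely.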


#7
Posted 02/24/2009 07:06 AM   
[quote name='tmurray' post='509803' date='Feb 24 2009, 12:40 AM']The official answer, as far as I know, is that right now we support (that is, with every driver release we have, these configurations are tested) eight on Linux and I'm not 100% sure (at least four) on WinXP. Past eight and you've entered crazy configuration land (population: you), but I really do think you'll hit SBIOS limitations before you hit driver problems. I know we have a test machine set up for many-GPU experiments and with 12 GPUs we ran into numerous system BIOS issues that we had to fix before we could even boot to a console.

So unless you can try it first and make sure the BIOS works before you buy anything, I would not risk it. Even then, you are in spooky driver territory, and I won't promise that it works.[/quote]
Oh, that makes me want to try it all the more. I have 15 GTX 295s now; a $370 motherboard isn't going to slow me down.

I'll send Asus an email (that probably won't get a response) and ask, but the fact that the board has 6 PCIe slots [i]implies[/i] that the BIOS can handle the devices, which means that problems beyond that would be OS/driver stuff that might be resolvable through virtualization.

Interesting!

Jason

#8
Posted 03/01/2009 11:23 PM   
Why not just put them in separate boxes? I would think that putting all the cards in one box would actually be slower due to hitting other system bottlenecks (CPU/Memory/HDD/internal busses).

Also, for extra ridiculousness points, get this ([url="http://www.koolance.com/water-cooling/product_info.php?product_id=372"]http://www.koolance.com/water-cooling/prod...?product_id=372[/url]) to cool the whole thing, and overclock all the cards for extra speed.

#9
Posted 03/02/2009 04:11 AM   
[quote name='profquail' post='512442' date='Mar 1 2009, 10:11 PM']Why not just put them in separate boxes? I would think that putting all the cards in one box would actually be slower due to hitting other system bottlenecks (CPU/Memory/HDD/internal busses).

Also, for extra ridiculousness points, get this ([url="http://www.koolance.com/water-cooling/product_info.php?product_id=372"]http://www.koolance.com/water-cooling/prod...?product_id=372[/url]) to cool the whole thing, and overclock all the cards for extra speed.[/quote]
I have 15 GTX 295s spread across different machines already, dedicated to Folding@Home 24/7. The thought was to make a technology demo, but after spending some time on the Asus forums it appears to be a dead end. Someone there posted:

[quote]From what I understand, the first 3 PCIe sockets are linked to the NF200 chipset and only work for graphics cards.

The other 3 PCIe sockets are linked to the X58 chipset. You can use these for sound cards and other PCIe cards.[/quote]
My ridiculousness points are already fairly high; I wanted something truly outrageous. ([url="http://atlasfolding.com/"]atlasfolding.com[/url])

Jason

#10
Posted 03/02/2009 05:51 AM   
I don't think that's true. NF200 is a PCIe switch + some special sauce for NV GPUs, but as far as I know there's nothing preventing PCIe links from NF200 from being used with non-NV GPUs.
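If someone with the board wants to check, the topology (which slots hang off the NF200 switch versus the X58 root ports) is visible from a Linux live CD with the stock pciutils tree view, e.g.:

[code]
# show the PCI device tree; NF200 appears as a PCIe bridge with
# its downstream slots behind it
lspci -tv
[/code]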

#11
Posted 03/02/2009 08:52 AM   