Quadro Cards vs. Games Cards
At work we use Nvidia Quadro cards as they do SDI capture and playback pretty quickly.
The only problem is we're a small company and they cost a shit load. We just got a demo unit in, and the GFX card alone costs around £4000, with the two extra input and output cards bringing the total up to around £8000, or nearly $12,000 (US). That's with a discount.
Now, I just want to know why a Quadro card is over 10 times more expensive than a top-end gaming card. I know you get custom drivers for apps such as 3DSMax, and this particular card has 4GB of RAM on board, which we don't really need to be honest, but even lower-end Quadro cards are still really expensive.
Years back I was actually given a low-end Quadro card which was worth around £500. I swapped out my £90 Nvidia FX5200, stuck C&C Generals on, and was greeted with a few more frames per second. That's it.
So, can anyone say why they're so expensive and who would benefit from the extra features they must offer?
Also, does anyone know of another solution with fast SDI in/out that uses standard gaming cards rather than the obscenely expensive Quadro?
Cheers
---When I'm in command, every mission's a suicide mission!
The difference between a graphics workstation GPU and a gaming GPU is about the same as the difference between a scalpel and a steak knife: yeah, they both do the same general thing (draw/cut stuff), but one is about precision and the other is about speed. When you need a cluster of graphics workstations to render frames of a movie in parallel, you want each of those workstations to produce the exact same results for any given frame, which means you can't rely on a consumer-grade gaming GPU with its many speed-oriented algorithm compromises.
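As an aside (my own illustration, not something from NVidia's material): a lot of those speed-oriented compromises amount to fast-math style shortcuts, where the hardware or driver is free to reorder floating-point operations. Reordering alone is enough to change results, because floating-point addition isn't associative:

```python
# Floating-point addition is not associative, so any optimisation that
# reorders operations (as fast-math style shortcuts do) can change the
# result bit-for-bit -- fine for a game, a problem for a render farm.
left = (0.1 + 0.2) + 0.3   # 0.6000000000000001
right = 0.1 + (0.2 + 0.3)  # 0.6
print(left == right)       # False
```

Same inputs, same maths on paper, different bits out, which is exactly the kind of thing that matters when every node in a cluster has to agree on a frame.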
For the full story, NVidia made this technical brief available. It's old, but it should give you an idea of the type of things going on.
[Formerly "capn_midnight". See some of my projects. Find me on twitter tumblr G+ Github.]
Cheers for that. From reading through the doc you linked to, it does seem quite old, as most consumer cards now support the features mentioned there, but I can imagine the accuracy issue you mentioned is probably the main feature people look for now.
We don't need that at all so all I need to do is find a good SDI solution and we're onto a winner.
The Quadro has a nifty setup where the SDI video can come in on one card, be passed straight onto the graphics card, and then go straight out through the output card. I wonder if that's really much better than mixing cards from different manufacturers that need to use the bus?
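For a rough sense of what crossing the bus actually costs, here's a back-of-envelope estimate. The format numbers are my own assumptions (1920x1080, 10-bit 4:2:2, 30 fps), not figures from the Quadro docs:

```python
# Back-of-envelope bandwidth for uncompressed SDI-rate video crossing the bus.
# Assumed format: 1920x1080 frames, 10-bit 4:2:2 (20 bits per pixel), 30 fps.
width, height = 1920, 1080
bits_per_pixel = 20                       # 4:2:2 at 10 bits per component
fps = 30
bytes_per_frame = width * height * bits_per_pixel // 8
mb_per_sec = bytes_per_frame * fps / 1e6  # one stream, one direction
total_mb_per_sec = 2 * mb_per_sec         # capture in + playback out
print(mb_per_sec, total_mb_per_sec)       # ~155.5 and ~311.0 MB/s
```

Call it roughly 310 MB/s for in plus out: comfortably within what PCIe can move, but enough that a direct card-to-card path saves you copies, latency, and bus contention whenever the GPU is busy with its own transfers.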
Actually, there isn't a lot of technical difference between the Quadro and the consumer cards. The Quadro line has a few features that are vital to professionals, but not to gamers. SDI, quad stereo, external sync, etc. Some of those features are also available on consumer cards, but have been intentionally disabled in order to create a line of pro products. They are more expensive not because of better hardware, but because the target market for these products is able and willing to pay more.
Besides that, NVidia also offers much better service to pro customers. For example, if you find a bug in their pro driver, you have a guarantee that it will be fixed for you.
Thanks for the info guys.
I've also noticed lots of the features seem to be optimised for OpenGL, i.e. OpenGL is mentioned a lot in the documents I've read, and the new cards only support DX10.1 but OpenGL 3.
Seeing as we're using DirectX, and only DX9 at the moment, would that mean we're not getting any extra value from a Quadro card other than the SDI option?
You also get real support and certification. Try getting a driver-related glitch fixed on a GeForce card.
The price is just business. They have extra features but make up less than 4% of the number of graphics cards sold. They are also primarily purchased by those who have funding and who "need" them rather than those who want them. It sucks for a small business where the cost is a bigger percentage, but that's just the way the world works I'm afraid.
Quote: Original post by theZapper
Thanks for the info guys.
I've also noticed lots of the features seem to be optimised for OpenGL, i.e. OpenGL is mentioned a lot in the documents
OpenGL has traditionally been the high end graphics API of choice. DirectX seems to be making inroads but the fact that most high end software packages are cross platform and originally from the UNIX world does limit its uptake somewhat.
Quote: Original post by theZapper
and the new cards only support DX10.1 but OpenGL 3.
That's because OpenGL 3 is DX10.x-based and NV currently don't have a DX11 card on the market (they're releasing the normal cards on March 26th, but expect very, very limited availability into May); when they do, expect a DX11/OpenGL 4 update to the docs.