
GeraltOfRivia_PL: What stops me from simply making a high-end graphics card myself by copying NVIDIA in every way?
I'm aware that I'm (seriously) answering a joke, but I'd just like to point out that even if you were a talented hardware designer with years of experience in designing GPUs, you could not copy NVIDIA. Unlike AMD, they do not publish any data about the inner workings of their GPUs or drivers.
Post edited January 23, 2021 by WinterSnowfall
high rated
That's like asking why we don't just program our own AAA games instead of buying them.
Dalswyn: How hard can it be to compete with an industry where nanometers are relevant?
Technically speaking, he isn't talking about GPUs but cards.

So basically there's absolutely nothing stopping the OP from ripping the components off his NVIDIA card and then designing his own card to put those components back onto.

It's not even such a far-fetched idea: the first home computers were delivered as DIY kits that electronics geeks assembled themselves.

In any case, if the OP's home-made card even managed to flash a startup message on the screen, it would be an achievement.
Go for it bro...report back on your grand effort
GeraltOfRivia_PL: What stops me from simply making a high-end graphics card myself by copying NVIDIA in every way?

Why don't people make graphics cards on their own?
1) Drivers and compatibility issues
2) Following all the specs
3) Sourcing all the chips and equipment
4) Doing the raw soldering
5) Debugging and getting it to work, and that's just for 2D, before you even touch 3D (see the sketch after this list)
6) The math coprocessors involved
7) It will likely cost you more, and be bulkier and less reliable, than just buying from a company that specializes in it.
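To give a feel for how much points 1 and 5 lean on work the OS and driver stack already do for you, here's a minimal 2D sketch, assuming a Linux box that exposes the display as /dev/fb0 in a 32-bits-per-pixel XRGB mode. Everything this relies on (mode setting, memory mapping, pixel format) is plumbing a home-made card would have to provide itself:

```c
/* Minimal 2D sketch: draw a red square straight into a memory-mapped
 * Linux framebuffer. Assumes /dev/fb0 exists and is in a 32 bpp
 * XRGB mode -- both things the driver stack normally guarantees. */
#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0) { perror("open /dev/fb0"); return 1; }

    struct fb_var_screeninfo var;
    struct fb_fix_screeninfo fix;
    if (ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0 ||
        ioctl(fd, FBIOGET_FSCREENINFO, &fix) < 0) {
        perror("ioctl");
        return 1;
    }

    uint8_t *fb = mmap(NULL, fix.smem_len, PROT_READ | PROT_WRITE,
                       MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED) { perror("mmap"); return 1; }

    /* Fill a 100x100 square at (10,10); 4 bytes per pixel, B-G-R-X. */
    for (uint32_t y = 10; y < 110 && y < var.yres; y++)
        for (uint32_t x = 10; x < 110 && x < var.xres; x++) {
            uint8_t *px = fb + y * fix.line_length + x * 4;
            px[0] = 0x00;  /* blue   */
            px[1] = 0x00;  /* green  */
            px[2] = 0xFF;  /* red    */
            px[3] = 0x00;  /* unused */
        }

    munmap(fb, fix.smem_len);
    close(fd);
    return 0;
}
```

Even this trivial blit only works because a kernel driver has already negotiated a video mode with the card; with unknown hardware there is no /dev/fb0 to open in the first place.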

Arguably only somewhere like China could get away with it, and only by having an established company move over there and hand over all of its specs, research and processes, then manufacturing the cards themselves, perhaps at a fraction of the cost, because all the hard work is already done.

The world's worst video card! Part 2
Post edited January 23, 2021 by rtcvb32
https://dqydj.com/how-to-create-an-fpga-graphics-card/

That is about the best you could realistically hope to get done alone.
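For a flavor of what even that FPGA project has to get right, here's a rough sketch (in C, just to model the counters) of the standard 640x480 @ 60 Hz VGA timing such a card must generate on every single pixel clock:

```c
/* Sketch of 640x480@60Hz VGA timing: the sync-pulse bookkeeping an FPGA
 * "graphics card" must do at every tick of the ~25.175 MHz pixel clock.
 * Standard timing: 800 clocks per line (640 visible + 16 front porch +
 * 96 sync + 48 back porch) and 525 lines per frame (480 + 10 + 2 + 33). */
#include <stdbool.h>
#include <stdio.h>

#define H_VISIBLE 640
#define H_FRONT    16
#define H_SYNC     96
#define H_BACK     48
#define H_TOTAL   (H_VISIBLE + H_FRONT + H_SYNC + H_BACK)  /* 800 */

#define V_VISIBLE 480
#define V_FRONT    10
#define V_SYNC      2
#define V_BACK     33
#define V_TOTAL   (V_VISIBLE + V_FRONT + V_SYNC + V_BACK)  /* 525 */

int main(void) {
    double pixel_clock_hz = 25.175e6;
    double refresh = pixel_clock_hz / (H_TOTAL * V_TOTAL);
    printf("clocks/frame: %d, refresh: %.2f Hz\n",
           H_TOTAL * V_TOTAL, refresh);               /* ~59.94 Hz */

    /* hsync is asserted (low) only during the sync window of each line. */
    for (int x = 0; x < H_TOTAL; x += 100) {
        bool hsync = (x >= H_VISIBLE + H_FRONT) &&
                     (x <  H_VISIBLE + H_FRONT + H_SYNC);
        printf("pixel %3d: hsync %s\n", x, hsync ? "LOW" : "high");
    }
    return 0;
}
```

A real FPGA design does this in a hardware description language, of course; the point is that the card has to hit these sync windows with roughly 40 ns precision, continuously, before a single pixel ever appears.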
avatar
GeraltOfRivia_PL: What stops me from simply making a high-end graphics card myself by copying NVIDIA in every way?

Why don't people make graphics cards on their own?
Aside from the law (a direct copy of an NVIDIA card would break the law, as NVIDIA would argue that you bought one of their cards and tried to sell it as your own), there are the issues with parts, soldering etc. that people have mentioned.

There would also be distribution, and getting it working at all. Any OS nowadays comes with basic graphics drivers for NVIDIA and ATI cards, but your custom card just won't work. Your PC will just think: wtf is this?
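To make that last point concrete: an OS matches a card to a driver by its PCI vendor/device IDs. The toy table and device below are hypothetical, though 0x10DE and 0x1002 are the real NVIDIA and AMD/ATI vendor IDs; an ID the OS has never seen simply binds to nothing:

```c
/* Toy sketch of how an OS picks a graphics driver: match the card's PCI
 * vendor ID against a table of known drivers. 0x10DE (NVIDIA) and 0x1002
 * (AMD/ATI) are real vendor IDs; the table and the card are made up. */
#include <stdint.h>
#include <stdio.h>

struct driver { uint16_t vendor_id; const char *name; };

static const struct driver known_drivers[] = {
    { 0x10DE, "nouveau/nvidia" },  /* NVIDIA  */
    { 0x1002, "amdgpu/radeon"  },  /* AMD/ATI */
};

static const char *probe(uint16_t vendor_id) {
    for (size_t i = 0; i < sizeof known_drivers / sizeof known_drivers[0]; i++)
        if (known_drivers[i].vendor_id == vendor_id)
            return known_drivers[i].name;
    return NULL;  /* no driver claims this device */
}

int main(void) {
    uint16_t homemade_card = 0xBEEF;  /* hypothetical home-made card */
    const char *drv = probe(homemade_card);
    if (drv)
        printf("binding driver: %s\n", drv);
    else
        printf("unknown device 0x%04X: no driver, screen stays dark\n",
               homemade_card);
    return 0;
}
```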
What stops me from simply making a high-end graphics card myself by copying NVIDIA in every way?
The amount of time you spend on this forum. I guess that already qualifies as a plot-stopper for getting anything done.
high rated
Funny you should ask that. I was just thinking of doing this as soon as I complete my exact working replica of the Large Hadron Collider.
low rated
GeraltOfRivia_PL: What stops me from simply making a high-end graphics card myself by copying NVIDIA in every way?

Why don't people make graphics cards on their own?
RoboPond: Aside from the law (a direct copy of an NVIDIA card would break the law, as NVIDIA would argue that you bought one of their cards and tried to sell it as your own), there are the issues with parts, soldering etc. that people have mentioned.

There would also be distribution, and getting it working at all. Any OS nowadays comes with basic graphics drivers for NVIDIA and ATI cards, but your custom card just won't work. Your PC will just think: wtf is this?
Well then we need to create a new PC and microchips for that too.
TerriblePurpose: Funny you should ask that. I was just thinking of doing this as soon as I complete my exact working replica of the Large Hadron Collider.
Well, good news: the Pi Foundation has released a new tiny ultra-budget microcontroller. You're 1% of the way there! :D
DesmondOC: Quite simply, patents and trade secrets
And this is the non-silly answer
Post edited January 24, 2021 by Sachys
We'll miss you, but go ahead and make those graphics cards!
low rated
GeraltOfRivia_PL: What stops me from simply making a high-end graphics card myself by copying NVIDIA in every way?

Why don't people make graphics cards on their own?
RoboPond: Aside from the law (a direct copy of an NVIDIA card would break the law, as NVIDIA would argue that you bought one of their cards and tried to sell it as your own), there are the issues with parts, soldering etc. that people have mentioned.

There would also be distribution, and getting it working at all. Any OS nowadays comes with basic graphics drivers for NVIDIA and ATI cards, but your custom card just won't work. Your PC will just think: wtf is this?
I was thinking more about personal use.
low rated
These are some of my favorite images of an Intel CPU chip under an electron microscope:

https://www.techeblog.com/20-fascinating-images-showing-what-the-inside-of-a-cpu-circuit-looks-like-under-a-microscope/

The ones that start from about 2/3 down the page are particularly good at showing the almost unfathomable scale and complexity. This is a Pentium III from 1999 that had 180 nm transistors. You think you can single-handedly design something from scratch that would rival a modern NVIDIA GPU (~10 nm)? Knock yourself out!

You would probably have more luck trying to recreate New York City brick by brick, by yourself ...
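A back-of-the-envelope comparison makes the gap vivid. The transistor counts below are approximate figures from memory (roughly 28 million for a 180 nm Pentium III Coppermine, roughly 28 billion for a 2020 Ampere-class NVIDIA GPU), so treat this as a rough sketch:

```c
/* Rough scale comparison: a 180 nm Pentium III vs a ~10 nm modern GPU.
 * Transistor counts are approximate figures from memory, not exact. */
#include <stdio.h>

int main(void) {
    double p3_node_nm      = 180.0;
    double gpu_node_nm     = 10.0;
    double p3_transistors  = 28e6;   /* ~Pentium III Coppermine */
    double gpu_transistors = 28e9;   /* ~GA102-class NVIDIA GPU */

    double linear  = p3_node_nm / gpu_node_nm;  /* 18x smaller features */
    double density = linear * linear;           /* ~324x more per area  */
    printf("feature size ratio: %.0fx, ideal density gain: %.0fx\n",
           linear, density);
    printf("transistor count ratio: ~%.0fx\n",
           gpu_transistors / p3_transistors);   /* ~1000x */
    return 0;
}
```

So the chip you'd be trying to rival packs on the order of a thousand times more transistors than the one in those photos.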
Post edited January 24, 2021 by Time4Tea
What you need is an extreme ultraviolet (EUV) lithography machine.

An extreme ultraviolet lithography machine is a technological marvel. A generator ejects 50,000 tiny droplets of molten tin per second. A high-powered laser blasts each droplet twice: the first pulse shapes the tiny droplet so that the second can vaporize it into plasma. The plasma emits extreme ultraviolet (EUV) radiation that is focused into a beam and bounced through a series of mirrors. The mirrors are so smooth that if expanded to the size of Germany they would not have a bump higher than a millimeter. Finally, the EUV beam hits a silicon wafer (itself a marvel of materials science) with a precision equivalent to shooting an arrow from Earth to hit an apple placed on the Moon. This allows the EUV machine to draw transistors onto the wafer with features measuring only five nanometers, approximately the length your fingernail grows in five seconds. This wafer, with billions or trillions of transistors, is eventually made into computer chips.
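That fingernail comparison holds up to a quick sanity check, assuming the commonly cited average nail growth rate of roughly 3.5 mm per month:

```c
/* Sanity check of the analogy: how far does a fingernail grow in 5 s?
 * Assumes the commonly cited average growth rate of ~3.5 mm/month. */
#include <stdio.h>

int main(void) {
    double mm_per_month     = 3.5;
    double seconds_per_month = 30.0 * 24 * 3600;  /* ~2.59e6 s */
    double nm_per_second = mm_per_month * 1e6 / seconds_per_month;
    printf("growth: %.2f nm/s -> %.1f nm in 5 s (EUV feature: ~5 nm)\n",
           nm_per_second, 5 * nm_per_second);
    return 0;
}
```

About 6-7 nm in five seconds, so the analogy is in the right order of magnitude.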


An EUV machine is made of more than 100,000 parts, costs approximately $120 million, and is shipped in 40 freight containers. There are only several dozen of them on Earth, and approximately two years' worth of back orders for more.
It will take all of your Ikea assembly skills to put it together, although you may find your garage a bit cramped.

Given the two-year wait for an EUV machine shipped directly from the factory, you might want to try searching eBay or Craigslist for a second-hand one.