There is so much speculation about what AMD will do with the ATi and Radeon names, what this merger means for nVidia and the video card industry as a whole, motherboard chipsets and so on, that I thought I would set some people straight.
The first thing AMD is going to do is spend a ridiculous amount of money with market research companies to determine the relative strengths of the ATi and AMD brands along with the Radeon brand. These research companies will determine that ATi should go and AMD should spend their money broadening the AMD brand rather than split their efforts between the two acronyms.
The next bit of research, again costing ridiculous sums of money, will concern what to call the Radeon even though the previous research already said to keep the name. This research will confirm that Radeon should stay, and AMD marketing execs will, at that time, notice the wonderful fit with the CPU line. Still don't see it? Let me point it out for you: Opteron, Athlon, Sempron and, wait for it… Radeon. Get your game "on," get your chipset "on," get your CPU "on." Oh, the marketing slogans make my head hurt. Too bad they won't see this first; I'd be willing to send them a bunch of charts and ask a few friends for at least half of what they will pay to figure this out on their own.
What does the future hold? Graphics on the CPU, of course. The current on-board video solution found on many ATi Xpress 200-based motherboards will already play most of today's games. Not very well, granted, but at least they will work. Certainly a far cry from SiS graphics.
Both Intel and AMD will soon be releasing CPUs with four cores. With the almost total lack of multi-threaded applications for the 99% of people using desktop computers, these cores will largely sit idle. On a side note, look for an upsurge in SETI calculations. Within the next two years you should expect to see eight-core CPUs. Server people, rejoice; desktop people, just sit there and warm your toes by your new gigawatt processor.
Rather than waste four cores too many on the desktop, look for that die real estate to be given over to the GPU. The space is smaller than what would be available on a full-fledged video card, but the resulting performance loss can be offset by the faster communication between the CPU side and the GPU side of the new design. The initial offering will be great for home theater designs and previous-generation gaming.
About 18 months after that, you will see motherboards with enough throughput to provide some light at the end of the tunnel for more advanced games until, finally, the outboard video card goes away completely.
A quick look at history will show how obvious this is. If you are over thirty you have heard stories, and if you are over forty you probably had one: a math co-processor. For those of you who don't know, it used to be possible to buy a second processor that would help the main processor with math.
There are other examples as well. The L3 cache used to be upgradeable on some motherboards. The north bridge is still present on AMD-based boards, but the CPU already has one built in, so you can look for that chip to go away in the near future.
Companies have been asking AMD for a motherboard chipset for years, with the idea that it would be the most reliable choice. Not a bad assumption. So AMD brings the north bridge onto the CPU and they are one step closer, lacking only the functionality that goes beyond simply letting the computer work. Now, with ATi's chipset engineers on board, things can improve. Look for the north bridge to finally disappear completely.
Here's a question no one has asked yet, but you can bet it's coming: who will buy Creative first, AMD or Intel? With the obvious trend toward bringing all of the computing under one piece of silicon, the question is not whether it will happen but when.
So where does all this leave nVidia? The super-tight integration of AMD graphics with AMD chipsets and AMD CPUs spells the end for them. While AMD gets all the kinks worked out, nVidia might see a little surge in video card sales, but it will be short-lived. They only have one platform for which they can build chipsets and video cards. Intel has already snarfed up a bunch of 3DLabs graphics guys to further their "return" to the graphics market, and they have never liked other people making chipsets for their CPUs. That leaves nVidia trying to survive just long enough to keep their stock price high enough to minimize the bloodbath their shareholders will take when Intel buys them out.
What does this mean for consumers? The savvy shopper doesn't upgrade their machine very often, and when they do it requires, at a minimum, a new motherboard, CPU, video card and often RAM. The only difference is that two of those things will, eventually, be combined into one.
This is the way of today's world. Just go to your local Wal-Mart and you can see it there, too. I remember a time when, in one day and 30 miles, I could go to a grocery store, clothing store, shoe store, auto parts store and liquor store.
-Mr. Apothegm
Reader E-Mail: AMD/ATi Merger
- Apoptosis
- Site Admin
A reader named Mr. Apothegm sent this e-mail to me this morning and I thought I'd share it with you all and see what you think of his thoughts.
First off, thanks for sharing, Apop. Very interesting read. I would guess that within the next 1-2 years we will witness a sort of computer revolution, or evolution if you will. Amazingly, this might be more than just science fiction, but a look at the very near and real future.
I guess this isn't the time to buy Nvidia stock...
Instead of AMD hiring some market research company, why not utilize sites like this to get real-world opinions? There's enough discussion going on here regarding things of this nature to answer their questions. If not, then I'm for hire...
Razorbacx
- HONkUS
- Legit Extremist
I would like to know what this guy's qualifications are to make such bold predictions for the future. If you go back just a few years, things such as optical mice, pixel/vertex shaders, SLI, Crossfire, dedicated sound card memory, and multi-core CPUs were just ideas at best. It's IMPOSSIBLE to know for sure what is going to happen in the next 5 years. AMD may never recover from Conroe, and does anyone remember when Nvidia acquired 3dfx? Remember the first collaboration between the former 3dfx engineers and the Nvidia engineers, the horrible GeForce FX line? The same type of disaster might happen to AMD/ATI. PCs in general are evolving faster than ever before: Shuttle is developing a DIY car-based PC, and media center PCs are actually becoming somewhat common. Also, the line between Pocket PCs and cell phones is almost nonexistent anymore. Any one of these areas can be a potential savior or destroyer of a company based on how that company exploits that trend. This is a random emailer, and until I see some clue that he has some type of inside knowledge, I'm not taking what he and his crystal ball say seriously.