
Intel confirms its dedicated GPU comes out in 2020

It'll be interesting to see what they come up with. It's not going to be easy to break into either the AI or gaming markets, but Intel certainly has the money to spend if need be. I think they're mostly trying to cover the HPC server space with this move, as they're probably worried that most of that spending will veer towards GPUs. 2020 seems like an aggressive schedule, but I guess they already have a lot of the building blocks ready, even if they've never really made a standalone graphics card.
 

LordOfChaos

Member
Is this a real gaming lineup though?


I suspect it will be HPC-focused, but at that point they're nearly at a full GPU, so they'll do some 'may as well' driver development for gaming too.

E.g., see the Nvidia GV100: it's mainly aimed at HPC and at bringing deep learning within academic budgets, but it's also sold as, and works as, a gaming card in case anyone wants one. Intel may not go as big, but expect a similar focus breakdown.

Some of the biggest flaws with Larrabee were the lack of ROPs and TMUs; doing that work in software on general-purpose cores was too slow. They planned on adding ROPs later, but by then it was already dead as a consumer product. Here I see them building an HPC product that's 70% of the way to a gaming card, so they'll do that remaining work anyway, kinda thing.
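To make the ROP point concrete, here's a toy sketch (purely illustrative, nothing like Larrabee's actual code) of the per-pixel blend work a ROP does in fixed-function hardware, which Larrabee had to run as ordinary instructions on its cores:

```python
# Toy illustration: the 'source-over' alpha blend a ROP performs per pixel.
# On a normal GPU this is fixed-function hardware; Larrabee ran it as regular
# x86/vector instructions, paying instruction overhead on every pixel.
def blend_pixel(src, dst, alpha):
    """Blend one RGB source pixel over a destination pixel."""
    return tuple(int(alpha * s + (1.0 - alpha) * d) for s, d in zip(src, dst))

# A 1080p frame is ~2.07 million pixels; at 60 fps that's ~124 million blends
# per second before overdraw -- trivial for dedicated ROPs, costly in software.
print(blend_pixel((255, 0, 0), (0, 0, 255), 0.5))  # -> (127, 0, 127)
```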


http://tomforsyth1000.github.io/blog.wiki.html#[[Why didn't Larrabee fail?]]
 

Solo Act

Member
It feels like Intel could be the next IBM. They're having a heck of a time shrinking their CPUs, to the point where they're rumored to be losing Apple as a customer in the next 5 years. They are irrelevant in the mobile processor business, which is where all of the growth and new money is. Now they're announcing GPUs, and it feels like they're just trying stuff out.

By "next IBM" I mean a tech company that feels like it'll always be in business and profitable, but is a shell of what it once was, to the point where a certain generation of people don't even know what it provides to the industry anymore.

Disclaimer: I don't follow this stuff closely
 
Competition would normally drive prices down. But knowing Intel, the graphics cards will be expensive compared to Nvidia and AMD.

If they price their GPUs like they price their CPUs, that is.

I might be wrong about the whole thing.

We will have to see in 2020.
 

LordOfChaos

Member
Competition would normally drive prices down. But knowing Intel, the graphics cards will be expensive compared to Nvidia and AMD.

If they price their GPUs like they price their CPUs, that is.

I might be wrong about the whole thing.

We will have to see in 2020.


They can price their CPUs like that because they still have that last little bit of performance that puts them on top, which you have to pay disproportionately more for.

To charge that for GPUs, they'd have to gain that same lead over Nvidia, which would be an impressive feat for a first attempt, to say the least. If it doesn't perform much better and they try to charge Intel CPU margins for it, the market will pass on it until they lower prices accordingly. If it does perform, cool: we have a new top dog, and Nvidia and AMD have to respond.

Can't go wrong for consumers, the way I see it.
 
They can price their CPUs like that because they still have that last little bit of performance that puts them on top, which you have to pay disproportionately more for.

To charge that for GPUs, they'd have to gain that same lead over Nvidia, which would be an impressive feat for a first attempt, to say the least. If it doesn't perform much better and they try to charge Intel CPU margins for it, the market will pass on it until they lower prices accordingly. If it does perform, cool: we have a new top dog, and Nvidia and AMD have to respond.

Can't go wrong for consumers, the way I see it.

Yeah, you are right. I don't think people would buy it if it performed worse but still cost more than Nvidia/AMD. Intel would have to lower prices to move stock.

I'm not that knowledgeable about this stuff. Thanks for explaining it.
 
Blue = Intel and Xbox
Green = Nvidia and Switch
Red = AMD and PS5

Sony paid AMD for console exclusivity on their Navi-based APU. MS had no choice but to go with Intel for their next APU.
 

LordOfChaos

Member
While that would be interesting, we have no indication Sony paid for exclusivity on Navi, only that they may have bolstered its R&D, perhaps for early access. Navi has been on PC roadmaps for years, so it's not likely an exclusive.

Microsoft going fully Intel with the new dGPU seems like about the only thing that could be a surprise; otherwise, similar-ish AMD hardware is the most likely outcome for both. Or else Intel and Nvidia, but both have histories of not being the best console partners: they want to control their chip technology, while AMD is happy to co-develop (even now, the Switch's SoC looks identical to a TX1 in die scans).
 

Hendrick's

If only my penis was as big as my GamerScore!
If Intel plans to embrace gaming, which I think is questionable, it would make some sense to partner with Xbox to help market and legitimize that.
 

j^aws

Member

Interesting that this is being called Intel's first discrete GPU, when they already launched one in the 90s with the i740 chip, alongside the AGP bus. Of course, this was before Nvidia coined the term 'GPU' with the GeForce 256. Still, the i740 was a discrete 3D accelerator.

Also, I'm glad that I wasn't the only person thinking that the next Xbox going with Intel would be a nice surprise. Considering the ex-AMD architect leading development, Intel's IP, and their fab expertise, this could be a great collaboration for a 2020 launch for both parties. Risky, though, but Intel tends to have access to the best manufacturing nodes before anyone else, giving them an advantage.
 

Hubble

Member
Excited. Intel has a lot of money to invest in R&D, and hopefully this will help advance GPUs at a quicker rate than now.
 

Tarin02543

Member
Time to show how good their R&D is.

I mean, it cannot be that only one company is responsible for advances in computer graphics.
 

Panajev2001a

GAF's Pleasant Genius
Interesting that this is being called Intel's first discrete GPU, when they already launched one in the 90s with the i740 chip, alongside the AGP bus. Of course, this was before Nvidia coined the term 'GPU' with the GeForce 256. Still, the i740 was a discrete 3D accelerator.

Also, I'm glad that I wasn't the only person thinking that the next Xbox going with Intel would be a nice surprise. Considering the ex-AMD architect leading development, Intel's IP, and their fab expertise, this could be a great collaboration for a 2020 launch for both parties. Risky, though, but Intel tends to have access to the best manufacturing nodes before anyone else, giving them an advantage.

On the other side, both Intel and especially nVIDIA can afford to be, or may decide to be, quite inflexible in terms of customisation, cost-reduction plans, and licensing terms in general. AMD right now is an excellent business partner: it still has quite kick-ass technology and does a lot of semi-custom work for its clients (later incorporating it into its main technology lines).

Also, part of what is making Apple think seriously about its own CPUs in MacBooks is that Intel's foundry advantage over TSMC and GlobalFoundries is shrinking, in terms of how it affects the competitiveness of final silicon in the market.
 

LordOfChaos

Member
Intel tried it before with Larrabee, but that didn't release as a GPU. Will not hold my breath on this, tbh.


Not really much reason to assume any similarity, on the other hand.

Larrabee was a product of their x86-cores-or-bust mentality, with all the overhead that comes with x86 decode multiplied by the number of cores. This isn't: it sounds like a new ground-up GPU. And the current-gen integrated graphics are pretty good for their die area/power use, so they're already capable of making a modern GPU architecture. GPUs are all about scaling those base units up.
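As a back-of-envelope illustration of that scaling argument (the 16 FLOPs/clock per Gen9 EU figure is public; the 192-EU discrete part below is purely hypothetical):

```python
# Rough FP32 throughput from scaling the same base units up.
# Gen9 EUs do 16 FP32 FLOPs per clock; the second config is made up.
def gflops(eus, clock_ghz, flops_per_clock_per_eu=16):
    return eus * clock_ghz * flops_per_clock_per_eu

print(gflops(24, 1.15))   # ~441 GFLOPS: a current GT2 integrated part
print(gflops(192, 1.4))   # ~4300 GFLOPS: a hypothetical scaled-up discrete part
```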
 

Marlenus

Member
Not really much reason to assume any similarity, on the other hand.

Larrabee was a product of their x86-cores-or-bust mentality, with all the overhead that comes with x86 decode multiplied by the number of cores. This isn't: it sounds like a new ground-up GPU. And the current-gen integrated graphics are pretty good for their die area/power use, so they're already capable of making a modern GPU architecture. GPUs are all about scaling those base units up.

The HD 630 is approximately 60mm²; Vega 11 is approximately 100mm². Vega 11 is about 3x faster than the HD 630 when both APUs have the same TDP, so even if Intel were to scale their architecture up, it would be far slower than Vega at the same power use.
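Taking those figures at face value (they're the rough estimates above, not measured numbers), the perf-per-area gap works out to:

```python
# Performance per mm² of die area, normalizing the HD 630 to 1.0.
hd630 = 1.0 / 60     # HD 630: performance 1.0 over ~60mm²
vega11 = 3.0 / 100   # Vega 11: ~3x the performance over ~100mm²

print(round(vega11 / hd630, 2))  # 1.8 -> Vega 11 delivers ~1.8x the perf per mm²
```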
 

LordOfChaos

Member
The HD 630 is approximately 60mm²; Vega 11 is approximately 100mm². Vega 11 is about 3x faster than the HD 630 when both APUs have the same TDP, so even if Intel were to scale their architecture up, it would be far slower than Vega at the same power use.


100mm2 is 2.77x the die area of 60mm2...The whole squared part...


Like I said, they at least have the building blocks for a dedicated GPU; with some further enhancements it could be competitive. Their biggest knock is driver support.
 

magnumpy

Member
Interesting! Three companies pushing GPU tech forward is definitely better than two. It will be a bit of a wait until 2020, but otherwise this is good news.
 

Marlenus

Member
100mm2 is 2.77x the die area of 60mm2...The whole squared part...
100mm × 100mm = 10,000mm²
60mm × 60mm = 3,600mm²

Like I said, they at least have the building blocks for a dedicated GPU; with some further enhancements it could be competitive. Their biggest knock is driver support.

Did you not learn area in school?

A rectangle of width 6mm and length 10mm has an area of what? How about width 10mm and length 10mm, what area does that give you?
 

xwez

Banned
100mm2 is 2.77x the die area of 60mm2...The whole squared part...
100mm × 100mm = 10,000mm²
60mm × 60mm = 3,600mm²

Like I said, they at least have the building blocks for a dedicated GPU; with some further enhancements it could be competitive. Their biggest knock is driver support.

You do know there's a difference between 60mm² and (60mm)²? You are using the latter, i.e. you're assuming a square with a width and length of 60mm and calculating its area. 60mm² is the total area, i.e. already calculated... So it's only about 1.67x bigger.
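For anyone following along, the two readings side by side (simple arithmetic, nothing more):

```python
# 100mm² and 60mm² are already areas, so they compare directly:
print(round(100 / 60, 2))         # 1.67 -> the correct ratio

# Treating 100mm and 60mm as side lengths squares the ratio instead:
print(round((100 / 60) ** 2, 2))  # 2.78 -> where the earlier "2.77x" came from
```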
 

Elios83

Member
More competition is better, and Intel could be pretty huge competition for a company like nVidia that has always tried to keep prices high.
Still, at this point it's not clear if Intel is going to target the high-end market, or if it has the know-how to make a more powerful GPU than top-of-the-line nVidia boards.
They got the lead guy from AMD on board, but it's not like AMD could compete...
So it will be interesting to see how this unfolds.
 

GenericUser

Member
nVidia is clearly holding back new cards because of the whole crypto thing that took off, and because AMD is not a real competitor anymore. I hope Intel smacks their asses.
 

LordOfChaos

Member
You do know there's a difference between 60mm² and (60mm)²? You are using the latter, i.e. you're assuming a square with a width and length of 60mm and calculating its area. 60mm² is the total area, i.e. already calculated... So it's only about 1.67x bigger.

Wow, derp, I feel silly.
Guess I haven't had to find the area of anything in the decade since school, lol.


OK, 1.67x the area for 3x the performance: Intel is behind. But they've already built modern GPUs, and their IGPs are meant to provide low power in average situations. Around the 5000 series, I believe their die area expanded while performance stayed the same, so average power use could drop for the same tasks. It's part of why, even with low-power modes, you'll want to switch to IGP-only for battery life on laptops, hence Optimus/Enduro/now just Windows 10. I still see little reason to believe that, with a performance focus instead, Intel couldn't build a competitive GPU, even if the first take doesn't match Nvidia's efficiency.

https://www.anandtech.com/show/7085/the-2013-macbook-air-review-13inch/4
 
If they support FreeSync out of the box and price their cards right, they could actually go head-to-head with AMD's low-to-mid-range market.
 

camelCase

Member
It feels like Intel could be the next IBM. They're having a heck of a time shrinking their CPUs, to the point where they're rumored to be losing Apple as a customer in the next 5 years. They are irrelevant in the mobile processor business, which is where all of the growth and new money is. Now they're announcing GPUs, and it feels like they're just trying stuff out.

By "next IBM" I mean a tech company that feels like it'll always be in business and profitable, but is a shell of what it once was, to the point where a certain generation of people don't even know what it provides to the industry anymore.

Disclaimer: I don't follow this stuff closely

They're losing their mobile processor market share? I haven't seen a laptop in a very long time that sported a non-Intel processor.
 

ilfait

Member
[image]
 

Solo Act

Member
They're losing their mobile processor market share? I haven't seen a laptop in a very long time that sported a non-Intel processor.
I said they were irrelevant in the mobile market, not that they were losing market share in laptops. The "mobile" market refers to phones and tablets, which is dominated by TSMC and Qualcomm and is seeing markedly larger growth than the stalling computer processor market.
 

dirthead

Banned


Says something about Oculus/VR that he was willing to jump ship given the position he had there.

And it was way too early for Intel to announce this. 2020? Given their history, they need to stay quiet until they can actually deliver something.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Raja did not live up to the hype at AMD as far as I'm concerned, so I am not too optimistic that Intel will be able to offer anything all that worthwhile.

Nvidia badly needs the competition though.
 

IbizaPocholo

NeoGAFs Kent Brockman
https://infosurhoy.com/cocoon/saii/...intel-opening-graphics-chip-plant-in-toronto/

The multinational computing giant plans to set up a new engineering lab for graphics processing units, or GPUs, in North York, just south of Markham, where ATI Technologies Inc. became a global GPU-making powerhouse before being acquired by Advanced Micro Devices, or AMD, for US$5.4-billion in 2006.

As Intel prepares to release its own discrete GPUs by 2020, the lab is scheduled to open Wednesday. It will eventually be home to dozens, if not hundreds, of engineers working on the chips, said Ari Rauch, the company’s vice-president of visual technologies, who previously spent time in the Toronto area as a vice-president with AMD.

“Since I joined Intel three years ago, [Toronto] was always on our radar. So when we were going to expand our dreams, we really wanted to tap into this great talent that exists there,” Mr. Rauch said in an interview.
 

Lort

Banned
It all comes down to ray tracing... AMD, your move next... then it's all up to you, Intel.

Nvidia has always won through efficiency rather than brute force... integrated CPU/GPUs are fastest at rendering video in Premiere... they could also be fastest at ray tracing.
 

llien

Member
After that Crypto market, I suspect.
I don't think so, it's way too volatile for strategic investment.

I think they realized the power of the APU, looking at Sony/MS and the problems Intel had with Apple banning Nvidia.

We haven't seen AMD in a "great CPU and great GPU at the same time" position in the past, which is why we don't realize how cool a product a good APU could be, especially in the mobile space.
 

bigedole

Member
I'm skeptical Intel can pull it off. I think the most interesting implication is, as another poster suggested, that this could be for a home console push. I feel really nervous for Microsoft if they're actually doing this, though; Intel's 10nm has had terrible issues ramping up, and it cost them Apple as a customer.
 

Hudo

Member
I hope they don't go and develop their own proprietary GPGPU solution. They should either go OpenCL (and help to modernise it!) or somehow ask nVidia if they can get into CUDA as well... It would make my future easier if they become a serious player, haha...
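For what it's worth, the appeal of OpenCL here is that the same host code runs against any vendor's runtime. A minimal sketch, assuming Python with pyopencl installed:

```python
# Enumerate every OpenCL platform/device visible on the machine.
# The same code would list Intel, NVIDIA, and AMD runtimes side by side --
# exactly the vendor neutrality that CUDA doesn't offer.
import pyopencl as cl

for platform in cl.get_platforms():
    print(platform.name)
    for device in platform.get_devices():
        print("   ", device.name)
```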
 