Current Events > Intel Arc is for sure gonna be cancelled

Topic List
Page List: 1, 2
DarkRoast
08/15/22 4:09:37 PM
#1:


TLDR

It has no native DX9 support (DX9 runs through a translation layer)
DX10 and DX11 games run like garbage
Intel's focused all its energy on DX12

Only problem? The A770 (the highest end model) seems to benchmark right around RTX 3060 (non-Ti) at very best.
That's kind of a problem as it is, but it gets worse because there are tons of games that don't work well with Intel's drivers. And the RTX 4000 series is right around the corner, putting Intel in a virtually impossible price bracket. The RTX 4060 Ti will almost certainly gun for the $299-$399 range and will have performance around RTX 3070 to RTX 3080 levels. So Intel's top-of-the-line card would have almost half the performance of NVIDIA's low-end dedicated GPU, and will need to be less than $299.

And that leaves the A750 and A730, the other two models, in a bizarre no-man's land of pricing.
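
Back-of-the-envelope math, if you want it (the relative performance scores and prices below are my rough guesses from above, not real benchmarks):

```python
# Rough price/performance sketch using made-up relative scores
# (RTX 3060 = 100 as the baseline; none of these are benchmark results).
cards = {
    # name: (assumed relative performance, assumed price in USD)
    "RTX 3060":    (100, 329),
    "Arc A770":    (100, 299),  # ~3060-level at best, per the claim above
    "RTX 4060 Ti": (190, 399),  # guessed ~3070-3080 level at $399
}

for name, (perf, price) in cards.items():
    print(f"{name:12s} {perf / price:.3f} perf/$")
```

Even at $299, the A770 only barely edges the 3060 on performance per dollar, and it gets buried by a hypothetical $399 4060 Ti at those guessed numbers.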


---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Axiom
08/15/22 4:12:46 PM
#2:


Intel is a joke these days anyway. If you own a PC for gaming and it doesn't have an AMD processor then you're being ripped off
... Copied to Clipboard!
DarkRoast
08/15/22 4:15:30 PM
#3:


Axiom posted...
Intel is a joke these days anyway. If you own a PC for gaming and it doesn't have an AMD processor then you're being ripped off

12000 series seems OK, especially in the midrange

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Ratchetrockon
08/15/22 4:18:18 PM
#4:


Axiom posted...
Intel is a joke these days anyway. If you own a PC for gaming and it doesn't have an AMD processor then you're being ripped off

the 12900k seems to be the best for running that ps3 emulator so prob still worth buying for those into that stuff

---
I'm a Taurus
I like collecting headphones and iems. My fave game of all time is DMC 3 SE on PC w/ style switch mod. IMO it the best button masher on earth
... Copied to Clipboard!
FL81
08/15/22 4:21:33 PM
#5:


That sucks, looks like we're stuck with the duopoly

---
Thanks to Proofpyros for the sig images
https://i.imgur.com/Nv4Pi1v.jpg https://i.imgur.com/N43HJYv.jpg
... Copied to Clipboard!
regina1914
08/15/22 4:21:36 PM
#6:


Ratchetrockon posted...
the 12900k seems to be the best for running that ps3 emulator so prob still worth buying for those into that stuff
Too bad you need the radiator of a truck to keep it cool
... Copied to Clipboard!
DarkRoast
08/15/22 4:23:08 PM
#7:


The saddest part of all is that they had an absolutely golden opportunity to make a splash during the chip/GPU shortage: unlike AMD and NVIDIA, Intel actually owns its chip manufacturing process from top to bottom, so they've been more or less unaffected by the whole shortage.

If they had swooped in with an RTX 3060 at $300-$400 at that time, they'd have made a killing.

And now the exact opposite is happening - many of the cards are actually falling below MSRP (especially the high end 3090 / 6900 models).
So now they're gonna have way overpriced and underperforming cards.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
#8
Post #8 was unavailable or deleted.
Kamen_Rider_Blade
08/15/22 4:56:48 PM
#9:


The bigger problem is that Intel keeps slipping on delivering its products when promised.

Intel is delaying its VERY FIRST chiplet-based CPU (Meteor Lake) to 2024.
https://www.theverge.com/23294064/intel-deny-meteor-lake-delay-2023-2024
https://www.pcgamer.com/intel-meteor-lake-cpu-delays/
AMD has been shipping multi-die CPUs since first-gen EPYC in 2017, and full chiplet designs since Zen 2 AKA the Ryzen 3000 series in mid-2019

-

Intel's Sapphire Rapids "Server CPU" to compete with EPYC has been delayed
https://www.tweaktown.com/news/87720/intel-xeon-sapphire-rapids-massive-delay-an-entire-year-and-some/index.html
Intel's new Xeon 'Sapphire Rapids' CPU was due in Q1 2022 and has now been delayed until Q1 2023, surprising exactly no one.

-

Intel's Arc launch debacle is one delay after another, with a HIGH CHANCE of cancellation of the consumer end of Arc.

Who knows; right now it's up to Pat Gelsinger and the other C-suite VPs to decide, but the rumor on the street is that many of them are calling for Arc's head to be placed on the guillotine.
It's not just losing A LOT of $$$, it's also costing Intel a lot of goodwill and reputation.

-

Intel canceled its one good technology that it created from scratch with the help of Micron: Optane.
https://www.tomshardware.com/news/intel-kills-off-all-optane-only-ssds-for-consumers-no-replacements-planned

-

Intel's upcoming Raptor Lake is launching later than expected.
https://www.hardwaretimes.com/intel-13th-gen-raptor-lake-processors-to-launch-much-later-than-amds-ryzen-7000-report/

-

This is not to mention the HISTORY of delays, like the 10 nm & 14 nm process nodes for its fabs
https://gamefaqs.gamespot.com/a/user_image/8/4/2/AAAFvAAADkO6.jpg

https://gamefaqs.gamespot.com/a/user_image/8/4/3/AAAFvAAADkO7.png

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
xXfireglzXx
08/15/22 5:01:13 PM
#10:


I'm really hoping they wind up tanking the stock and replacing the whole C-suite with people who can salvage the situation afterwards.

---
Official ANCIENT Killer. I make account bets from time to time so either man up or shut up. The Quote is dedicated to those we have lost.
... Copied to Clipboard!
DarkRoast
08/15/22 5:02:34 PM
#11:


Intel is such a dumpster fire of management lately.
With the resources they have, there's really no excuse at all.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
regina1914
08/15/22 5:14:21 PM
#12:


I'd love to see 3 players competing in the dGPU market but it's looking really bad for Intel/Arc right now. They just might cut their losses and bail before it even takes off properly
... Copied to Clipboard!
Fony
08/15/22 5:15:31 PM
#13:


It will come out in some form, probably OEM-only just like in China. Several execs are weighing the idea of canceling it, though, and there is still nothing concrete about the launch or even a bill of materials for their board partners. One AIB said they don't even want Arc and will do the bare minimum, lol.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
DarkRoast
08/15/22 5:19:09 PM
#14:


Such a damn shame. AMD is so weird about their GPUs - they kinda just dump them out and disappear. They're good, don't get me wrong, but they don't seem to be "in it to win it" like they are with Ryzen.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Fony
08/15/22 5:20:02 PM
#15:


Kamen_Rider_Blade posted...
The bigger problem is that Intel keeps slipping on delivering its products when promised.

Intel is delaying its VERY FIRST chiplet-based CPU (Meteor Lake) to 2024.
https://www.theverge.com/23294064/intel-deny-meteor-lake-delay-2023-2024
https://www.pcgamer.com/intel-meteor-lake-cpu-delays/
AMD has been shipping multi-die CPUs since first-gen EPYC in 2017, and full chiplet designs since Zen 2 AKA the Ryzen 3000 series in mid-2019

-

Intel's Sapphire Rapids "Server CPU" to compete with EPYC has been delayed
https://www.tweaktown.com/news/87720/intel-xeon-sapphire-rapids-massive-delay-an-entire-year-and-some/index.html

-

Intel's Arc launch debacle is one delay after another, with a HIGH CHANCE of cancellation of the consumer end of Arc.

Who knows; right now it's up to Pat Gelsinger and the other C-suite VPs to decide, but the rumor on the street is that many of them are calling for Arc's head to be placed on the guillotine.
It's not just losing A LOT of $$$, it's also costing Intel a lot of goodwill and reputation.

-

Intel canceled its one good technology that it created from scratch with the help of Micron: Optane.
https://www.tomshardware.com/news/intel-kills-off-all-optane-only-ssds-for-consumers-no-replacements-planned

-

Intel's upcoming Raptor Lake is launching later than expected.
https://www.hardwaretimes.com/intel-13th-gen-raptor-lake-processors-to-launch-much-later-than-amds-ryzen-7000-report/

-

This is not to mention the HISTORY of delays, like the 10 nm & 14 nm process nodes for its fabs
https://gamefaqs.gamespot.com/a/user_image/8/4/2/AAAFvAAADkO6.jpg

https://gamefaqs.gamespot.com/a/user_image/8/4/3/AAAFvAAADkO7.png

Did you see that AMD supposedly moved back Zen 4's launch to the same day Intel demos Raptor Lake (9/27)?

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:20:56 PM
#16:


xXfireglzXx posted...
Im really hoping they wind up tanking the stock and replace the whole C-suite with someone who can salvage the situation afterwards.
You should've seen how bad Intel was when Bob Swan was in charge.

Bob Swan was the "Bean Counter" running Intel into the ground.

Pat Gelsinger is an actual technical guy first, and an Intel alumnus who helped pioneer critical Intel CPUs in the past.

https://www.oregonlive.com/silicon-forest/2021/02/with-new-ceo-pat-gelsinger-intel-looks-to-its-past-in-hopes-of-securing-the-future.html

Pat Gelsinger is a literal protégé of Intel founder Andy Grove.

It's like the "Prodigal Son" returning home to take over the business.

Intel was MUCH worse while Bob Swan was in power for 2 years.

Especially since AMD rose from the ashes with Zen/Ryzen under Lisa Su's leadership, embarrassing Intel across the board with each new iteration until they finally surpassed Intel.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:24:42 PM
#17:


Fony posted...
Did you see that AMD supposedly moved back Zen 4's launch to the same day intel demos Raptor Lake(9/27)?
Yup! AMD will announce Zen 4 early, but delay the launch to spoil Intel's news-cycle moment.

That's AMD pulling a Jensen Huang / nVIDIA playbook move.

nVIDIA has done that to AMD countless times, now it's time for AMD to do it to Intel.

They want to spoil Intel's announcement/unveiling of Raptor Lake by showing off that you can "Buy Zen 4" when Raptor Lake gets shown off to the press.

It might also delay the review embargo for Zen 4 to whenever Zen 4 actually launches, giving reviewers more time with the CPU for a more thorough review.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:28:16 PM
#18:


DarkRoast posted...
Such a damn shame. AMD is so weird about their GPUs - they kinda just dump them out and disappear. They're good, don't get me wrong, but they don't seem to be "in it to win it" like they are with Ryzen.
AMD is "in it to win it".

They just realize that it's not going to be a "One and Done" situation.

It's going to take countless generations of "Winning" against nVIDIA and delivering on consistent execution to win the consumers back.

nVIDIA's reputation is fearsome for a reason: when ATi / AMD screwed up in the past, they ceded a lot of marketshare to nVIDIA across all segments.

From Consumer DIY / Retail / Mobile to Professional.

Now it's slowly clawing its way back, at a lot of expense to AMD.

And they realize that mind share isn't going to happen overnight.

Just look at yourself: you aren't convinced yet to buy an AMD GPU.

AMD has to work harder to deliver on its products to convince you to buy an AMD GPU again.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
refmon
08/15/22 5:29:01 PM
#19:


but can it run Doom?

---
If you read this signature, then that meant that I had control of what you read for 5 SECONDS!!
... Copied to Clipboard!
DarkRoast
08/15/22 5:29:18 PM
#20:


NVIDIA is obnoxious with how it controls prices and feature creep, but at least they make competitive top-tier products. Intel has just been swirling the drain for a half-decade.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:31:04 PM
#21:


DarkRoast posted...
NVIDIA is obnoxious with how it controls prices and feature creep, but at least they make competitive top-tier products. Intel has just been swirling the drain for a half-decade.

How much better would AMD's GPU have to be than nVIDIA's to get you to buy their GPU?

Give me a % of performance better.

What about features: how many more features would AMD need to convince you to swap from Team Green to Team Red?

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
DarkRoast
08/15/22 5:34:59 PM
#22:


I was a Radeon fan all throughout the 2000s. But to be honest, if I'm going to do a comparison, I consider NVIDIA to be like Ryzen and AMD (ironically) to be like Core i9.

Core i9 does beat Ryzen CPUs in single-threaded performance sometimes, but GENERALLY SPEAKING Ryzen is just a better product overall, even if there are a couple of things it does worse.

Likewise, the AMD cards feel like they're lacking the kind of support NVIDIA puts into GeForce. NVIDIA is so far ahead of AMD in general software optimization and support, even though they're about even in rasterization. In addition, I feel like AMD basically price-matching NVIDIA instead of undercutting them makes it awfully hard to justify. The only tier where AMD undercut NVIDIA was Enthusiast (6900 XT vs. RTX 3090), but both cards were stupidly overpriced relative to performance; it's just that AMD's was less stupidly overpriced.

I applaud AMD's FSR 2.0 big time. FSR 1.0 was, pardon my language, a fucking dumpster fire. I'm hoping AMD pushes harder into image reconstruction, because DLSS put NVIDIA way ahead of AMD in the RTX 2000 and 3000 generations.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Fony
08/15/22 5:35:27 PM
#23:


DarkRoast posted...
NVIDIA is obnoxious with how it controls prices and feature creep, but at least they make competitive top-tier products. Intel has just been swirling the drain for a half-decade.

NVIDIA and AMD collude on GPU pricing. If Intel Arc were any good you'd just have all 3 colluding; Intel is the reason they drive pricing up to begin with. NVIDIA and AMD got caught exchanging e-mails stating they needed to raise the status of their brands to be seen as luxury brands "like Intel" in order to command higher prices.

Kamen_Rider_Blade posted...
How much better would AMD's GPU have to be than nVIDIA's to get you to buy their GPU?

Give me a % of performance better.

What about features: how many more features would AMD need to convince you to swap from Team Green to Team Red?

AMD GPUs are fine for desktop gaming.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
Fony
08/15/22 5:37:34 PM
#24:


DarkRoast posted...
I was a Radeon fan all throughout the 2000s. But to be honest, if I'm going to do a comparison, I consider NVIDIA to be like Ryzen and AMD (ironically) to be like Core i9.

Core i9 does beat Ryzen CPUs in single-threaded performance sometimes, but GENERALLY SPEAKING Ryzen is just a better product overall, even if there are a couple of things it does worse.

Likewise, the AMD cards feel like they're lacking the kind of support NVIDIA puts into GeForce. NVIDIA is so far ahead of AMD in general software optimization and support, even though they're about even in rasterization. In addition, I feel like AMD basically price-matching NVIDIA instead of undercutting them makes it awfully hard to justify. The only tier where AMD undercut NVIDIA was Enthusiast (6900 XT vs. RTX 3090), but both cards were stupidly overpriced relative to performance; it's just that AMD's was less stupidly overpriced.

I applaud AMD's FSR 2.0 big time. FSR 1.0 was, pardon my language, a fucking dumpster fire. I'm hoping AMD pushes harder into image reconstruction, because DLSS put NVIDIA way ahead of AMD in the RTX 2000 and 3000 generations.

AMD was recovering from Intel nearly bankrupting them illegally, and from blowing what money they had left on the ATI acquisition. NVIDIA has about a 10-year head start on both software talent and software features.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
DarkRoast
08/15/22 5:38:53 PM
#25:


AMD needs to embrace its ATI roots and come out with something like the HD 4850 - a card that came out at $199 and just obliterated all of NVIDIA's midrange GPUs at half the price. It even surpassed NVIDIA's $600 ultra enthusiast GPU in a few games. They need that kind of edge.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:40:09 PM
#26:


Fony posted...
NVIDIA and AMD collude on GPU pricing. If Intel ARC were any good you'd just have all 3 colluding, intel is the reason they drive pricing up to begin with. NVIDIA and AMD got caught exchanging e-mails stating they need to raise the status of their brands to be seen as a luxury brand "like Intel" in order to command a higher price.
nVIDIA & AMD were already sued over that and settled.

Nvidia settles price-fixing lawsuit
https://www.theregister.com/2008/09/30/nvidia_settles_lawsuit/
Out-of-court deal cut with plaintiffs

The major problem is nVIDIA has been in the lead for so long that AMD's only real option is to counter punch and counter move nVIDIA.

Ergo nVIDIA sets the pricing, and AMD reacts.

Fony posted...
AMD GPUs are fine for desktop gaming.
I concur, that's why I'm swapping to Team Red from now on.

But what each individual chooses is up to them.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
DarkRoast
08/15/22 5:41:20 PM
#27:


Frankly, I don't care which brand of GPU I own, but AMD has got to find another way to compete. Either significantly lower prices with similar performance (the ATI way) or significantly better performance at the same price (the Ryzen way). They're not gonna catch up to NVIDIA in drivers and software support. And that's OK as long as they have better value in other areas. Again, like the HD 4850 absolutely spanking the GeForce GTX 260.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Fony
08/15/22 5:41:33 PM
#28:


Kamen_Rider_Blade posted...
nVIDIA & AMD were already punished for that by the US Government and fined.

They still do it, they just don't send e-mails about it.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:42:42 PM
#29:


DarkRoast posted...
AMD needs to embrace its ATI roots and come out with something like the HD 4850 - a card that came out at $199 and just obliterated all of NVIDIA's midrange GPUs at half the price. It even surpassed NVIDIA's $600 ultra enthusiast GPU in a few games. They need that kind of edge.
That was a special time in GPU history.

You might need to wait a few generations before AMD can pull it off with chiplets.

So far, nVIDIA's Lovelace (upcoming) & Blackwell (next in line after Lovelace) are all large monolithic GPUs.

If nVIDIA can't figure out chiplets, AMD is going to come up with a Chiplet solution that is going to make nVIDIA look like Intel right now.

AMD is doing version 1 of chiplets for GPUs with the RDNA 3 / RX 7000 series

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
DarkRoast
08/15/22 5:43:53 PM
#30:


Fony posted...
They still do it, they just don't send e-mails about it.

The price collapse of this generation of GPUs clearly shows that they were running a cartel-like model of deliberate scarcity. Apparently NVIDIA has so many RTX 3000 GPUs just sitting around that they decreased the price of the RTX 3090 Ti by like 50%.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:44:03 PM
#31:


Fony posted...
They still do it, they just don't send e-mails about it.
They also don't talk to each other about it.

AMD just reacts to nVIDIA and counter prices them.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
DarkRoast
08/15/22 5:45:39 PM
#32:


Kamen_Rider_Blade posted...
That was a special time in GPU history.

You might need to wait a few generations before AMD can pull it off with chiplets.

So far, nVIDIA's Lovelace (upcoming) & Blackwell (next in line after Lovelace) are all large monolithic GPUs.

If nVIDIA can't figure out chiplets, AMD is going to come up with a Chiplet solution that is going to make nVIDIA look like Intel right now.

AMD is doing version 1 of chiplets for GPUs with the RDNA 3 / RX 7000 series

Honestly, I think the difference between NVIDIA and Intel is that NVIDIA actually does make highly refined and competitive products consistently. Which means AMD's gotta find a way to get ahead of NVIDIA in something, like you mentioned. Otherwise, they're just gonna be trailing NVIDIA at least in the PC space.

Kamen_Rider_Blade posted...
They also don't talk to each other about it.
AMD just reacts to nVIDIA and counter prices them.

Yeah they did that with the 6800 XT. But they only made it $50 cheaper which was a real WTF considering the gap in intrinsic feature sets between the 3080 and the 6800 XT.


---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:46:18 PM
#33:


DarkRoast posted...
The price collapse of this generation of GPUs clearly shows that they were running a cartel-like model of deliberate scarcity. Apparently NVIDIA has so many RTX 3000 GPUs just sitting around that they decreased the price of the RTX 3090 Ti by like 50%.
No, that's more like nVIDIA betting heavily on Crypto pricing and trying to profit from it.

Crypto-driven GPU crash makes Nvidia miss Q2 projections by $1.4 billion
https://arstechnica.com/gaming/2022/08/crypto-driven-gpu-crash-makes-nvidia-miss-q2-projections-by-1-4-billion/
Cheaper GPUs are good for gamers but bad for Nvidia's bottom line.

A HUGE chunk of nVIDIA's inventory went to Crypto Miners / Scalpers.

Now the chickens have come home to roost, and nVIDIA is warning that it's posting "Massive Losses" since normal gamers refuse to pay Crypto Miner / Scalper prices, and it's affecting nVIDIA's bottom line.

SUCK IT nVIDIA.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Fony
08/15/22 5:48:21 PM
#34:


DarkRoast posted...
Frankly I don't care which brand of GPU I own, but AMD has got to find another way to compete. Either significantly lower prices with similar performance (the ATI way) or significantly better performance at the same price (Ryzen way).


I've been going between both brands for the last 5 years depending on price and availability. I think that AMD's RDNA 3 will be a better purchase than Lovelace, especially if you own a Ryzen platform. They're enabling DirectStorage and a bunch of other things on the desktop. There's talk that the top RDNA 3 part will be the fastest GPU this time around; we'll see. They came out of nowhere with the 6900 XT and traded well with the 3090. One thing they have to improve a lot (which it looks like they did) is ray tracing.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
Fony
08/15/22 5:50:07 PM
#35:


DarkRoast posted...
The price collapse of this generation of GPUs clearly shows that they were running a cartel-like model of deliberate scarcity. Apparently NVIDIA has so many RTX 3000 GPUs just sitting around that they decreased the price of the RTX 3090 Ti by like 50%.

Yup. The only reason the GPU crash didn't hurt AMD as much is because they used a lot of their fab allocation on an increasingly diverse set of products and don't have a deluge of unsold GPUs.

DarkRoast posted...
Honestly, I think the difference between NVIDIA and Intel is that NVIDIA actually does make highly refined and competitive products consistently. Which means AMD's gotta find a way to get ahead of NVIDIA in something, like you mentioned. Otherwise, they're just gonna be trailing NVIDIA at least in the PC space.

They also don't talk to each other about it.
AMD just reacts to nVIDIA and counter prices them.

Yeah they did that with the 6800 XT. But they only made it $50 cheaper which was a real WTF considering the gap in intrinsic feature sets between the 3080 and the 6800 XT.

NVIDIA will progress their product no matter what. Intel was keeping 6-core desktop CPUs for when they got 10 nm working, and only increased core counts due to Ryzen.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:50:13 PM
#36:


Intel Posts $500 Million Loss for the First Time in Decades as Sales Drop 17%
https://www.tomshardware.com/news/intel-posts-dollar500-million-loss-for-the-first-time-in-decades-as-sales-drop-17
CEO blames economic decline and execution issues.

But we all know that it's AMD eating Intel's marketshare bit by bit, attacking every market segment.
ARM is gunning for the server sector as well, so Intel doesn't have it easy.

Intel is also screwing up by not delivering.

-

Meanwhile...

-

In AMD land:

AMD Posts 70% Year-Over-Year Revenue Increase as Sales of EPYC CPUs Skyrocket
https://www.tomshardware.com/news/amd-posts-70-percent-year-over-year-revenue-increase
AMD earned $6.55 billion in Q2; net income dropped to $447 million.

Desktop CPU Sales Lowest in 30 Years, AMD Gains Market Share Anyway (Updated)
https://www.tomshardware.com/news/lowest-cpu-shipments-in-30-years-amd-intel-q2-2022-cpu-market-share
One rises while the rest fall.

https://gamefaqs.gamespot.com/a/user_image/8/8/0/AAAFvAAADkPg.jpg
Here's AMD's marketshare improvement as of late

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Fony
08/15/22 5:52:45 PM
#37:


Intel and NVIDIA can only lose market share to AMD, ARM, etc. Intel especially is getting fucked in the datacenter, their bread and butter. Sapphire Rapids got delayed again BTW.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:54:28 PM
#38:


Fony posted...
Yup. The only reason the GPU crash didn't hurt AMD as much is because they used alot of their fab allocations on increasingly diverse types of products and don't have a deluge of unsold GPU's.
AMD also learned from the last Crypto Crash and didn't bet heavily on the Crypto Mining craze.

Fony posted...
NVIDIA will progress their product no matter what. Intel was keeping 6-core desktop CPUs for when they got 10 nm working.
You mean Intel was "Holding Back" and not pushing as hard as possible; that's why we got a decade of quad-core hell.
https://gamefaqs.gamespot.com/a/user_image/6/4/6/AAAFvAAACRqm.png

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 5:56:58 PM
#39:


Fony posted...
Intel and NVIDIA can only lose market share to AMD, ARM, etc. Intel especially is getting fucked in the datacenter, their bread and butter. Sapphire Rapids got delayed again BTW.
I know, that's on Intel.

Sapphire Rapids' fundamental design is solid, especially if they went all-in on a "pure E-core" version.

They could've had a quad-tile version (224 E-cores, or 56 P-cores).

Granted, each E-core performs like Skylake, but quantity has a quality all its own.

And Skylake performance ain't bad, though it'll only win certain specific workloads, like massively multi-threaded apps such as 3D rendering.

That's why Intel needs to DROP the hybrid BS on desktop / server and go for a dual-lineup CPU strategy.

The entire big.LITTLE strategy was designed to counter AMD on the mobile side and deal with power usage.

It's literally a hindrance on the desktop / server side.

Intel could easily win massively multi-threaded workloads and make Threadripper look like a bad value.
https://gamefaqs.gamespot.com/a/user_image/8/8/2/AAAFvAAADkPi.jpg
https://gamefaqs.gamespot.com/a/user_image/8/8/3/AAAFvAAADkPj.jpg

But Intel literally hindered themselves by going hybrid.
https://gamefaqs.gamespot.com/a/user_image/8/8/4/AAAFvAAADkPk.jpg
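
The tile math, as a quick sanity check (this is a hypothetical "pure E-core" config I'm speculating about, not an announced SKU):

```python
# Speculative Sapphire Rapids tile math: 4 tiles, each tile holding
# either 56 E-cores or 14 P-cores (hypothetical configuration).
tiles = 4
e_cores_per_tile = 56
p_cores_per_tile = 14

print(tiles * e_cores_per_tile)  # 224 E-cores in a quad-tile part
print(tiles * p_cores_per_tile)  # 56 P-cores in a quad-tile part
```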


---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
DarkRoast
08/15/22 5:58:22 PM
#40:


NVIDIA is a much better-run company than Intel. They're not afraid to go hard on price-to-performance, either. The general consensus when they announced the RTX 3080 was that it was priced very competitively relative to the extreme increase in performance over the previous generation. Of course, that was before the GPU shortage, and at that time NVIDIA had legitimate concerns that AMD was about to come in and undercut them heavily on price. So they simply took what was likely their true enthusiast card (the 3080) and lowered the price from $999 to $699 (and dropped the VRAM from 12 GB to 10 GB), then renamed the RTX Titan Ampere series to RTX 3090 at the last minute.

In NVIDIA's own documentation, the 3080 was originally supposed to be the highest-end gamer card and the 3090 was basically the 3080 with a ton of VRAM (a Titan card, basically, for producers).

Intel is the kind of dumb ass company that releases something like the 11900K, charges more than the Ryzen equivalent, then can't even match their own previous generation's CPU in multicore.


---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 6:04:23 PM
#41:


DarkRoast posted...
NVIDIA is a much better-run company than Intel. They're not afraid to go hard on price-to-performance, either. The general consensus when they announced the RTX 3080 was that it was priced very competitively relative to the extreme increase in performance over the previous generation. Of course, that was before the GPU shortage, and at that time NVIDIA had legitimate concerns that AMD was about to come in and undercut them heavily on price. So they simply took what was likely their true enthusiast card (the 3080) and lowered the price from $999 to $699 (and dropped the VRAM from 12 GB to 10 GB), then renamed the RTX Titan Ampere series to RTX 3090 at the last minute.

In NVIDIA's own documentation, the 3080 was originally supposed to be the highest-end gamer card and the 3090 was basically the 3080 with a ton of VRAM (a Titan card, basically, for producers).

Intel is the kind of dumb ass company that releases something like the 11900K, charges more than the Ryzen equivalent, then can't even match their own previous generation's CPU in multicore.

The 11th-gen series was under Bob Swan's command.
Rocket Lake launched on March 30, 2021.

Pat Gelsinger rejoined Intel as their new CEO on February 15, 2021

There wasn't much Pat could do about Rocket Lake; everything for that CPU launch was set in motion by Bob Swan and the team behind it.

Pat was too busy getting settled in as Intel's new CEO.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Fony
08/15/22 6:25:49 PM
#42:


DarkRoast posted...
NVIDIA is a much better-run company than Intel. They're not afraid to go hard on price-to-performance, either. The general consensus when they announced the RTX 3080 was that it was priced very competitively relative to the extreme increase in performance over the previous generation. Of course, that was before the GPU shortage, and at that time NVIDIA had legitimate concerns that AMD was about to come in and undercut them heavily on price. So they simply took what was likely their true enthusiast card (the 3080) and lowered the price from $999 to $699 (and dropped the VRAM from 12 GB to 10 GB), then renamed the RTX Titan Ampere series to RTX 3090 at the last minute.

In NVIDIA's own documentation, the 3080 was originally supposed to be the highest-end gamer card and the 3090 was basically the 3080 with a ton of VRAM (a Titan card, basically, for producers).

Intel is the kind of dumbass company that releases something like the 11900K, charges more than the Ryzen equivalent, and then can't even match their own previous generation's CPU in multicore.

Intel has something like 100K genius employees, but the management is driven entirely by profits. They will never really compete on price, and when they take a hit like last quarter, they just cut spending and mull over layoffs instead. The CEO was the highest-paid CEO of 2021, and they're moving ahead with paying dividends to shareholders even though every segment of Intel's business lost money (big time).

They sold off some segments to go along with the bad financial news and shut down Optane. Intel is having execution problems, and it's probably all on the management.

---
It's not the end of the world, but we can see it from here.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 6:35:13 PM
#43:


Fony posted...
Intel has something like 100K genius employees, but the management is driven entirely by profits. They will never really compete on price, and when they take a hit like last quarter, they just cut spending and mull over layoffs instead. The CEO was the highest-paid CEO of 2021, and they're moving ahead with paying dividends to shareholders even though every segment of Intel's business lost money (big time).

They sold off some segments to go along with the bad financial news and shut down Optane. Intel is having execution problems, and it's probably all on the management.

Intel has 121,100 employees as of 2021.
~15,000 of those employees are software engineers.

They also have a good chunk of non-engineering folks too, sadly.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
DarkRoast
08/15/22 6:39:14 PM
#44:


Apparently Intel has very poor intra-company communication, too. The GPU division's recent media push (in July) was apparently not even approved by Intel's upper management, and came as a complete surprise. And the Optane division was literally releasing new videos about benchmarks on the day they were shut down.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
FL81
08/15/22 6:40:56 PM
#45:


Honestly I just stick with the Intel/Nvidia combo because it has always just worked

I've been burned too many times on faulty AMD GPUs to consider purchasing one

---
Thanks to Proofpyros for the sig images
https://i.imgur.com/Nv4Pi1v.jpg https://i.imgur.com/N43HJYv.jpg
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 6:42:02 PM
#46:


DarkRoast posted...
Apparently Intel has very poor intra-company communication, too. The GPU division's recent media push (in July) was apparently not even approved by Intel's upper management, and came as a complete surprise. And the Optane division was literally releasing new videos about benchmarks on the day they were shut down.
Yup, Intel's Arc division is kinda going "rogue" on the rest of Intel. That's having serious repercussions on everybody, including AIB partners.

Optane was running normally until they got canned publicly.

I feel for the Optane guys; they were working on truly groundbreaking technology.
But because Intel didn't want to share, it was too expensive and got canned.

Luckily Intel shared Thunderbolt with the USB-IF, so it didn't go the way of FireWire.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
DarkRoast
08/15/22 7:56:56 PM
#47:


I think the problem with Optane is that the actual manufacturer of that kind of memory (Micron) decided it wasn't worth it. So Intel literally had no way to acquire more.

---
Lenovo Legion 7 - Ryzen 5900HX, RTX 3080 16 GB (165W), 32 GB DDR4-3200
By Grabthar's hammer............... What a laptop.
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 8:05:25 PM
#48:


DarkRoast posted...
I think the problem with Optane is that the actual manufacturer of that kind of memory (Micron) decided it wasn't worth it. So Intel literally had no way to acquire more.
Intel didn't share the technology to manufacture Optane / QuantX / 3D phase-change memory with the DRAM/NAND flash memory manufacturers, ergo no economies of scale.

Intel also didn't drive adoption by making it the mandatory "OS Drive" across all Intel product lines.

That's where Optane's ultra-low latency shines: "OS Drive" usage.

https://gamefaqs.gamespot.com/a/user_image/0/0/2/AAAFvAAADkRa.jpg
When Optane is used as an SSD, it literally had latency in the ~10 µs range, while normal NVMe SSDs were barely breaking 100 µs on a good day.

Read latency for NAND-based SSDs is getting worse with increased cell density.
https://gamefaqs.gamespot.com/a/user_image/0/0/3/AAAFvAAADkRb.jpg
Packing all that data into one cell carries a latency penalty, and we're heading towards PLC (penta-level cells) and even denser setups as time moves on.

If you used Optane as a RAM stick / DIMM, its latency is almost identical to HBM / GDDR6 latencies, in the ~400-500 ns range.

Optane still isn't as low-latency as regular DRAM (that's why system DRAM is the way it is), but you get A LOT more storage capacity, and it could be used as an OS Drive if the correct drivers were written for it.

Imagine if you had an OS Drive that was a literal RAM drive of sorts, with latency only as bad as GDDR6 / HBM RAM.

That would've been a wondrous future for those of us who want a faster-responding OS Drive / temp drive.
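To put those latency tiers side by side, here's a quick back-of-the-envelope sketch of the random-access rate each one implies at queue depth 1 (one outstanding request at a time). The figures are just the ballpark numbers from this post, and the DRAM number is my own rough assumption, not a measurement:

```python
# Rough QD1 ops/sec implied by each access-latency tier.
# All latency figures are ballpark numbers, not benchmarks.
latencies_ns = {
    "DRAM (system memory)": 100,    # assumed typical ~100 ns
    "Optane as DIMM": 450,          # post's ~400-500 ns range
    "Optane as SSD": 10_000,        # ~10 us
    "NAND NVMe SSD": 100_000,       # ~100 us on a good day
}

for name, ns in latencies_ns.items():
    qd1_ops = 1e9 / ns  # one request in flight at a time
    print(f"{name:22s} {ns:>8,} ns -> ~{qd1_ops:,.0f} ops/sec at QD1")
```

The takeaway is the same as the charts: at QD1 the ~10x latency gap between Optane and NAND translates directly into a ~10x throughput gap, which is exactly where ordinary desktop workloads live.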

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 8:25:17 PM
#49:


The main thing Optane offered was consistency of performance, no matter how badly degraded your storage drive was.

https://gamefaqs.gamespot.com/a/user_image/0/2/2/AAAFvAAADkRu.png
And phenomenal random performance at the low queue depths where regular end users actually operate.
https://gamefaqs.gamespot.com/a/user_image/0/2/3/AAAFvAAADkRv.png

https://gamefaqs.gamespot.com/a/user_image/0/2/4/AAAFvAAADkRw.png

https://gamefaqs.gamespot.com/a/user_image/0/2/5/AAAFvAAADkRx.png
https://gamefaqs.gamespot.com/a/user_image/0/2/6/AAAFvAAADkRy.png

Offering consistently fast performance no matter the state of the storage drive is hard to market, ergo Optane's marketing team's failure to appeal to the masses.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!
Kamen_Rider_Blade
08/15/22 8:28:22 PM
#50:


And sustained writes that are "like a rock":

https://gamefaqs.gamespot.com/a/user_image/0/3/0/AAAFvAAADkR2.jpg

https://gamefaqs.gamespot.com/a/user_image/0/3/1/AAAFvAAADkR3.jpg
While all the other drives degrade in performance once you get past the initial buffer, Optane keeps going & going & going.

https://gamefaqs.gamespot.com/a/user_image/0/3/2/AAAFvAAADkR4.jpg

https://gamefaqs.gamespot.com/a/user_image/0/3/3/AAAFvAAADkR5.jpg

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
... Copied to Clipboard!