Current Events > Apparently the RTX 3000 series is going to be ridiculously OP

DarkRoast
05/17/20 7:03:07 PM
#1:


https://www.tomshardware.com/news/nvidia-rtx-3080-ampere-all-we-know

https://www.tweaktown.com/news/72225/nvidias-mid-range-geforce-rtx-3060-could-beat-flagship-2080-ti/amp.html

Leaked benchmarks of the entry-level 3060 (priced around $300ish) put it about on par with the 2080 Ti (a $1,300 card).

If previous generations are any indication, the 3080 Ti and the Titan would be double that.

That means the entry-level card would handle basically any game up to 2021 at 4K well above 60fps.

Noice.


---
Well allons-y, Alonso!
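As a quick back-of-the-envelope on the numbers above (leak math, not benchmarks), here is a minimal sketch of the perf-per-dollar jump the rumor would imply, assuming the "on par with a 2080 Ti" claim holds:

```python
# Back-of-the-envelope using only the claimed numbers above (rumors, not benchmarks).
# The leak's claim: the ~$300 RTX 3060 lands roughly on par with the $1,300 RTX 2080 Ti.
price_2080ti, price_3060 = 1300, 300
rel_perf = 1.0  # "about on par": same performance, by assumption

perf_per_dollar_gain = (rel_perf / price_3060) / (rel_perf / price_2080ti)
print(f"Implied perf-per-dollar jump: {perf_per_dollar_gain:.1f}x")  # ~4.3x

# And if "the 3080 Ti and the Titan would be double that" also holds, the top
# card would be ~2x a 2080 Ti at whatever price it launches at.
```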
pogo_rabid
05/17/20 7:03:56 PM
#2:


So glad I held off on upgrading my 1070ti to a 2070s

---
Ryzen 3800x @ 4.3, 32 gig @ 4000, 1070ti, 970pro, Asus Strix x570-E
FC: SW-8431-3263-1243
Master_Bass
05/17/20 7:04:17 PM
#3:


Awesome. I could use a replacement for my two GTX 1080s. It would be nice to just get one card to replace them.

---
Many Bothans died to bring you this post.
ToadallyAwesome
05/17/20 7:05:23 PM
#4:


pogo_rabid posted...
So glad I held off on upgrading my 1070ti to a 2070s

I was totally thinking of doing a new build this year but was wanting to hear what the upgrade on GPUs was going to be.

Waiting seems to be a smart move now.

DarkRoast
05/17/20 7:06:13 PM
#5:


ToadallyAwesome posted...
I was totally thinking of doing a new build this year but was wanting to hear what the upgrade on GPUs was going to be.

Waiting seems to be a smart move now.

I absolutely would wait until these GPUs come out; sounds like it'll be September or so

---
Well allons-y, Alonso!
DarkRoast
05/17/20 7:08:35 PM
#6:


Also, they dramatically increased the number of ray tracing cores. If Minecraft RTX is any indication, ray tracing is going to be amazing next generation.

---
Well allons-y, Alonso!
DarkRoast
05/17/20 7:11:05 PM
#7:


Anyone remember in the late 2000s when ATI released the Radeon 4850, a card with the performance of a $700 Nvidia card for $200?

Now that was a revolution

---
Well allons-y, Alonso!
pogo_rabid
05/17/20 7:13:03 PM
#8:


DarkRoast posted...
Anyone remember in the late 2000s when ATI released the Radeon 4850, a card with the performance of a $700 Nvidia card for $200?

Now that was a revolution
AMD has done that a bunch of times; their legendary 9800 from the mid '00s comes to mind.

Well... it was ATI at the time.

---
Ryzen 3800x @ 4.3, 32 gig @ 4000, 1070ti, 970pro, Asus Strix x570-E
FC: SW-8431-3263-1243
cmiller4642
05/17/20 7:16:12 PM
#9:


I love my 5700 XT

I'll probably get a new card in 3 or 4 years
Tropicalwood
05/17/20 7:17:08 PM
#10:


But does it have HBM2

---
ayy lmao ayy lmao || oaml oaml yya yya
ayy lmao ayy lmao || oaml oaml yya yya
Darmik
05/17/20 7:34:07 PM
#11:


Hopefully that rumor pays off. Gonna be a lot of people upgrading to this generation I think. Hopefully they're not overpriced.

---
Kind Regards,
Darmik
DarkRoast
05/17/20 7:45:11 PM
#12:


Darmik posted...
Hopefully that rumor pays off. Gonna be a lot of people upgrading to this generation I think. Hopefully they're not overpriced.

Nvidia is so far ahead of AMD currently. It's hard to imagine they won't price gouge.

---
Well allons-y, Alonso!
pogo_rabid
05/17/20 7:46:45 PM
#13:


Darmik posted...
Hopefully that rumor pays off. Gonna be a lot of people upgrading to this generation I think. Hopefully they're not overpriced.
Jensen/Nvidia aren't in either of the new consoles, so they have a motivation to compete with them, since those are AMD machines. Really hoping this is them going all out to show off, at the expense of the next generation or two.

They don't wanna pull an intel, after all.

---
Ryzen 3800x @ 4.3, 32 gig @ 4000, 1070ti, 970pro, Asus Strix x570-E
FC: SW-8431-3263-1243
KamenRiderBlade
05/17/20 7:52:07 PM
#14:


I REALLY don't believe those rumors.

Will Nvidia have a card 2x faster than the old RTX 2080 Ti?

You betcha.

Will it cost an ARM & a LEG? HELL YEAH!!!

You thought the $2,500 Nvidia Titan RTX was high? That's going to be the new epitome of gaming.

An RTX 2080 Ti with only 68 SMs cost $999 MSRP.

Expect that to be the new normal for top-tier performance: $1,000 for a top-of-the-line Ti.

Nvidia won't be giving you that much more performance for that little money.

At best, it just shifts one tier down from last gen.

Look carefully at Nvidia's history: performance has scaled with each generation change, and price has only gone up.

With Ampere, a full-out card can now have 128 SMs.

That means the old-school SLI-style cards can come back out to play, without any of the downsides, because they're one card.

I'd more realistically believe that Nvidia is going to create more tiers and charge more.

This is my speculation for Nvidia MAXIMIZING their profits from you!!!

So, continuing to speculate on their tiered lineup:

NOTE: SM = Streaming Multiprocessor

HBM2 chips will go as low as the RTX 3070; below that, it's pure traditional GDDR6 on the PCB (no GDDR version below 6).

128 SM's | 96 GB RAM = $2500 = RTX Titan A ____ | _72 SM's = $2500 = Nvidia Titan RTX
126 SM's | 72 GB RAM = $1450 = RTX 3090 Ultra _ |
124 SM's | 68 GB RAM = $1400 = RTX 3090 Ti ____ |
122 SM's | 64 GB RAM = $1350 = RTX 3090 Super _ |
120 SM's | 60 GB RAM = $1300 = RTX 3090 _______ |
118 SM's | 56 GB RAM = $1250 = RTX 3080 Ultra _ |
114 SM's | 52 GB RAM = $1200 = RTX 3080 Ti ____ |
110 SM's | 48 GB RAM = $1150 = RTX 3080 Super _ |
106 SM's | 44 GB RAM = $1100 = RTX 3080 _______ |
102 SM's | 42 GB RAM = $1050 = RTX 3070 Ultra _ |
_98 SM's | 38 GB RAM = $1000 = RTX 3070 Ti ____ | _68 SM's = $_999 = RTX 2080 Ti
_94 SM's | 34 GB RAM = $_967 = RTX 3070 Super _ |
_90 SM's | 30 GB RAM = $_933 = RTX 3070 _______ | Lowest Tier GPU with HBM2 memory
_86 SM's | 29 GB RAM = $_900 = RTX 3060 Ultra _ |
_82 SM's | 28 GB RAM = $_867 = RTX 3060 Ti ____ |
_78 SM's | 27 GB RAM = $_833 = RTX 3060 Super _ |
_74 SM's | 26 GB RAM = $_800 = RTX 3060 _______ |
_70 SM's | 25 GB RAM = $_767 = RTX 3050 Ultra _ | _72 SM's = $2500 = Nvidia Titan RTX
_67 SM's | 24 GB RAM = $_733 = RTX 3050 Ti ____ | _68 SM's = $_999 = RTX 2080 Ti
_64 SM's | 23 GB RAM = $_700 = RTX 3050 Super _ | _48 SM's = $_700 = RTX 2080 Super
_61 SM's | 22 GB RAM = $_667 = RTX 3050 _______ | _46 SM's = $_650 = RTX 2080
_58 SM's | 21 GB RAM = $_633 = RTX 3040 Ultra _ |
_55 SM's | 20 GB RAM = $_600 = RTX 3040 Ti ____ |
_52 SM's | 19 GB RAM = $_567 = RTX 3040 Super _ |
_49 SM's | 18 GB RAM = $_533 = RTX 3040 _______ | _48 SM's = $_700 = RTX 2080 Super
_46 SM's | 17 GB RAM = $_500 = RTX 3030 Ultra _ | _46 SM's = $_650 = RTX 2080
_43 SM's | 16 GB RAM = $_467 = RTX 3030 Ti ____ |
_40 SM's | 15 GB RAM = $_433 = RTX 3030 Super _ | _40 SM's = $_500 = RTX 2070 Super
_37 SM's | 14 GB RAM = $_400 = RTX 3030 _______ | _36 SM's = $_400 = RTX 2070
_34 SM's | 13 GB RAM = $_367 = RTX 3020 Ultra _ | _34 SM's = $_400 = RTX 2060 Super
_31 SM's | 12 GB RAM = $_333 = RTX 3020 Ti ____ | _30 SM's = $_350 = RTX 2060
_28 SM's | 11 GB RAM = $_300 = RTX 3020 Super _ |
_25 SM's | 10 GB RAM = $_267 = RTX 3020 _______ | _24 SM's = $_280 = GTX 1660 Ti
_22 SM's | _9 GB RAM = $_233 = RTX 3010 Ultra _ | _22 SM's = $_230 = GTX 1660 Super
_19 SM's | _8 GB RAM = $_200 = RTX 3010 Ti ____ | _22 SM's = $_220 = GTX 1660
_16 SM's | _7 GB RAM = $_175 = RTX 3010 Super _ | _20 SM's = $_160 = GTX 1650 Super
_13 SM's | _6 GB RAM = $_150 = RTX 3010 _______ | _14 SM's = $_150 = GTX 1650
_10 SM's | _5 GB RAM = $_125 = RTX 3000 Ultra _ |
__8 SM's | _4 GB RAM = $_100 = RTX 3000 Ti ____ | __6 SM's = $_140 = GTX 1050 Ti
__6 SM's | _3 GB RAM = $__75 = RTX 3000 Super _ | __5 SM's = $_110 = GTX 1050
__4 SM's | _2 GB RAM = $__50 = RTX 3000 _______ | __3 SM's = $__80 = GTX 1030

Budget graphics card tiers should stick with 2x Mini DisplayPort on both outer ends and single/double-slot cooling with blower coolers.
There should be gaps in the center for airflow, along with a plastic shroud to route air out the back.
Stick with blowers with vapor chambers & heat pipes until you get to the 3020, because of the cost of mass production.
The highest tiers of GPU get 6x forms of display output, all on one rear add-in card slot.

Nvidia has grabbed all their customers and fan base by the crotch, and they're going to milk you $$$-wise for that performance.

Doesn't matter if AMD has better price-to-performance; your DieHard Nvidia loyalty will not make you waver, and you will gladly fork over the money to them for the performance you want or can afford.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
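Incidentally, the speculated ladder above prices performance almost linearly. A minimal sketch over a few of its rows makes that visible; every SKU, SM count, and price here is the poster's speculation, not a real product:

```python
# Cost per SM across a few rows of the speculated lineup above.
# All numbers are the thread's hypothetical tiers, not announced SKUs.
speculated = [
    ("RTX Titan A", 128, 2500),
    ("RTX 3090",    120, 1300),
    ("RTX 3080",    106, 1100),
    ("RTX 3070",     90,  933),
    ("RTX 3060",     74,  800),
    ("RTX 3030",     37,  400),
    ("RTX 3020",     25,  267),
    ("RTX 3000",      4,   50),
]

for name, sms, price in speculated:
    print(f"{name}: {sms} SMs @ ${price} -> ${price / sms:.2f}/SM")

# Outside the Titan (~$19.5/SM), everything hovers around $10-13/SM,
# i.e. the table is pricing performance at a nearly flat rate per SM.
```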
KamenRiderBlade
05/17/20 7:55:32 PM
#15:


pogo_rabid posted...
They don't wanna pull an intel, after all.
The only aspect of Intel they want to copy is the FAT profit margins and a really big, wide product stack to maximize profits. And they will do it.

They're not so generous as to give you that much performance for so little money.

AIB partners want more margins, and Nvidia wants their cut. Prices are going to go up overall, but you will get some price-to-performance boost relative to the price you paid last gen.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
KamenRiderBlade
05/17/20 7:57:08 PM
#16:


DarkRoast posted...
Nvidia is so far ahead of AMD currently. It's hard to imagine they won't price gouge.
They will, you can bet on that!

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
KamenRiderBlade
05/17/20 7:58:17 PM
#17:


Tropicalwood posted...
But does it have HBM2
Only for top-tier cards, if you're willing to pay the $$$ for it.

Otherwise you're going to the plebeian category, with normal GDDR6 mounted on the traditional PCB.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
KamenRiderBlade
05/17/20 7:59:18 PM
#18:


DarkRoast posted...
I absolutely would wait until these GPUs come out; sounds like it'll be September or so
I'd bet on November; Jensen Huang would want to spoil the Sony/MS console launches with his GPU launch.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
Villain
05/17/20 8:00:41 PM
#19:


Glad I opted for a 1660 Super instead of getting a 2060 Super, then.

I can hold off a couple of years for one of these.

---
https://imgur.com/ZWNgMXL
Formerly known as Will VIIII
billcom6
05/17/20 8:01:13 PM
#20:


when they droppin

---
//constant loneliness// --- Steam and Fortnite: billcom6
My Teams: The Ohio State Buckeyes, New York Yankees, Buffalo Bills, The CBJ, Cavs
DarkRoast
05/17/20 8:04:24 PM
#21:


I don't really understand what AMD's been up to lately, at least on the GPU end of things. They've been wrecking with Ryzen Threadripper on the CPU end, but their best GPU, literally top-of-the-line, is about on par with the RTX 2060 or, somewhat comically, the thin-and-light laptop RTX 2080 Max-Q. If your flagship desktop GPU is on par with a GPU literally designed for thin laptops, you're doing it wrong.

---
Well allons-y, Alonso!
KamenRiderBlade
05/17/20 8:15:12 PM
#22:


DarkRoast posted...
I don't really understand what AMD's been up to lately, at least on the GPU end of things. They've been wrecking with Ryzen Threadripper on the CPU end, but their best GPU, literally top-of-the-line, is about on par with the RTX 2060 or, somewhat comically, the thin-and-light laptop RTX 2080 Max-Q. If your flagship desktop GPU is on par with a GPU literally designed for thin laptops, you're doing it wrong.

The 5700 XT is only good enough to hang with the 2070 class.

https://pcper.com/2019/09/sapphire-nitro-radeon-rx-5700-xt-review/

You have to understand that the number of users who buy 2070-and-up-class GPUs is a VERY tiny portion of the GPU market. That's like 1% of the total GPU market.

Those halo products that you splurge on? You're LITERALLY in the 0.1% of the GPU world.

Most PC gamers can't afford "The Best" like you can. They're just poor and can only get 2060 class and below.

Ryzen's performance dominance didn't come overnight.

It took them 3 Generations to get where they are now.

Next Generation is where they're expected to lead in all metrics.

Nvidia has been constantly improving, unlike Intel.

And Nvidia has held the lead for quite some time.

The Navi series was quite the performance jump compared to the previous gen, but still not enough.

Give AMD time, and it will catch up.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
pogo_rabid
05/17/20 8:21:11 PM
#23:


KamenRiderBlade posted...
Next Generation is where they're expected to lead in all metrics.
So happy my x570 board will work with this year's chips.

---
Ryzen 3800x @ 4.3, 32 gig @ 4000, 1070ti, 970pro, Asus Strix x570-E
FC: SW-8431-3263-1243
DarkRoast
05/17/20 8:24:24 PM
#24:


AMD's solution for ray tracing, imo, is not going to cut it for consoles and especially PCs.

They're basically working with DirectX / Vulkan for software solutions.

Nvidia got a lot of hell for RTX 2000 marketing, but the bottom line is that using separate cores for raytracing actually works. It's why even the worst RTX card (the laptop RTX 2060) absolutely spanks the GTX 1080 in the few games that let GTX cards attempt raytracing.


---
Well allons-y, Alonso!
KamenRiderBlade
05/17/20 8:48:11 PM
#25:


DarkRoast posted...
AMD's solution for ray tracing, imo, is not going to cut it for consoles and especially PCs.

They're basically working with DirectX / Vulkan for software solutions.

Nvidia got a lot of hell for RTX 2000 marketing, but the bottom line is that using separate cores for raytracing actually works. It's why even the worst RTX card (the laptop RTX 2060) absolutely spanks the GTX 1080 in the few games that let GTX cards attempt raytracing.

AMD's hybrid ray tracing is just that: it uses the right hardware/software parts to perform ray tracing.

https://www.tomshardware.com/news/amd-patents-hybrid-ray-tracing-solution,39761.html

Don't knock it until you've tested it and gotten hard results yourself.

Nvidia went the brute-force method, and it works.

But they needed DLSS just to make RT workable at a higher resolution; otherwise the performance hit would still be too high.

Without DLSS scaling up the image, RT would be too taxing at the native resolution.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
KamenRiderBlade
05/17/20 8:53:51 PM
#26:


https://www.youtube.com/watch?v=qC5KtatMcUw

https://www.youtube.com/watch?v=IIdn6yNdHMY

https://www.youtube.com/watch?v=iIDzZJpDlpA

With UE5 and its solutions for global illumination and tiny triangles, the need for ray tracing will lessen.

Ray tracing could be used for only what's needed, so its importance will be lessened.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
Darmik
05/17/20 8:58:59 PM
#27:


Whatever happened with PhysX? I remember that being a big hyped thing that just sort of stopped being talked about. I guess it's still around in some form.

My 1070 chugged when it was turned on for Assassin's Creed 4, which was weird.

KamenRiderBlade posted...
https://www.youtube.com/watch?v=qC5KtatMcUw

https://www.youtube.com/watch?v=IIdn6yNdHMY

https://www.youtube.com/watch?v=iIDzZJpDlpA

With UE5 and its solutions for global illumination and tiny triangles, the need for ray tracing will lessen.

Ray tracing could be used for only what's needed, so its importance will be lessened.

This won't be as universal as standard ray tracing though, right? As far as I know, ray tracing can be applied to basically any game, while this will be exclusive to UE5 games.

---
Kind Regards,
Darmik
DarthAragorn
05/17/20 9:01:51 PM
#28:


I would bet September will be the launch period for the 3080 and 3070.

Cyberpunk is out in September, and PC gamers are gonna want to upgrade to play it with raytracing.
DarkRoast
05/17/20 9:02:52 PM
#29:


KamenRiderBlade posted...
AMD's hybrid ray tracing is just that: it uses the right hardware/software parts to perform ray tracing.

https://www.tomshardware.com/news/amd-patents-hybrid-ray-tracing-solution,39761.html

Don't knock it until you've tested it and gotten hard results yourself.

Nvidia went the brute-force method, and it works.

But they needed DLSS just to make RT workable at a higher resolution; otherwise the performance hit would still be too high.

Without DLSS scaling up the image, RT would be too taxing at the native resolution.

DLSS 2.0 is actually pretty impressive. The first gen stuff was Blur-O-Vision.

---
Well allons-y, Alonso!
#30
Post #30 was unavailable or deleted.
KamenRiderBlade
05/17/20 9:05:34 PM
#31:


Darmik posted...
This won't be as universal as standard ray tracing though, right? As far as I know, ray tracing can be applied to basically any game, while this will be exclusive to UE5 games.
But the concept of how they did it will be copied by the other engines.

They already explained the concept publicly; expect all the other engines to match it soon enough.

Implementation will vary, obviously, but the end result should be similar.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
Darmik
05/17/20 9:05:45 PM
#32:


Yeah DLSS 2.0 is the thing that's really tipping me towards an Nvidia PC for next gen. That's gonna give these cards a long life.

---
Kind Regards,
Darmik
DarthAragorn
05/17/20 9:07:13 PM
#33:


Future_Trunks posted...
no way a $300 card next year will out perform a current $1000 card

In all scenarios? Doubtful, but I can absolutely see the 3060 having better raytracing performance than the 2080 Ti.
PIITB415
05/17/20 9:07:14 PM
#34:


Darmik posted...
Yeah DLSS 2.0 is the thing that's really tipping me towards an Nvidia PC for next gen. That's gonna give these cards a long life.

That's basically where I'm at. I'm going to sell my 2080ti once I get my hands on the 3080ti.

KamenRiderBlade
05/17/20 9:08:06 PM
#35:


DarkRoast posted...
DLSS 2.0 is actually pretty impressive. The first gen stuff was Blur-O-Vision.
But why use DLSS or any up-sampling, compared to rendering the image natively and getting the actual image the way it should've been rendered?

DLSS, like any up-sampler or down-sampler, is just another form of scaling to hide a lack of performance.

The most accurate image quality will always be at your display's native / original resolution. Anything else is a crutch to hide performance issues, or some trick for anti-aliasing effects.

The lowest amount of latency will always come from rendering at your native resolution, without having to up-sample or down-sample the image from a non-native resolution.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
DarkRoast
05/17/20 9:08:10 PM
#36:


Darmik posted...
Yeah DLSS 2.0 is the thing that's really tipping me towards an Nvidia PC for next gen. That's gonna give these cards a long life.

If it gets any better, somewhat ironically it'll start to raise the question of why anyone would get a top-end card.

---
Well allons-y, Alonso!
KamenRiderBlade
05/17/20 9:13:25 PM
#37:


DarkRoast posted...
If it gets any better, somewhat ironically it'll start to raise the question of why anyone would get a top-end card.
There is a performance penalty, in terms of latency, in using any form of up/down sampling.

So it's far from a perfect solution.

It's a crutch, and it will only boost you so far.

Nothing beats raw horsepower and efficiency for getting the most accurate graphical results.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
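To put rough numbers on that tradeoff, here is a toy frame-time model; the 25 ms frame time and the 1.5 ms upscale cost are assumptions for illustration only, not measurements of DLSS:

```python
# Toy model of render-low-then-upscale. Shading cost is assumed to scale with
# pixel count; the upscale pass is assumed to cost a fixed ~1.5 ms. Both
# numbers are made up for illustration.
def pixels(w, h):
    return w * h

native_4k_ms = 25.0                              # assumed shading-bound frame at 3840x2160
scale = pixels(2560, 1440) / pixels(3840, 2160)  # ~0.44: render internally at 1440p
dlss_overhead_ms = 1.5                           # assumed fixed upscale cost

upscaled_ms = native_4k_ms * scale + dlss_overhead_ms
print(f"native 4K: {native_4k_ms:.1f} ms ({1000 / native_4k_ms:.0f} fps)")
print(f"1440p + upscale: {upscaled_ms:.1f} ms ({1000 / upscaled_ms:.0f} fps)")
# With these assumptions: 25.0 ms (40 fps) -> ~12.6 ms (~79 fps).
# At an already-fast 8 ms frame, the same math gives 8*0.44 + 1.5 = ~5.1 ms:
# the fixed overhead eats a growing share, which is the "only boosts you so far" point.
```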
MutantJohn
05/17/20 9:15:15 PM
#38:


Never buy AMD. Dude, they don't even run CUDA!

---
"Oh, my mother; oh, my friends, ask the angels, will I ever see heaven again?" - Laura Marling
KamenRiderBlade
05/17/20 9:17:48 PM
#39:


MutantJohn posted...
Never buy AMD. Dude, they don't even run CUDA!
CUDA is Nvidia's proprietary language.

Never run or code in CUDA when you could use OpenCL or DirectCompute.

Why on earth would you want to use a proprietary way of doing something over an open-source method?

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
Darmik
05/17/20 9:18:31 PM
#40:


KamenRiderBlade posted...
But why use DLSS or any up-sampling, compared to rendering the image natively and getting the actual image the way it should've been rendered?

DLSS, like any up-sampler or down-sampler, is just another form of scaling to hide a lack of performance.

The most accurate image quality will always be at your display's native / original resolution. Anything else is a crutch to hide performance issues, or some trick for anti-aliasing effects.

The lowest amount of latency will always come from rendering at your native resolution, without having to up-sample or down-sample the image from a non-native resolution.

Because the difference to me looks negligible.

https://imgsli.com/MTUwMDg/0/2

Obviously native is ideal, but resolution is such a significant resource hog that this is a great solution to have without significantly sacrificing IQ or turning down other settings you might want.

I often stream games to my Nvidia Shield or laptop, so I think I'll be fine with input lag, which I doubt will be worse than that.

---
Kind Regards,
Darmik
KamenRiderBlade
05/17/20 9:20:19 PM
#41:


Darmik posted...
Because the difference to me looks negligible.

https://imgsli.com/MTUwMDg/0/2

Obviously native is ideal, but resolution is such a significant resource hog that this is a great solution to have without significantly sacrificing IQ or turning down other settings you might want.
It looks like somebody rubbed Baby Oil over the entire screen.

Not exactly the effect that I want on my Image Quality.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
MutantJohn
05/17/20 9:21:50 PM
#42:


KamenRiderBlade posted...
CUDA is Nvidia's proprietary language.

Never run or code in CUDA when you could use OpenCL or DirectCompute.

Why on earth would you want to use a proprietary way of doing something over an open-source method?
Ha, you must've never seriously compared CUDA to OpenCL.

CUDA basically shits on OpenCL any day of the week in terms of developer experience.

---
"Oh, my mother; oh, my friends, ask the angels, will I ever see heaven again?" - Laura Marling
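For anyone curious what that developer-experience gap looks like in practice, here is a rough side-by-side using the Python bindings (PyOpenCL vs CuPy) rather than the raw C APIs the posters are presumably arguing about; treat it as a sketch of the flavor, not a benchmark:

```python
# Squaring an array both ways.
# OpenCL via PyOpenCL: you manage the context, queue, buffers, and kernel source.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
a = np.arange(16, dtype=np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)
prg = cl.Program(ctx, """
__kernel void square(__global const float *a, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] * a[gid];
}
""").build()
prg.square(queue, a.shape, None, a_buf, out_buf)
result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)

# The CUDA-ecosystem equivalent via CuPy: the kernel is generated and launched
# for you behind a NumPy-style API (requires an Nvidia GPU).
import cupy as cp
out = cp.arange(16, dtype=cp.float32) ** 2
```

Same operation in both halves; the boilerplate difference is most of what the "developer experience" argument comes down to.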
KamenRiderBlade
05/17/20 9:22:57 PM
#43:


MutantJohn posted...
Ha, you must've never seriously compared CUDA to OpenCL.

CUDA basically shits on OpenCL any day of the week in terms of developer experience.
I prefer avoiding proprietary APIs if possible.

I'm pro open source if there is an option.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
MutantJohn
05/17/20 9:24:25 PM
#44:


KamenRiderBlade posted...
I prefer avoiding proprietary APIs if possible.

I'm pro open source if there is an option.
I get that. But OpenCL is hot garbage in comparison to CUDA.

CUDA is basically exactly what you'd want. I blame AMD for not implementing their own CUDA API or working with Nvidia to create a unified, AOT-compiled API

---
"Oh, my mother; oh, my friends, ask the angels, will I ever see heaven again?" - Laura Marling
KamenRiderBlade
05/17/20 9:26:05 PM
#45:


MutantJohn posted...
I get that. But OpenCL is hot garbage in comparison to CUDA.

CUDA is basically exactly what you'd want. I blame AMD for not implementing their own CUDA API or working with Nvidia to create a unified, AOT-compiled API
AMD works with OpenCL / DirectCompute.

One's open source, the other is a public API that is open to all devs.

Neither is a proprietary API locked in to Nvidia's architecture.

So, no thanks. I'll never code for CUDA willingly.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
Darmik
05/17/20 9:28:05 PM
#46:


KamenRiderBlade posted...
It looks like somebody rubbed Baby Oil over the entire screen.

Not exactly the effect that I want on my Image Quality.

Keep in mind the source for that one is 540p. 540p on a Switch screen looks like a blurry mess. This looks pretty normal.

---
Kind Regards,
Darmik
KamenRiderBlade
05/17/20 9:32:20 PM
#47:


Darmik posted...
Keep in mind the source for that one is 540p. 540p on a Switch screen looks like a blurry mess. This looks pretty normal.
But what you linked me was Control 2880 DLSS 576 vs Control 1440p.

If you swap to Control 1440 DLSS 576, it's closer, but you can still tell the difference; it's just less apparent.

But I'd rather stick with native 1440p for the image quality.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'
Darmik
05/17/20 9:35:05 PM
#48:


My bad yes it's 576p.

---
Kind Regards,
Darmik
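For scale on the Control comparison being discussed, a quick pixel-budget sketch, assuming 576p means the standard 16:9 1024x576 and 1440p means 2560x1440:

```python
# Pixel budgets for the Control example above: 576p internal render vs 1440p native.
def pixels(w, h):
    return w * h

native_1440p = pixels(2560, 1440)   # 3,686,400 px
internal_576p = pixels(1024, 576)   #   589,824 px
print(f"1440p shades {native_1440p / internal_576p:.2f}x the pixels of 576p")
# ~6.25x fewer pixels shaded when rendering at 576p internally, which is both
# why the performance win is huge and why some softness ("baby oil") can
# survive the reconstruction.
```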
KamenRiderBlade
05/17/20 9:38:23 PM
#49:


Darmik posted...
My bad yes it's 576p.
On the Nintendo Switch, I can understand the need for DLSS.

But on my gaming rig? HELL NO.

Image quality shall not suffer one pixel if I can avoid it.

I'll settle for good-enough frame rates at ~120 fps.

---
Are you a MexiCAN or a MexiCAN'T - Johnny Depp 'Once Upon A Time in Mexico'