250°

Nvidia Finally Officially Speaks About AMD's Mantle - Will Not Support It, No Real Benefit Using It

Nvidia's Distinguished Engineer Tom Petersen and Senior Director of Engineering Rev Lebaredian shared their thoughts about it, confirming that Nvidia is not part of it, will not support it, and that there are - at least to them - no big benefits from using it.

Read Full Story >>
dsogaming.com
NYC_Gamer3908d ago (Edited 3908d ago )

Nvidia are on board with DX12 and won't support Mantle because it's from a rival [AMD].

Magicite3908d ago

Another exclusive for AMD.

ATi_Elite3908d ago

Nvidia downplaying something AMD does... OMG, you almost never ever see that. Lol

Mantle only benefits low-end cards, so Nvidia is correct here, since their driver updates give the same performance boost but across ALL GPU tiers.

bumsick3908d ago

Fools. It seems they are content to segregate rather than unify gaming on PC.
Surely having all cards support all APIs would lead to better optimisation for everyone in the long run.

Letros3908d ago

Mantle is segregation, DX12 is unification.

Lon3wolf3908d ago

How, when Mantle is being offered to NVidia for free so there is parity? NVidia has kept PhysX locked down as a segregation practice ever since they purchased it years ago.

Roccetarius3908d ago

Obviously people won't understand the bigger picture, but DX12 is definitely, as you put it, unification.

No need to involve more unnecessary developer work.

ThatOneGuyThere3908d ago

nVidia is a joke. My laptop has an nVidia card in it, but that's the last time I get anything from them. Ever try using 3D with their cards? "Oh, we're sorry, you have to have a specific card/monitor/driver setup that's nVidia nVision certified. Yeah, we know your card plays the game at 120+ fps, and you have a 3D monitor, but you see, we didn't write our name on the box, so you can't use it." F nVidia. Never again. They can take their BS and put it where they get their policies.

ThatOneGuyThere3908d ago

Oh yeah, not to mention every card from them is more than likely some re-named POS from last year.

OculusRift3907d ago

Same here. I NEVER buy next-gen parts... I wait for the kids and rich folks to drop-kick their stuff to get a 3-12% bump in performance ("OMFG 60 MORE CUDA CORZ/STREAM PROCESSORZ... 1 MOAR DISPLAY PORT") and I pick it up for hundreds less.

JackOfAllBlades3908d ago

I find Nvidia usually overpriced as hell, and that's why I use AMD - the same reason I don't use Apple products.

OculusRift3907d ago

Kind of the same reason I don't use Intel/nVidia anymore. Overpriced to hell while AMD offers the same thing for hundreds less. If I hadn't gone full AMD this build, it would have easily cost me $300-400 more to get the same performance.

bigboirock3908d ago

My Alienware 18 with dual 880Ms has no problems with it. I put in a 3D screen and it runs everything smoothly.

tee_bag2423907d ago (Edited 3907d ago )

Except the 880Ms are 8GB rebadged 4GB 780Ms.
Which isn't a problem in itself, but there are multiple threads at NBR about Nvidia deliberately slipping a performance-throttling flag into the 880Ms.
Something is seriously wrong when a 780M performs far better than a newer 880M.
We're actually addressing the issue with an Nvidia rep right now. You should chime in.
Nice machine, by the way. I have an R2.

230°

Nintendo Switch 2 Leveled Up With NVIDIA AI-Powered DLSS and 4K Gaming

Nvidia writes:

The Nintendo Switch 2 takes performance to the next level, powered by a custom NVIDIA processor featuring an NVIDIA GPU with dedicated RT Cores and Tensor Cores for stunning visuals and AI-driven enhancements.

Read Full Story >>
blogs.nvidia.com
Number1TailzFan3d ago

The ray tracing probably doesn't even equal a low-end PC GPU, and even if it did, it would probably be mostly useless. They'll probably force it into some game now that will run like shit, maybe 30fps at best, just because "it can do it".

B5R3d ago

Raytracing is so unnecessary for a handheld. I just hope you can turn it off.

Vits3d ago

A lot of gamers don’t realize that ray tracing isn’t really about making games look better. It’s mainly there to make development easier and cheaper, since it lets devs skip a bunch of old-school tricks to fake reflections and lighting. The visual upgrade is just a nice bonus, but that’s not the main reason the tech exists.

So you can be 100% sure that developers will try to implement it every chance they get.

RaidenBlack3d ago (Edited 3d ago )

Agree with Vits... but to add, if devs and designers just drop RT into a game world, it won't always work as expected. RT is not just reflections but lighting and illumination as well. For example, if you create a room with minimal windows, it will look dark af once RTGI is enabled. Devs and designers need to sort out the game world design accordingly as well.
DF's Metro Exodus RT upgrade is an amazing reference video to go through, if anybody's interested.
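
To make the lighting point concrete, here's a toy sketch (purely illustrative: the room size, window sizes and uniform-sky assumption are all made up, and it ignores bounces and cosine weighting). It fires random rays from the middle of a box-shaped room and counts how many escape through a window on one wall; with RTGI, that escape fraction is roughly what drives the indirect sky light at that spot, so a tiny window means a dark room unless the designers compensate.

```python
# Toy sketch, not engine code: how ray-traced GI makes indirect light depend on
# window size instead of artist-placed fill lights. All dimensions are invented.
import math
import random

ROOM_MIN = (0.0, 0.0, 0.0)
ROOM_MAX = (4.0, 5.0, 2.5)   # a 4m x 5m x 2.5m box room

def random_direction():
    """Uniform random direction on the unit sphere."""
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def wall_hit(origin, direction):
    """First wall a ray from inside the room reaches: ((axis, side), hit point)."""
    best_t, best_face = math.inf, None
    for axis in range(3):
        d = direction[axis]
        if abs(d) < 1e-9:
            continue
        bound = ROOM_MAX[axis] if d > 0 else ROOM_MIN[axis]
        t = (bound - origin[axis]) / d
        if 0.0 < t < best_t:
            best_t, best_face = t, (axis, 1 if d > 0 else 0)
    hit = tuple(origin[i] + best_t * direction[i] for i in range(3))
    return best_face, hit

def lit_fraction(window, samples=200_000):
    """Share of random rays from the room centre that escape through the window.
    window = ((x0, x1), (z0, z1)) on the y = ROOM_MAX[1] wall."""
    centre = tuple((ROOM_MIN[i] + ROOM_MAX[i]) / 2.0 for i in range(3))
    (x0, x1), (z0, z1) = window
    escaped = 0
    for _ in range(samples):
        face, p = wall_hit(centre, random_direction())
        if face == (1, 1) and x0 <= p[0] <= x1 and z0 <= p[2] <= z1:
            escaped += 1
    return escaped / samples

if __name__ == "__main__":
    print("near-full wall of glass:", lit_fraction(((0.5, 3.5), (0.3, 2.2))))
    print("one small window:       ", lit_fraction(((1.8, 2.2), (1.0, 1.5))))
```

Running it, the near-full wall of glass lights the centre of the room far more than the small window does, which is exactly the kind of gap level designers have to plan around once RTGI replaces hand-tuned lighting.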

darthv723d ago

So is HDR... but they have it anyway.

thesoftware7303d ago

Some PS5 and Series X games run at 30fps with RT... just like on those systems, if you don't like it, turn it off.

I only say this to say: you make it seem like a problem exclusive to the Switch 2.

Neonridr3d ago (Edited 3d ago )

sour grapes much?

"It probably doesn't do it well because it's Nintendo and they suck". That's how your comment reads. Why don't you just wait and see before making these ridiculous statements?

Goodguy013d ago

Please. I'd like to play my Switch games on my 4K TV without them looking all doodoo.

PRIMORDUS3d ago

Nvidia could have said this months ago and cut the bullshit. Anyway the rumors were true.

Profchaos3d ago

Would have been nice, but an NDA likely prevented them from saying anything.

PRIMORDUS2d ago

TBH I don't think Nvidia would have cared if they broke the NDA. They'd pay a little fine and go back to their AI shit. They don't even care about GPUs anymore. I myself would like them to leave the PC and console market.

Tacoboto2d ago

This story was written half a decade ago when the world knew Nvidia would provide the chip for Switch 2 and DLSS was taking off.

Profchaos2d ago

Yeah, but a similar thing happened a long time ago: 3dfx announced they were working with Sega when they took the company public, and in response Sega terminated the contract for the Dreamcast GPU and went with an ultimately weaker chipset.

So there's a precedent, but it's not like Nintendo would have had much of an option - it's AMD, NVIDIA or Intel.

Profchaos3d ago

I'm not expecting anything from ray tracing, but DLSS will be the thing that sees the unit get some impossible ports.

andy853d ago

Correct. All I'm seeing online is that it'll never run FF7 Rebirth. If it can run Cyberpunk, it'll run it. The DLSS will help. Obviously only 30fps, but a lot of people don't care.

Profchaos3d ago (Edited 3d ago )

Exactly right. When I buy a game on Switch I know what I'm getting into: I'm buying it for portability, and I'm willing to sacrifice fidelity and performance to play on a train or comfortably from a hotel room when I travel for work.

90°

AMD Ryzen 9 9950X3D & 9900X3D 3D V-Cache CPUs Now Available

AMD launches the Ryzen 9 9950X3D for $699 & Ryzen 9 9900X3D for $599, offering best-in-class gaming & content creation CPU performance.

Read Full Story >>
wccftech.com
mkis00726d ago

X3D really turned around AMD's CPU prospects. I won't touch Intel now, whereas 10 years ago I wouldn't have imagined going anywhere near AMD CPUs for gaming only.

Number1TailzFan26d ago

Zen 1 was merely "meh" IMO: it had major RAM compatibility issues (it only really worked well with Samsung memory, from what I recall) and performance was OK at best - the 8700K launched the same year and was top dog even when Zen 2 came out. Zen 2 was much better... it just lacked a bit in gaming, but they were good all-rounder chips for other applications.

AMD is trying to upsell from the 9900X3D to the 9950X3D; the pricing is weird (too close) and the chip configuration is odd... it should be a lot cheaper. They did the same with the 9070 -> 9070 XT.

Some funny choices going on at AMD.

FinalFantasyFanatic26d ago (Edited 26d ago )

Zen 1 was pretty great for what it was, considering it was the first time in a long time that AMD was actually competitive with Intel, and the first time you could easily get something with more than 4 cores/threads. The RAM issues were frustrating AF though, especially since Zen 1 performance relied so heavily on fast RAM.

If we ignore the price, the 9900X3D and 9950X3D look pretty great provided you can actually make use of those extra cores/threads; otherwise the 9800X3D is better value.

PixelOmen25d ago

Zen 1 was the beginning of the turnaround, and by Zen 3 it was starting to become ultra-competitive. X3D was really just the final nail in the coffin.

Jingsing25d ago (Edited 25d ago )

I guess the real question is how many compatibility issues will arise from their motherboard chipsets? Also, the selection of motherboards for AMD is more limited, which often restricts what kind of form factor you can build. Last time around I avoided AMD due to their chipsets having horrid USB 3 support with accessories. You tend not to see these kinds of issues being talked about; it ends up just being games and synthetic benchmarks.

220°

Project Amethyst: AMD & Sony Collaborate on FSR 4

AMD and Sony co-develop FSR 4 upscaler under Project Amethyst, enhancing visuals and performance for future PlayStation consoles.

Read Full Story >>
techgenyz.com
Eonjay29d ago

Clearly there was a collab, as every game used to demo the tech was a WWS game. And of course they alluded to this as far back as the Pro tech deep dive.

This means that PSSR is probably a lightweight CNN version of FSR 4, which would make sense given that the Pro and PS6 use AMD hardware. The biggest relevant difference between the Pro and the RDNA 4 cards is that the PC cards have 3x+ the TOPS.

They both deliver good results, with FSR 4 having a better denoiser.
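
For anyone wondering what a "lightweight CNN" upscaler even means in practice, here's a toy Python sketch (nothing from PSSR or FSR 4: the layer sizes and weights are invented and untrained, and real upscalers also use motion vectors and frame history). It just does a cheap upsample followed by a couple of small convolutions that predict a correction on top of it.

```python
# Toy sketch only: the general shape of a small CNN upscaler. Weights are random,
# so the output is meaningless; the point is the structure, not the quality.
import numpy as np

def conv3x3(image, kernels, bias):
    """Naive 3x3 convolution: image (H, W, Cin), kernels (3, 3, Cin, Cout)."""
    h, w, _ = image.shape
    cout = kernels.shape[-1]
    padded = np.pad(image, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros((h, w, cout))
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + 3, x:x + 3, :]              # 3x3 neighbourhood
            out[y, x] = np.tensordot(patch, kernels, axes=3) + bias
    return out

def toy_upscale(lowres, scale=2):
    """Upsample, then refine with two tiny (randomly initialised) conv layers."""
    upsampled = lowres.repeat(scale, axis=0).repeat(scale, axis=1)  # nearest-neighbour
    rng = np.random.default_rng(0)
    k1 = rng.normal(0, 0.1, (3, 3, 3, 8)); b1 = np.zeros(8)
    k2 = rng.normal(0, 0.1, (3, 3, 8, 3)); b2 = np.zeros(3)
    features = np.maximum(conv3x3(upsampled, k1, b1), 0.0)          # ReLU
    residual = conv3x3(features, k2, b2)
    return np.clip(upsampled + residual, 0.0, 1.0)  # network predicts a correction

if __name__ == "__main__":
    frame = np.random.default_rng(1).random((45, 80, 3))   # stand-in low-res frame
    print(toy_upscale(frame).shape)                         # -> (90, 160, 3)
```

The takeaway is that the heavy lifting is a pile of small learned filters, so how well it runs comes down to how many of those multiply-accumulates the hardware can push per frame, which is where the TOPS gap between the Pro and the RDNA 4 PC cards matters.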

PanicMechanic26d ago

I remember Cerny saying that whatever developments were made with PSSR for the Pro, that tech would translate into and help develop FSR 4. Sony is making the right moves with AMD.

Eonjay26d ago

Yes it feels like they helped them catch up with ML real fast.

Starman6929d ago

Can't believe how good God of War Ragnarok is on the Pro 😳

DivineHand12529d ago

The question is: is PSSR going to be replaced by FSR 4 on future PlayStation consoles, and is the PS5 Pro FSR 4 capable?

--Onilink--29d ago

Unlikely, given that FSR 4 is only supported by the 9000-series cards.

I would expect the PS6 to use FSR 4 since it is definitely superior to PSSR - there's not much point in continuing to invest separate resources into PSSR - but who knows if both will be available on the PS6.

ABizzel129d ago

FSR 5 would likely be out by then, and probably a transformer model. I assume Sony will continue to use the PSSR branding, but it will essentially be FSR 5 as a PlayStation-specific solution.

The_Hooligan29d ago

In my opinion they will still use PSSR for the PS6, mainly because it was a big marketing point for the PS5 Pro and Sony probably doesn't want to abandon it. They might call it PSSR 2.0 or something, and it will probably use similar tech to FSR 4 due to the partnership between the two companies. I doubt the PS6 will use anything similar to the 9000 cards, so it won't have the same bells and whistles as FSR 4.

NoDamage29d ago

I was going to build a PC soon with a last-gen card, but this makes me think I should wait to make sure I get the best experience in the next generation as well.

I guess I'm going to be all-in on AMD, which is the opposite of what I would normally lean towards.

Number1TailzFan29d ago (Edited 29d ago )

Value-wise the 9070 XT is a decent card if you don't want to pay for Nvidia ones. That being said, make sure you're happy with the restrictions in some software; if you're happy with just gaming, though, AMD should be fine.

Probably the best bet price-wise is to get the 9070 XT and then upgrade next gen in a couple of years. With Nvidia this new gen has been a bust: only the 5090 is a decent step up, and even the 4090 beats the 5080 by a fair margin.

Hopefully Nvidia bounces back next gen... though I expect refreshed 5000 cards before then; a 5080 Ti / Super will probably be roughly equal to a 4090.

NoDamage29d ago

Thanks! I'll look into the software restrictions. I never thought there would be issues there, and it's important since I do some graphic design. I was thinking about the 9070 but will have to wait till it's actually available to buy without the current nonsense.
