If the GT 710 could do it, why can't the RX 6400? That's just AMD selling gimped e-waste to uninformed buyers, and including full decoding capabilities wouldn't have cost anything extra. If that's what a buyer needs and they don't care about gaming at all, it's a fine purchase; after all, people used to buy GT 710s just to make YouTube playable on, say, Pentium D machines. It's my non-daily machine for light usage, and 1080p works fine on it, so I won't upgrade.

Is RDNA2 worse at ray tracing than 2018's Turing? Sure, and it's worse than Turing in other areas too, but Nvidia isn't an easy opponent. As far as engineering goes, he's an asset for AMD. RTG had some lapses of judgment with Navi 24 and with its pricing strategy in general, but achieving 2.81 GHz on N6, and fitting that transistor budget into a 107 mm² die (not to mention that RDNA2's ray tracing implementation, although weak on performance, is extremely efficient in the transistor budget it adds), are very good indications of the work happening at RTG, in my opinion.

He was right (the "we" he used in his comment probably meant "we as an industry", not that it matters much), and it's exactly as said: he probably meant it like this: "this new feature won't take off until a lot of people have access to it". Did he say it because they had nothing to compete with back then, and probably because he wanted to kill the hype Nvidia was trying to generate at the time? Yes, sure. Did he play a negative role by trying to delay developer adoption of a graphics feature that will ultimately help advance the visual fidelity of the industry? OK, maybe even that. But don't you think your reaction is a bit too much, given that reality has shown that hype to be complete BS? What are you so annoyed about? That an AMD exec made a vaguely accurate, but overly optimistic, prediction that contradicted Nvidia's marketing?
After all, if it annoys you that a corporate executive made a dubious claim back then, shouldn't you be even more annoyed at Nvidia execs trumpeting the wonders of RTRT and how it would revolutionize real-time graphics? Their statements at the time are much, much further from reality than the statement you quoted here. So you're arguing that it was bad that he "cut" the Turing hype, yet the direction of his argument is far more accurate than Nvidia's statements were. Was he wrong? In a naïvely absolutist way, where the exact wording of an assessment matters more than its direction and overall gist, sure. Going back to my 0-100 scale: Nvidia was saying "it's 100!", he was saying "probably more like 70-80", and reality has since kicked down the door with a solid 20-30. Did you read what I wrote? It doesn't seem like it, because if you had, you would see that I absolutely think it was that, but that I also think reality has "cut the hype from Nvidia's Turing" much more than this statement ever did.

3: You apparently stopped using numbers, but: unless he is literally all-powerful, he does not have the ability to make deterministic statements about the future. The best he, as any human, can do is state his opinion and intention. That I entirely agree with this should have been plenty clear from my post.

2: That's literally the same thing.

1: Yes. That's the same thing (barring, of course, any misrepresentations in their quote). You're quoting him, even if you're quoting their quoting of his statement, because you're reproducing his words (through their translation).