
  • Z10N

    veteran

    AMD_Robert Technical Marketing

    Q: "Why weren't any single card benchmarks released?"

    A: "Because that is what we sample GPUs to reviewers for. Independent third-party analysis is an important estate in the hardware industry, and we don't want to take away from their opportunity to perform their duty by scooping them."

    Q: "Wait so you won't say single GPU performance because reviewers are going to do it but you will say dual GPU? I don't follow the logic."

    A: "There are considerably fewer dual GPU users in the world than single GPU users, by an extremely wide margin. If my goal is to protect the sovereignty of the reviewer process, but also give people an early look at Polaris, mGPU is the best compromise."

    Q: "So, as someone with no dual GPU experience, I have to ask a seemingly stupid question, what was holding the dual 480s back?"

    A: "Tuning in the game. The developer fully controls how and how well multi-GPU functions in DX12 and Vulkan."

    Q: "Ty, If I can ask another stupid question, what does this stuff mean?
    Single Batch GPU Util: 51% | Med Batch GPU Util: 71.9 | Heavy Batch GPU Util: 92.3%
    In the first post you mentioned 151% performance of a single gpu."

    A: "This is a measurement of how heavily the GPU is being loaded as the benchmark dials up the detail. Batches are requests from the game to the GPU to put something on-screen. More detail = more batches. These numbers indicate that GPU utilization is rising as the batch count increases from low, to medium to high. This is what you would expect."

    Q: "Do you know of any plans of how some engines are going to implement that? Unreal 4 or unity for example? Is there a possibility that multi adapter is going to see widespread use through those engines? "

    A: "I hope so. Some engines already support DX12 multi-adapter. The Nitrous engine from Oxide, the engine from the Warhammer team (forget name), and a few others. I personally in my own 100% personal-and-not-speaking-for-AMD opinion believe that the mGPU adoption rate in games will pick up over time as developers become more broadly familiar with the API. Gotta get yer sea legs before you can sail the world and stuff. :)"

    Q: "Think about it this way. Each card represents 100% under full load. since they said that the cards are at 51% this would mean that it's running at roughly 1.5 cards load. 151% would be about 75.5% GPU load on each card. That is, IF I understand correctly."

    A: "Pretty spot-on."

    Q: "Are there any incentives for developers to make future games (or updates for existing ones) better utilize multiple GPUs? Even if, just like you stated, the number of users with more than one GPU is vastly smaller than single users?"

    A: "Sure. The multi-GPU users are the early adopters, the influencers, the "first 10%ers." They produce a disproportionate amount of discussion relative to their population, which makes them very influential on the rest of the PC gaming community downstream. Developers like CDPR have proven the value of this approach, as has the team at Oxide Games for aggressively pursuing advanced DX12 functionality.
    I'm sure both devs would have done just fine with less outsized investment in the bleeding edge, but it's been really valuable for them because gamers appreciate the respect. That's incentive enough, imo."

    Q: "I still don't follow. Reviews are going to say dual and single performance anyway."

    A: "I don't know how to explain it another way. Posting sGPU numbers hurts their reviews and their traffic. mGPU sort of doesn't. That's it."

    Q: "I saw it as X being a roman numeral, meaning it's like an R10. This would imply they could still have R7 and R9, and just no ***X cards."

    A: "Ask me again after launch."

    Q: "Ok I got some questions, if you're not using vsync or locked fps why is the GPU usage so low and not at 100% pushing maximum FPS? You're running a beast CPU so it shouldn't be bottlenecked.
    Why only showcase multi-GPU and only Ashes benchmark? I know it's cherry-picking but it would be nice with other benchmarks as well as they will probably tell
    Third. Please release Vega soon I want a card that can push 144FPS+ in 1440P."

    A: "DX12 uses Explicit Multi-Adapter. The scaling depends on how mGPU is implemented into the engine, and future patches could boost scaling more for any vendor or any GPU combination that works. Besides that, migrating to full production-grade drivers would help. But as you can image, the drivers are still beta. I'm not promising earth-shattering improvements, here, but there are many variables in play that wouldn't be present with GPUs that have been released for 12+ months."

    Q: "Thanks alot for the great info on clafication of the whole AOTS issue.
    I have one question that you may or may not be able to answer.
    I play at triple 1440P (dell U2711 x3) with eyefinity. im currently running triple crossfire 5870 1GB and finding alot of games struggle so even on 1 screen i can play games like BF4. :(
    Would the RX480 be a upgrade.. ;)"

    A: "I can't really answer your question, but I can give you some food for thought that might be enlightening: the Radeon R7 360 is faster than the 5870."

    Q: "Hi, only just noticed this thread and love that you took the time to clarify all this. But I was somewhat wondering if I can get an official comment (or nod) on me comparing the x2 480 vs 1080 comparison and it's introduction of explicit multi-adapter in DX12 to this- https://www.youtube.com/watch?v=6JARVfb-FBg

    Now it may not be obvious for those who are not familiar with the show, but theres also some serious foreshadowing going on with this particular video and what I expect will happen much later on. "

    A: "1) Garnet is best gem.
    2) We're not really foreshadowing anything. We just wanted to point out that it's possible to get $700 worth of performance for <$500."

    Q: "Hey /u/AMD_Robert, thanks for clearing things up, but don't you think it's a bit early to say things like;

    even with fudgy image quality on the GTX 1080 that could improve their performance a few percent

    Maybe I'm just misreading you here, but I don't really think we can say for certain this was intentional yet. Shader errors are a thing, and it may end up being fixed in a driver patched for Pascal and just somehow slipped through the filters. After all, it was fairly hard to notice anyways."

    A: "I never said it was intentional. I simply stated the fact that the image quality is a bit fudgy, and it could improve perf by a few percent. There was no implication of malice."


    # sshnuke 10.2.2.2 -rootpw="Z10N0101"
