Hardware support: Nvidia changes their mind: RTX 3080, 3090 FE Cards Will Be Sold In Australia
- Nvidia changes their mind: RTX 3080, 3090 FE Cards Will Be Sold In Australia
- Heads up: 3080 FE review embargo moved to Sept 16th at 6 a.m. Pacific Time
- RTX 3070 Releases Oct. 15.
- (GN)Get Good: AMD Radeon Marketing Friendly Fire & RDNA2 Competing vs. RTX 3000
- _rogame on twitter - Possible RDNA 2 Benchmark Spotted AOTS
- Digital Foundry new FS2020 scaling video for those who were thinking about buying a 3090.
- [VideoCardz] NVIDIA GeForce RTX 3060 Ti gets 4864 CUDA cores and 8GB GDDR6 memory
- LG Launch 163-inch 4K Micro LED TV, Revealing Brutal Truth about MicroLED
- Are there any DisplayPort 2.0 monitors on the horizon?
- RedGamingTech: RDNA 2 Is Monstrous | Insane Cache System & Performance Info - EXCLUSIVE
- Why Xbox Series S's Dumb 8+2GB Memory Configuration Isn't As Dumb As You Think
- 3080 Geekbench 5 Vulkan score
- Artificial Intelligence (GPT-3) Explains How RAM Works
- ELI5 the weird RAM configuration of the Xbox Series S
- Heatsinks: Do heatpipes soldered to the fins and base matter?
- A set of hypotheses about the mystery of the Navi bus
- When will we likely see graphics cards with HBM2 close to the die, like the Vega series, back on the market?
- Is SLI/multi-GPU truly dead for gaming?
- Xbox Series X vs 2080
Nvidia changes their mind: RTX 3080, 3090 FE Cards Will Be Sold In Australia Posted: 11 Sep 2020 08:24 PM PDT
Heads up: 3080 FE review embargo moved to Sept 16th at 6 a.m. Pacific Time Posted: 11 Sep 2020 02:00 PM PDT
RTX 3070 Releases Oct. 15. Posted: 11 Sep 2020 04:39 PM PDT
(GN)Get Good: AMD Radeon Marketing Friendly Fire & RDNA2 Competing vs. RTX 3000 Posted: 11 Sep 2020 04:48 PM PDT
_rogame on twitter - Possible RDNA 2 Benchmark Spotted AOTS Posted: 11 Sep 2020 08:04 PM PDT
Digital Foundry new FS2020 scaling video for those who were thinking about buying a 3090. Posted: 11 Sep 2020 11:17 PM PDT
[VideoCardz] NVIDIA GeForce RTX 3060 Ti gets 4864 CUDA cores and 8GB GDDR6 memory Posted: 11 Sep 2020 04:12 AM PDT
LG Launch 163-inch 4K Micro LED TV, Revealing Brutal Truth about MicroLED Posted: 11 Sep 2020 09:29 AM PDT
Are there any DisplayPort 2.0 monitors on the horizon? Posted: 12 Sep 2020 02:11 AM PDT ROG and a new company called Eve are the only two I know of that are supposedly releasing a 4K monitor with HDMI 2.1 input before the year is up. The Eve is rated for 144Hz, and I think the ROG will come out with the same specs when it drops. The problem is that HDMI 2.1 is only rated for 4K @ 120Hz; you need DisplayPort 2.0 to achieve true 4K @ 144Hz without any compression. I just don't understand why they would even make the monitor go above what the inputs can actually provide. What's the point of bottlenecking your own product right out of the gate? If anyone knows of a monitor releasing with DisplayPort 2.0 in the next 3 months or so, please drop that info, thanks!
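To put rough numbers on that bandwidth argument, here is a back-of-the-envelope check. The effective payload rates below are my own approximations, not spec-certified figures; whether a given mode is officially "rated" also depends on the spec's timing formulas, bit depth, and DSC rules.

```python
# Approximate uncompressed video bandwidth vs. effective link payload.
# All figures are ballpark assumptions, not certified numbers.

def video_gbps(width, height, hz, bpc=8, blanking=1.10):
    """Uncompressed RGB video bandwidth in Gbit/s, with ~10% blanking overhead."""
    return width * height * hz * 3 * bpc * blanking / 1e9

LINK_PAYLOAD_GBPS = {
    "DP 1.4 (HBR3)":   25.92,  # 32.4 Gbps raw, 8b/10b encoding
    "HDMI 2.1 (FRL)":  42.67,  # 48 Gbps raw, 16b/18b encoding
    "DP 2.0 (UHBR20)": 77.37,  # 80 Gbps raw, 128b/132b encoding
}

for bpc in (8, 10):
    need = video_gbps(3840, 2160, 144, bpc)
    print(f"4K 144Hz {bpc}-bit needs ~{need:.1f} Gbps")
    for name, cap in LINK_PAYLOAD_GBPS.items():
        verdict = "fits" if cap >= need else "needs DSC / chroma subsampling"
        print(f"  {name}: {verdict}")
```

On these rough numbers, today's DP 1.4 ports are the real squeeze, a full 48Gbps HDMI 2.1 link is closer to 4K 144Hz than the certified modes suggest, and DP 2.0 is the only comfortable option with headroom for higher bit depths.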
RedGamingTech: RDNA 2 Is Monstrous | Insane Cache System & Performance Info - EXCLUSIVE Posted: 11 Sep 2020 12:40 PM PDT
Why Xbox Series S's Dumb 8+2GB Memory Configuration Isn't As Dumb As You Think Posted: 11 Sep 2020 01:37 PM PDT I wrote a piece on "Why Xbox Series X's Dumb 10+6GB Memory Configuration Isn't As Dumb As You Think" and noticed that Microsoft has done something similar for the Xbox Series S, so here we go again!

The Memory Config Is Even Stranger

A couple of things were expected:

- Less total memory than the Series X (10GB vs 16GB)
- A narrower memory bus to go with the smaller GPU

A couple of things were not expected:

- Another fast-v-slow split: 8GB @ 224 GB/s plus 2GB @ 56 GB/s
- Strong signs of clamshell mode on a $300 console
Wtf Is Clamshelling?

Clamshelling is an old technique built into the GDDR spec that lets designers double memory capacity without extra memory controllers. The most recent example is the Nvidia RTX 3090, which used clamshelling to "double" its VRAM capacity from 12GB to 24GB despite only using a 384-bit bus with 8Gb memory chips.
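As a quick illustration of the arithmetic, using the figures from the paragraph above (the x16 detail is how the GDDR spec describes clamshell mode, not something stated in the post):

```python
# Clamshell math for the RTX 3090 example: same 384-bit bus, twice the chips.
bus_bits     = 384
chip_io_bits = 32     # each GDDR6X device is x32 in normal mode
chip_gbyte   = 1      # 8 Gbit = 1 GByte per device

chips = bus_bits // chip_io_bits           # 12 devices, one per 32-bit channel
print(chips * chip_gbyte, "GB normal")     # 12 GB

# In clamshell mode each device drops to x16 and two share a channel,
# doubling capacity while bus width (and therefore bandwidth) stays the same.
print(2 * chips * chip_gbyte, "GB clamshell")  # 24 GB
```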
Videocardz reported on a couple of nice PCB pictures where you can see the spots for the 12 memory chips on the front side of the PCB and the spots for 12 more on the back.

Clamshelling Sounds Expensive...

Clamshelling is generally a premium option that you see on workstation cards like the AMD FirePro S9100, which reached 32GB of VRAM on a single GPU all the way back in 2014 (!) despite using the same bog-standard 512-bit Grenada GPU that showed up in the Radeon R9 390 and 390X of the same era. General advantages of clamshelling include:

- Doubled capacity per memory channel without a wider bus or extra memory controllers
- No need to wait for higher-density memory chips to exist
General disadvantages of clamshelling include:

- Twice as many memory chips to buy, place, and route
- Chips on the back of the PCB, which are harder to cool
Needless to say, clamshelling is expected on >$1000 products like the RTX 3090 or FirePro 9100, but it is unexpected on a commodity like a $300 gaming console.

So Are You Sure?

I'm not aware of an official source or teardown that confirms the Xbox Series S uses clamshelled memory. However, there's no other feasible way to achieve what the Series S has achieved without clamshelling. It might be possible that only one of the memory controllers has to run in clamshell mode (or can partially run in clamshell mode) so that MS doesn't have to duplicate all of the memory chips. For example, if the GDDR spec permits one of the 64-bit memory controllers to "partially" run in clamshell mode, then the memory config would probably look like:

- Four 16Gb (2GB) chips populating the 128-bit bus as normal: the fast 8GB
- A fifth 16Gb chip clamshelled onto one 32-bit channel: the slow 2GB
Surely only worrying about 5 memory devices would be cheaper than 8, but I'm not aware of any historical examples where this "partial clamshell" technique was used. Also, if MS did that, then why wouldn't they put all five memory chips on the front of the PCB? Wouldn't that be easier to cool? As a reminder, the render shows only four memory chips on the front of the PCB, with an unknown number on the back.

Additionally, if 32Gb (4GB) GDDR6 memory were available, Microsoft could use it to avoid this clamshelling business in the first place. However, I'm not aware of such chips existing yet, and using them would mean MS juggling three separate memory types for its console supply chain. Two types of memory is probably already a low-key nightmare for a supply chain as sophisticated as the Xbox's, so I doubt MS would enjoy adding a third to the mix even if it were available.

Does This Stuff Even Matter?

This kind of thing isn't important in itself. All that matters is the fast-8GB-plus-slow-2GB split that developers actually see.
Developers already have to worry about this "fast-v-slow" complexity on the Series X, so it shouldn't be a big deal for the Series S.
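Here is a sketch of that five-device scenario. This is my reconstruction from the numbers in the post; the exact chip count and placement are assumptions, not confirmed by any teardown.

```python
# Hypothesized partial-clamshell layout for the Series S: four 32-bit
# channels, three with a single 16Gb (2GB) chip and one clamshelled
# with two of them -- five devices total.
channel_gb = [2, 2, 2, 2 + 2]     # GB of capacity hanging off each channel

fast = 4 * min(channel_gb)        # striped across all four channels at once
slow = sum(channel_gb) - fast     # the leftover on the clamshelled channel
print(fast, "GB fast pool (interleaved over the full 128-bit bus)")  # 8 GB
print(slow, "GB slow pool (only one 32-bit channel's worth)")        # 2 GB
```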
3080 Geekbench 5 Vulkan score Posted: 11 Sep 2020 11:11 AM PDT https://browser.geekbench.com/v5/compute/1477073 For comparison, a 2080 Ti scores between 100k and 115k; the average is 102k.
Artificial Intelligence (GPT-3) Explains How RAM Works Posted: 11 Sep 2020 09:22 PM PDT
ELI5 the weird RAM configuration of the Xbox Series S Posted: 11 Sep 2020 09:45 AM PDT I've spent a full day scratching my head over the RAM configuration of Microsoft's new little next-gen console. They gave us the bandwidth, and with 224 GB/s on the "fast side" I assume they used the same 14 Gbps GDDR6 modules as the Series X on a 128-bit bus. The "really slow" 2 GB @ 56 GB/s is what's bothering me. Since the exploded view shows only 4 modules around the SoC, I assume a single 14 Gbps module on a 32-bit bus is hidden somewhere. Am I correct with my assumption? Thanks for your replies.
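For what it's worth, the poster's arithmetic checks out: GDDR6 bandwidth is just per-pin data rate times bus width. A two-line sanity check (my own, not an official figure):

```python
# GB/s = (Gbps per pin * bus width in bits) / 8 bits-per-byte
rate_gbps = 14
print(rate_gbps * 128 / 8)  # 224.0 -> matches the "fast" pool on a 128-bit bus
print(rate_gbps * 32 / 8)   # 56.0  -> exactly one 32-bit channel's worth
```

So 224 GB/s falls straight out of a 128-bit bus at 14 Gbps, and the 56 GB/s pool is one 32-bit channel of the same memory, consistent with the clamshell theory in the post above.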
Heatsinks: Do heatpipes soldered to the fins and base matter? Posted: 11 Sep 2020 07:48 AM PDT As many know, Noctua has been a dominant heatsink brand for many years. I wondered what made it so, and from some short research it seems one of their most distinguishing traits is that they solder their heatpipes to the fins and base of the heatsink. The only other brand I know of that does this is Thermalright (who don't market it for some reason). Supposedly, soldering improves the conduction of heat from the base to the pipes to the fins, as well as the longevity of the heatsink through all the expansion and contraction from heating and cooling. This might explain why a heatsink with more heatpipes, like the six-pipe Raijintek Pallas 120, consistently does worse than the four-pipe Noctua NH-L12S. I was very sure this was what made Noctua so good, but then the Scythe Big Shuriken 3 came up. According to many benchmarks, such as Optimum Tech's, it has always reached very similar temps to the NH-L12S, in some cases even better ones. Out of curiosity, I asked a Scythe representative whether that heatsink is soldered, but the Big Shuriken 3's heatpipes are not soldered to the base and fins, only press-fitted. Which raises a few questions.
It's always been a wonder why Noctua's been so good, and I thought I had found the very thing that made it so... but then this Scythe heatsink comes along with its more average process and somehow comes really close to beating the Noctua. What do you guys think? Edit: If anyone's interested, I've put together a list of some brands I found that use either a soldered base or soldered fins.
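One way to see why a good press-fit can land so close to solder is a toy series-resistance model. Every number below is an illustrative guess, not a measurement:

```python
# Toy thermal stack: heat flows base -> heatpipes -> joint -> fins -> air.
# Only the pipe-to-fin joint differs between the two builds.
R_BASE = 0.05   # K/W, baseplate into the heatpipes (assumed)
R_PIPE = 0.02   # K/W, along the heatpipes (assumed)
R_FINS = 0.15   # K/W, fin stack to air at a fixed fan speed (assumed)
JOINTS = {"soldered": 0.01, "press-fit": 0.04}   # K/W, both assumed

for name, r_joint in JOINTS.items():
    dt = 150 * (R_BASE + R_PIPE + r_joint + R_FINS)   # 150 W heat load
    print(f"{name}: ~{dt:.1f} K over ambient")
```

If the joint is only a small slice of the total resistance, a well-executed press-fit design with a strong fin stack and fan can close most of the gap, which would square with the Big Shuriken 3 result.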
A set of hypotheses about the mystery of the Navi bus Posted: 11 Sep 2020 09:26 AM PDT As we all know by now, Kopite ID'd a GPU with what appeared to be a 256-bit bus as Navi 21. This does not comport with the calculated abilities of the arch as we know it from RDNA 1.0, or with the behavior of Nvidia in response to it. I have found several hypotheses to answer this that have been endorsed by retweets from Kopite and _Rogame, probably the most credible silicon leakers there are at the moment. https://twitter.com/Locuza_/status/1304407437450137600 Among other things, it suggests either changes to the L2$ tiles, and/or that they may be using HBM for the big boy after all. The one I go with: https://twitter.com/3DCenter_org/status/1304264470508834821 https://twitter.com/kopite7kimi/status/1304238246797275138 The gist of these conversations comes down to three possibilities:

- A 512-bit bus, like Hawaii (though the die looks a touch small for that)
- Kopite simply called it wrong (he admitted as much by retweeting the first thread)
- A tinkered-with test article

We may have even fallen for something AMD did to fox industrial espionage - they have been running a tight ship on info, so it would not surprise me if they're doing things to fuck with folks peering in.
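For context on why a 256-bit bus reads as too narrow for a halo part, here's the simple bandwidth math; the 16 Gbps per-pin figure is my own placeholder assumption, not a leaked spec:

```python
# GDDR6 bandwidth by bus width at an assumed 16 Gbps per pin.
for bus_bits in (256, 384, 512):
    print(f"{bus_bits}-bit: {16 * bus_bits / 8:.0f} GB/s")
# 256-bit -> 512 GB/s, well short of an RTX 3080's ~760 GB/s (19 Gbps x 320-bit),
# which is why a plain 256-bit Navi 21 would need big caches or HBM to compete.
```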
When will we likely see graphics cards with HBM2 close to the die, like the Vega series, back on the market? Posted: 11 Sep 2020 03:17 AM PDT Are there any indications or credible speculation that this kind of design is coming back any time "soon"?
Is SLI/multi-GPU truly dead for gaming? Posted: 11 Sep 2020 05:56 PM PDT I'm trying to research the efficacy of SLI/multi-GPU setups with current/next-generation cards, but have had a hard time getting reliable information. Most of the resources I can find are people saying multi-GPU is either not supported at all or isn't worth it because scaling is terrible. But those are two very different claims: one is a flat-out prohibition of any use, while the other is highly variable and heavily dependent on the use case. Of the articles I've found with actual numbers behind them, I've seen both atrocious and excellent scaling on Nvidia's Turing chips. This variability has almost always been explained by resolution: scaling at lower resolutions (1080p/1440p) is awful, while scaling at higher resolutions (particularly above 4K) has achieved almost linear performance gains (90% for some 8K applications). However, I've also read that SLI support has been totally dropped from some major titles, like Metro: Exodus and MSFT Flight Simulator. Yet there are also videos of people claiming to run Metro at 8K with dual RTX Titans. So it's hard to know what's actually true. Personally, I have 3x 4K displays and have been able to drive them in ultra-wide Surround with my multi-GPU setup (originally quad Titan Xs... though a few have died over the years). I've always had great luck with SLI for my use case (11520x2160 is a lot of pixels, and scaling's been good), and even modern games like RDR2 have had seemingly good SLI support. Yet many sources point to that drying up in future titles. So, does anyone know if SLI is flat-out unsupported in current/future releases, or if it is still supported but just requires a suitably demanding display to show any appreciable benefit? Apologies if this is the wrong subreddit for this question, but I'm hoping for a more substantive discussion than "it's dead" and "scaling sucked for me on my 1080p monitor."
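To put the resolution argument in perspective, here's a quick pixel-budget comparison; the resolutions are taken from the post, the interpretation is mine:

```python
# Pixels per frame: high counts keep GPUs busy enough that a second card
# can still scale; low counts leave the game CPU-limited instead.
resolutions = {
    "1080p":         (1920, 1080),
    "4K":            (3840, 2160),
    "3x4K Surround": (11520, 2160),
    "8K":            (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MPix")
```

At roughly 25 MPix, the poster's Surround setup pushes three times the work of a single 4K panel, which fits the pattern of SLI only paying off where the GPU, not the CPU, is the bottleneck.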
Xbox Series X vs 2080 Posted: 11 Sep 2020 03:28 PM PDT What are your thoughts on the XBSX being "as good or better" than the 2080? Just curious.