• Breaking News

    Sunday, September 13, 2020

    Hardware support: [Videocardz] Overclocking NVIDIA GeForce RTX 3080 memory to 20 Gbps is easy

    [Videocardz] Overclocking NVIDIA GeForce RTX 3080 memory to 20 Gbps is easy

    Posted: 12 Sep 2020 09:40 AM PDT

    [VideoCardz] Next-generation NVIDIA Quadro RTX (Ampere) to feature 10752 CUDA cores

    Posted: 13 Sep 2020 12:03 AM PDT

    SoftBank Nears $40 Billion Deal to Sell Arm Holdings to Nvidia

    Posted: 12 Sep 2020 09:49 AM PDT

    (GN)HW News - RTX 3080 Binning, AMD Zen 3, Liquid-Cooled Silicon, & Xbox Ser...

    Posted: 12 Sep 2020 07:35 PM PDT

    SoftBank set to sell UK’s Arm Holdings to Nvidia for $40bn

    Posted: 12 Sep 2020 01:01 PM PDT

    What does Nvidia buying Arm mean for the future of hardware?

    Posted: 12 Sep 2020 03:45 PM PDT

    What benefit does this bring Nvidia? Are we looking at SoCs? What other products will Nvidia be looking at? Will changes to the intellectual property landscape change the balance with competitors? What will consumers experience?

    submitted by /u/ch1llboy

    How do modern MCM approaches differ from Intel's approach to Core 2 Quad (Kentsfield)?

    Posted: 12 Sep 2020 11:34 AM PDT

    Intel used a dual-die design for the first quad core parts (2006), then moved back to monolithic parts starting with Nehalem (2008).

    What caused them to abandon that approach? Was it related to the FSB vs. integrated memory controller?

    submitted by /u/categoryofnone

    3080 Gigabyte teardown - possible look at reference PCB

    Posted: 12 Sep 2020 05:58 PM PDT

    DLSS: Will It Matter For The Next 3 Years?

    Posted: 12 Sep 2020 06:58 PM PDT

    So, like everyone else with an interest in non-casual PC gaming, I'm excited about the upcoming release of new graphics cards from Nvidia and AMD. Matter of fact, I haven't cared this much about graphics components since the early 2000s. After the horror of the mining craze and a few years of stagnation in true gains, we are suddenly on the verge of a generational leap in technology from the top all the way down to the consoles. Plus, all the technology seems to be actually appearing in both PC and console games: GPU scheduling, I/O updates, raytracing, use of cloud AI for graphics rendering.

    In particular, Nvidia is working overtime to market just how incredible its raytracing and DLSS gains are for the Ampere cards about to be released. I wasn't really following either raytracing or DLSS before the Ampere announcement because I wasn't in the market for an RTX card, but with Flight Simulator suddenly making my 1660 Super cry in pain, I figure I will get a new card in the next few months. I'm not sold on paying extra for raytracing and DLSS, though, even though the gains are impressive: right now, the lineup of games supporting both features is tiny, and I would only have one game that might even take advantage of them in the future (Flight Simulator). I play games like Euro Truck Simulator 2, Civilization 5, and whatever is on Xbox Game Pass. I do, however, use VR, so DLSS appearing in VR games does interest me. Also, I know raytracing will get more support because both consoles will use it, but DLSS doesn't seem to be catching on.

    Are there any signs that DLSS will become more of an industry standard? Since it is an Nvidia technology, is AMD coming up with anything similar for its lineup (and, thus, for the consoles)? If DLSS isn't going to be used in a significantly larger number of games than it is currently, then it feels like even mentioning it is simply to crow about how much higher benchmarks in a few random games are. In that case I wouldn't feel horrible about getting an AMD card with a bit less performance, provided the drivers aren't designed on a Commodore 64. While the Founders Edition Ampere cards are priced reasonably, I'm starting to think the AIB cards will be significantly more expensive, so the Nvidia software advantage also has to be significant to matter to me.

    submitted by /u/cypher50

    Why do Scythe's fan blades lean in the opposite direction from most fans' and spin clockwise? Does it lead to different performance, or is it for aesthetic and brand recognition purposes?

    Posted: 12 Sep 2020 06:38 PM PDT

    Looking at fan reviews, none of them mention how Scythe's fans look different: the blades lean in the opposite direction from most fans', and they spin clockwise when most spin counter-clockwise. Only their slim versions follow the standard design. Does this really make a difference when it comes to airflow?

    submitted by /u/lordlors

    Traditional enthusiast class GPU performance increases.

    Posted: 12 Sep 2020 06:50 PM PDT

    Hi, this is a final follow-up to my previous posts on traditional midrange and high-end tier performance increases within the same price range/brand.

    I'm defining enthusiast by the price segment $500 to $700. Given how wide this price range is, if cards from the same gen fall into the same bracket, I favour cards from the upper end of the price range (where enthusiast pricing is concentrated).

    As in my previous posts, I took release-day prices and non-AIB performance chart scores from TechPowerUp.com, then scaled them to the baseline starting in 2010 (or 2011 in AMD's case).

    One variation: I bumped the resolution used for performance scores to 2560x1440 for 2014-2016 and 3840x2160 for 2017 onward.
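    The scaling step above can be sketched in a few lines of Python. The chart scores below are made-up placeholders for illustration, not actual TechPowerUp data:

```python
# Sketch of the scaling described above: normalize each card's
# relative-performance chart score so the baseline card (GTX 580,
# 2010) equals 100%. Scores here are hypothetical placeholders.

scores = {
    "GTX 580": 25.0,  # baseline card
    "GTX 680": 29.8,
    "GTX 980": 43.8,
}

baseline = scores["GTX 580"]
over_baseline = {card: round(100 * s / baseline) for card, s in scores.items()}

for card, pct in over_baseline.items():
    print(f"{card}: {pct}%")
```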

    Nvidia | Price | Year | Perf over baseline

    GTX 580 | $500 | 2010 | 100%

    GTX 680 | $500 | 2012 | 119%

    GTX 780 Ti | $700 | 2013 | 165%

    GTX 980 | $550 | 2014 | 175%

    GTX 980 Ti | $650 | 2015 | 219%

    GTX 1080 | $599 | 2016 | 298%

    GTX 1080 Ti | $700 | 2017 | 403%

    RTX 2080 | $699 | 2018 | 438%

    RTX 2080 Super | $700 | 2019 | 471%

    AMD

    HD 6990 | $699 | 2011 | 100%

    HD 7870 Ghz | $500 | 2012 | 100%

    R9 290X | $550 | 2013 | 122%

    R9 Fury X | $650 | 2015 | 161%

    Vega 64 | $500 | 2017 | 204%

    Radeon VII | $700 | 2019 | 265%

    For Nvidia, this tier shows better long-term performance gains than the previous 'high-end' tier, but the wider cost range suggests that price/performance doesn't scale as well. Midrange surpasses this tier as it did the 'high-end', securing its bang-for-buck status.
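    As a rough sanity check on the price/performance point, one can divide perf-over-baseline by release price using the Nvidia rows from the table; the normalization to the GTX 580 is my own, not part of the original methodology:

```python
# Perf-over-baseline divided by release price, normalized so the
# GTX 580 = 1.00x. Values taken from the Nvidia table above.

rows = [
    ("GTX 580", 500, 100),
    ("GTX 680", 500, 119),
    ("GTX 1080", 599, 298),
    ("RTX 2080 Super", 700, 471),
]

base_ppd = 100 / 500  # GTX 580 perf per dollar
rel_ppd = {card: (perf / price) / base_ppd for card, price, perf in rows}

for card, rel in rel_ppd.items():
    print(f"{card}: {rel:.2f}x baseline perf per dollar")
```

    Perf per dollar still improves over time, just more slowly than raw performance, which matches the scaling caveat above.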

    submitted by /u/Loan_Even

    How many FPS do you think will be the threshold where there is no competitive advantage to go higher?

    Posted: 12 Sep 2020 04:03 PM PDT

    Obviously going from 100k FPS to 1 million FPS won't help anyone, since that is way past the limit of human reflexes. At what point do you think the hardware and monitor emphasis will have to shift to increasing image quality, since putting out more frames won't help? 250 FPS, 500 FPS, 1000 FPS?
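    One way to frame the question is in frame times rather than FPS: the latency saved by each jump in frame rate shrinks rapidly. A quick sketch (plain arithmetic, the step pairs are my own choice):

```python
# Frame times make the diminishing returns concrete: each jump in
# FPS saves less latency per frame than the one before it.

def frame_time_ms(fps):
    return 1000.0 / fps

steps = [(60, 144), (144, 240), (240, 500), (500, 1000)]
for lo, hi in steps:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} FPS saves {saved:.2f} ms per frame")
```

    Going from 60 to 144 FPS saves about 9.7 ms per frame, while doubling from 500 to 1000 FPS saves only 1 ms, so the per-frame benefit collapses long before reflex limits are reached.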

    submitted by /u/JarJarAwakens
