• Breaking News

    Thursday, September 10, 2020

    Hardware support: AMD to introduce Zen3 on October 8, Radeon RX 6000 series on October 28 - VideoCardz.com


    Posted: 09 Sep 2020 09:18 AM PDT

    Xbox Series X launches Nov 10 for $499. Pre-orders start Sep 22.

    Posted: 09 Sep 2020 06:10 AM PDT

    Alleged AMD Radeon RX 6000 engineering sample spotted - VideoCardz.com

    Posted: 09 Sep 2020 04:57 PM PDT

    Intel i9-10850K CPU Review & Benchmarks [GM]

    Posted: 09 Sep 2020 09:30 PM PDT

    Traditional mid-range performance increase.

    Posted: 09 Sep 2020 04:12 PM PDT

    Hi, I was curious to see what kind of performance gain the traditional mid-range GPU segment has had over the last ten years, using the 2010 release as the baseline. It was interesting to see the intervals at which you can attain a ~2x performance increase for a similar price. All prices and performance figures were derived from TPU.

    I've defined mid-range by price: $200 to $250 (Nvidia) and $230 to $280 (AMD). I wanted to keep the price in a tight(ish) range so as to make the historical comparisons meaningful. I also didn't factor in later price adjustments, only release-day prices.

    It's important to note that price similarity between Nvidia and AMD does not necessarily indicate performance similarity; e.g., the 5600 XT is 1.25x the performance of the 1660S.

    Nvidia

    GTX 460 (2010)     $200  100%
    GTX 560 Ti (2011)  $230  133%
    GTX 660 (2012)     $230  166%
    GTX 760 (2013)     $250  208%
    GTX 960 (2015)     $200  230%
    GTX 1060 (2016)    $249  434%
    GTX 1660S (2019)   $230  590%

    AMD

    HD 5830 (2010)     $270  100%
    HD 6870 (2010)     $240  135%
    HD 7850 (2012)     $250  163%
    R9 280 (2014)      $279  227%
    R9 380X (2015)     $230  279%
    RX 480 (2016)      $239  366%
    RX 590 (2017)      $280  431%
    RX 5600 XT (2020)  $280  625%
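    The "intervals for a ~2x increase" observation above can be checked mechanically. Here is a minimal sketch, using only the Nvidia figures from the list (card names, years, and percentages copied verbatim from the post); the function name is my own:

```python
# Sketch: for each card, find the first later card in the same price
# bracket that offers at least 2x its relative performance, using the
# post's own Nvidia numbers (GTX 460 = 100% baseline).
nvidia = [
    ("GTX 460", 2010, 100),
    ("GTX 560 Ti", 2011, 133),
    ("GTX 660", 2012, 166),
    ("GTX 760", 2013, 208),
    ("GTX 960", 2015, 230),
    ("GTX 1060", 2016, 434),
    ("GTX 1660S", 2019, 590),
]

def years_to_double(entries):
    """For each card, find the first later card with >= 2x its performance."""
    results = []
    for i, (card, year, perf) in enumerate(entries):
        for later_card, later_year, later_perf in entries[i + 1:]:
            if later_perf >= 2 * perf:
                results.append((card, later_card, later_year - year))
                break
    return results

for base, doubled, years in years_to_double(nvidia):
    print(f"{base} -> {doubled}: ~2x in {years} years")
```

    On this data the doubling interval hovers around 3 to 5 years, with the GTX 1060 doing most of the heavy lifting.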

    P.S. Apologies for the formatting.

    submitted by /u/Loan_Even
    [link] [comments]

    China chipmakers speed up effort to cut reliance on US supplies

    Posted: 09 Sep 2020 07:02 PM PDT

    Researchers demonstrate in-chip water cooling

    Posted: 09 Sep 2020 12:22 PM PDT

    Samsung, LG Display to stop supplying panels to Huawei due to U.S. restrictions

    Posted: 09 Sep 2020 04:49 AM PDT

    Inside the Xbox Series S

    Posted: 09 Sep 2020 08:08 AM PDT

    AMD's Failed Radeon a Year Later: Radeon VII vs. RTX 2080 & GTX 1080 Ti (Hardware Unboxed)

    Posted: 09 Sep 2020 04:16 AM PDT

    Crypto-miners could flock to the Nvidia RTX 3080, leaving gamers out in the cold.

    Posted: 09 Sep 2020 11:11 AM PDT

    Researchers demonstrate in-chip water cooling (Ars Technica)

    Posted: 09 Sep 2020 11:37 AM PDT

    [Anandtech] The Armari Magnetar X64T Workstation OC Review: 128 Threads at 4.0 GHz, Sustained!

    Posted: 09 Sep 2020 09:18 AM PDT

    [TecLab review] Pushing 4K into the mainstream: world-first RTX 3080 hands-on (First RTX 3080 Benchmarks)

    Posted: 09 Sep 2020 07:12 AM PDT

    Lightmatter Startup Tries to Speed Up Computing Using Light

    Posted: 09 Sep 2020 08:17 AM PDT

    How do modern CPUs compare to the PS3's Cell processor? How many times more powerful are they?

    Posted: 09 Sep 2020 07:52 AM PDT

    As the title says, I'm really curious about this. I'd also like to know how much more powerful modern CPUs are than the PS4's, since I've heard its CPU was already considered underpowered when the PS4 launched.
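    One crude way to frame a comparison is theoretical peak single-precision FLOPS: cores x clock x FLOPs per cycle per core. A back-of-the-envelope sketch, using widely quoted figures (the modern-CPU example is an assumption for illustration, and peak FLOPS says nothing about how hard the Cell's SPEs were to utilize in practice):

```python
# Theoretical peak single-precision GFLOPS, computed as
# cores x clock (GHz) x SP FLOPs per cycle per core.
# Rough, widely quoted figures for illustration only, not measurements.

def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

# PS3 Cell: 6 game-usable SPEs at 3.2 GHz, ~8 SP FLOPs/cycle each
cell = peak_gflops(6, 3.2, 8)
# PS4 Jaguar: 8 cores at 1.6 GHz, 8 SP FLOPs/cycle (128-bit AVX, no FMA)
jaguar = peak_gflops(8, 1.6, 8)
# Assumed modern desktop CPU: 8-core Zen 2 at ~3.6 GHz all-core,
# AVX2 with two 256-bit FMA units = 32 SP FLOPs/cycle
zen2 = peak_gflops(8, 3.6, 32)

print(f"Cell (6 SPEs): ~{cell:.0f} GFLOPS")
print(f"PS4 Jaguar:    ~{jaguar:.0f} GFLOPS")
print(f"8c Zen 2:      ~{zen2:.0f} GFLOPS ({zen2 / cell:.1f}x Cell)")
```

    By this (very rough) yardstick a mainstream 2020 desktop CPU is several times the Cell on paper, and the gap in real, general-purpose game code is far larger, since the SPEs only approached their peak on carefully hand-tuned workloads.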

    submitted by /u/TheMostWanted774
    [link] [comments]

    How will consoles sharing similar CPU and GPU architectures with PCs affect console game coding?

    Posted: 09 Sep 2020 10:45 AM PDT

    Will this mean consoles will have an easier time porting to PC? Will this affect emulation progress? Conversely, would a PC developer have an easier time publishing on consoles?

    submitted by /u/TK3600
    [link] [comments]

    Do you think GPU-based storage tech like RTX IO will tangibly lower CPU requirements in demanding games?

    Posted: 09 Sep 2020 09:48 AM PDT

    This is something I've been wondering about; of course it's hard to tell until it's actually out, but I'm interested in discussing it. Admittedly, I'm unsure of how much I/O actually occupies the CPU in-game. I also suspect we're going to see something parallel to RTX IO on Big Navi, seeing how the PS5 will have similar functionality on AMD hardware.

    My interest stems from running an SFF build with an i5 9400F; while it still shines in single-threaded applications, its performance in CPU-hungry games is starting to show its age. I'm planning on upgrading to the RTX 3070, and I'm optimistic it should take enough load off the CPU to give me at least a couple more years of acceptable performance.

    From a system builder's standpoint, I think it would be great if this makes a substantial difference in CPU requirements. Going back 5+ years, when multi-thread-optimized games were more of a novelty, the performance difference between the Haswell i3 and i7 was negligible even in CPU-hungry games, to such an extent that I had a friend who ran a Pentium alongside a GTX 970 with no trouble. It would be a boon to the budget builder to have requirements look like that again.

    submitted by /u/shobgoblin
    [link] [comments]
