• Breaking News

    Sunday, July 19, 2020

    [LTT] Does Intel WANT people to hate them?? (RAM frequency restriction on non-Z490 motherboards)

    Posted: 18 Jul 2020 09:17 AM PDT

    Western Digital releases new 18TB, 20TB EAMR drives

    Posted: 18 Jul 2020 07:14 PM PDT

    Why don't GPUs and CPUs cool from both sides?

    Posted: 18 Jul 2020 11:08 PM PDT

    I noticed that the underside of my motherboard under where the CPU goes generates a lot of heat.
    Is there something stopping a CPU or GPU from being cooled from both sides?

    submitted by /u/caravellex

    Is Apple's strategy of adding numerous hardware accelerators a possible shift toward an alternate route to speed up processors, as opposed to simply speeding up the CPU (increased frequency, cores, IPC)?

    Posted: 18 Jul 2020 10:41 AM PDT

    What I find interesting about Apple's shift to "Apple Silicon" is that, aside from their presumed goal of increased raw CPU compute power, they will be putting a ton of hardware accelerators into the SoC. At least it seems to me to be fixed-function hardware. This is already seen in the T2 chip in current Macs, which contains an H.265 processor, SSD controller, image processor, ambient light sensor logic, audio controller, and more. This presumably allows Macs to perform these tasks with greater speed and power efficiency. Apple's new ARM processors will have even more hardware accelerators, as shown in this picture, which in theory should accelerate the tasks they deem important, faster and with less power consumption.

    Intel, AMD, and Nvidia GPUs already have hardware support for video encode and decode, though that support is admittedly narrower in scope. Nvidia has recently been adding more dedicated hardware, like "RT cores" and "Tensor cores"; the former started a paradigm shift in gaming via real-time ray tracing, not to mention faster render times in Blender and other ray-tracing-based 3D software. DLSS is another example.

    Is this something chip manufacturers might start doing more broadly? Instead of only increasing CPU speed, chips would be built with more and more dedicated fixed-function hardware that accelerates specific tasks, like an ASIC or FPGA. I understand this comes at the cost of lower flexibility, but if enough fixed-function hardware is built, could it cover most, or at least many, common use cases? A traditional CPU would still be necessary, of course. Obviously this is not in place of a CPU but in addition to it, to aid CPU manufacturers if/when they have trouble continually raising raw CPU compute power. It may not improve Geekbench scores, but it might improve real-world performance.
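    The model the question describes — try a fixed-function unit first, fall back to general-purpose CPU cores otherwise — can be sketched roughly as follows. The accelerator table and task names here are invented purely for illustration; they don't correspond to any real SoC's block list:

    ```python
    # Hypothetical sketch: route a task to a fixed-function accelerator when one
    # exists, otherwise run it on the general-purpose CPU. Names are invented.

    ACCELERATORS = {
        "h265_decode": "video decode block",   # NVDEC / Quick Sync style unit
        "matrix_multiply": "tensor unit",      # Tensor-core style unit
        "aes_encrypt": "crypto engine",
    }

    def run_task(task: str) -> str:
        """Return a description of where the task would execute."""
        if task in ACCELERATORS:
            # Fixed function: fast and power-efficient, but only for this task.
            return f"{task}: offloaded to {ACCELERATORS[task]}"
        # Flexible fallback: the CPU can run anything, just less efficiently.
        return f"{task}: executed on general-purpose CPU cores"

    print(run_task("h265_decode"))
    print(run_task("compile_code"))
    ```

    The tradeoff the post raises lives in that `if`: every entry added to the table buys speed and efficiency for one task at the cost of die area that does nothing for any other workload.
    
    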

    I have no real-world chip design experience, so maybe these are just my musings...

    submitted by /u/HW_HEVC_Decode

    Cherry-picked Samples & "Review Batches" - Elephant in the Room?

    Posted: 18 Jul 2020 04:53 AM PDT

    What inspired me to take some time to write this is the recent wave of awareness posts about bad review practices: how often the actual customer experience differs from reviews, and how customers are hurt by various techniques of review manipulation.

    These discussions are not new, and most often they focus on "buying review scores" or somehow "altering reviews to be positive" - practices that are easily detectable and bring a lot of backlash when made public, which keeps them somewhat tempered.

    However, every so often I spot comments in hardware-related circles (some appearing to come from insiders or people close to them) about a much more dangerous practice (IMO): "making reviews look good" through direct quality manipulation - skewing the quality of units known to face scrutiny toward more positive public feedback, while "mass market" quality can be kept lower and cheaper.

    From insider comments I've noticed over the years, these seem to fall into two broad categories:

    1) Those done by a brand's PR department. Basically, before samples are sent to reviewers, they undergo some additional form of selection, e.g. "pick CPUs that overclock well" or "pick monitors without bad backlight bleed". This is easy to execute, fairly foolproof (after all, they know exactly who each sample will go to), and hard to expose - basically the only way is for an insider to blow the whistle, which is rare and not very effective, since insiders are almost universally tied to the company by some interest.

    2) Those done by a factory. The basic idea is to make a special good batch with higher QC. Sometimes higher-quality, more expensive components are used (in the most extreme cases, even a separate "hardware revision"). The goal is either to provide an easy bin for review samples, or to reduce quality and manufacturing cost after the initial batches (once most reviews are already done) - and yes, a "new hardware revision" often does not mean "quality improved" but rather "now corners can be cut". This might be done at the actual behest of the brand, or by the factory itself (technically, trying to swindle the brand). A couple of years ago I saw at least one commenter ranting about how hard it was to find even a single factory in a certain "manufacturing powerhouse" country that wouldn't eventually let quality slip or try to swindle them somehow - it basically required continuous extra checks and frustrating hair-pulling over various "substitutions" in the production process that just kept popping up.

    But even badly affected commenters often just shrugged: "meh, it's always happened and always will - not much you can do about it". Most of the time such comment threads fizzled out quickly and were mostly ignored.

    Personally, I think the above practice is much more widespread than attempts to bribe reviewers, and in the end much more detrimental to the customer experience - simply because it is much safer for brands to pull off without repercussions. Often it isn't even viewed as something "bad", just "normal optimization" that "makes sense to do - it's stupid to send a bad CPU to a reviewer" (whereas bribing or threatening reviewers is something all brands explicitly ban, at least in their official internal policies).

    This might mean the only way to get a proper review score is to consider only reviews of samples obtained through normal retail channels (with the reviewer's identity kept anonymous). Alternatively, brands could give reviewers a voucher to buy a sample themselves instead of just shipping them one. Still, neither approach protects against "make the first batch good, then let quality slip".

    What do you think? Have you ever heard reports about it happening and in what form? Did it ever affect you?

    Especially interested to hear from people with industry ties/sources.

    submitted by /u/riding_the_flow

    SonicBOOM: The 3rd Generation Berkeley Out-of-Order Machine

    Posted: 18 Jul 2020 09:30 AM PDT

    (AHOC/Buildzoid) 5GHz RAM OC on basically all the 8GB Crucial Ballistix sticks I have from 3000 C15 to 4000 C18

    Posted: 18 Jul 2020 04:49 AM PDT

    My R9 295X2 used 2x8-pin power connectors and drew 500+W under load. How did it do this without violating power specifications?

    Posted: 18 Jul 2020 12:57 PM PDT

    The talk of Nvidia possibly introducing a new 12-pin power connector to deliver more power to GPUs has got me thinking...

    With 2x 8-pin connectors, the maximum power draw per specification for a GPU is 375W: 75W from the PCIe slot, plus 150W from each 8-pin connector.

    So how did AMD get away with pulling 200+W through each 8-pin power connector? And if the power specifications don't matter, why does Nvidia need a new connector for GPUs that may exceed the 375W spec?
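    For reference, the budget from the spec figures quoted above works out like this (the ~500W figure for the R9 295X2 is the poster's own number):

    ```python
    # PCIe power-delivery budget using the spec figures quoted in the question.
    PCIE_SLOT_W = 75     # power available from the PCIe x16 slot
    EIGHT_PIN_W = 150    # spec limit per 8-pin (6+2) PCIe power connector

    spec_limit = PCIE_SLOT_W + 2 * EIGHT_PIN_W   # card with 2x 8-pin inputs
    print(f"spec limit: {spec_limit} W")

    # The R9 295X2 reportedly drew 500+ W. Assuming the slot supplied its full
    # 75 W, each 8-pin connector carried roughly (500 - 75) / 2 = 212.5 W,
    # well over the 150 W spec figure.
    measured_draw = 500
    per_connector = (measured_draw - PCIE_SLOT_W) / 2
    print(f"per 8-pin connector: {per_connector} W")
    ```

    This is exactly the gap the question is about: the card worked because the connectors and cables physically tolerate far more current than the spec allocates, but a vendor can't count on that headroom, which is one motivation for a connector with a higher official rating.
    
    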

    submitted by /u/Last_Jedi

    [TechPowerUp] Alphacool Eiswolf 2 AIO GPU Cooler Review

    Posted: 18 Jul 2020 05:17 AM PDT

    [Hardware Unboxed] Core i5-7600K vs. Ryzen 5 1600, 2020 Revisit

    Posted: 18 Jul 2020 08:24 AM PDT

    [PCWorld] Why now is a bad time to buy a high-end graphics card

    Posted: 18 Jul 2020 12:35 PM PDT
