• Breaking News

    Thursday, November 12, 2020

    Hardware support: MacBook Air M1 Geekbench


    MacBook Air M1 Geekbench

    Posted: 11 Nov 2020 04:46 PM PST

    Cinebench R23 released. Now with support for Apple's M1 chip

    Posted: 11 Nov 2020 07:58 PM PST

    Geekbench 4 benchmark results call Geekbench 5 results on Apple CPUs into question, as Geekbench 4 includes more intensive math and memory benchmarks than Geekbench 5

    Posted: 12 Nov 2020 12:41 AM PST

    One example: A12X vs. 4800U and A12X vs. 1065G7 on Geekbench 4. The A12X is the last Apple chip tested on Geekbench 4, and since it has the same 15 W TDP as the 4800U and is on the same TSMC 7 nm process (with the 1065G7 on Intel 10 nm, which is generally considered comparable to TSMC 7 nm), it makes for a more accurate comparison. The scores look similar to Geekbench 5, except that the SGEMM and SFFT benchmarks are about 2 times faster on the x86 CPUs, and about 3 times faster on Zen 3, than on the A12X. Memory latency is a bit better as well; on Intel desktop CPUs the latency would be about 3 times lower, which would contribute to a significantly higher score.

    On the contrary, in Geekbench 5 (A12X vs. 4800U and A12X vs. 1065G7), the SGEMM and SFFT benchmarks, which account for the most significant differences between the A12X and the two x86 CPUs, were removed along with other benchmarks favorable to x86. Here is how SGEMM and SFFT were implemented: https://i.imgur.com/nxZp12g.png. These benchmarks utilize x86 AVX and ARM NEON instructions, which are important accelerators for modern applications.
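    For reference, SGEMM is the standard single-precision general matrix multiply (C = alpha*A*B + beta*C) and SFFT a single-precision FFT; both are dense kernels that map almost perfectly onto SIMD units, which is why AVX and NEON matter so much for them. Here is a minimal sketch of the SGEMM operation, with numpy standing in for the hand-vectorized AVX/NEON kernels (not Geekbench's actual implementation, and the problem size is invented):

    ```python
    import numpy as np

    def sgemm(alpha, a, b, beta, c):
        """Single-precision general matrix multiply: C = alpha*A*B + beta*C.

        Dense kernels like this vectorize almost perfectly (AVX on x86,
        NEON on ARM), which is why removing them changes the relative
        scores of CPUs with different vector hardware.
        """
        return alpha * (a @ b) + beta * c

    # Hypothetical problem size, for illustration only.
    n = 512
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    c = np.zeros((n, n), dtype=np.float32)
    c = sgemm(1.0, a, b, 0.0, c)
    print(c.shape)
    ```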

    On the Rigid Body Physics benchmark, the FPS is much lower on Geekbench 5 than on Geekbench 4. Running Geekbench 4, the A12X scored only 13487.7 FPS, while the 1065G7 scored 18822.3 FPS and the 4800U scored 18635.9 FPS. However, running Geekbench 5, the A12X scored 6901.5 FPS, very close to the 1065G7 (8003.2 FPS) and the 4800U (7667.4 FPS). This roughly halves the score difference in Geekbench 5. Both versions use the Box2D physics engine, so why are the FPS results so different?
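    One possible explanation (speculation on my part, not based on knowledge of Geekbench internals) is that the two versions simply simulate scenes of different sizes: in a physics benchmark, "FPS" is just simulation steps completed per second, so doubling the per-step work roughly halves the reported FPS on every CPU without any code regression. A toy sketch of that kind of measurement, with a dummy workload standing in for the Box2D world step:

    ```python
    import time

    def physics_step(num_bodies):
        """Stand-in for a Box2D world step; cost grows with scene size."""
        total = 0.0
        for i in range(num_bodies):
            total += (i * 0.5) ** 0.5  # dummy per-body work
        return total

    def measure_fps(num_bodies, duration=1.0):
        """Report simulation steps per second ('FPS') for a given scene size."""
        steps = 0
        start = time.perf_counter()
        while time.perf_counter() - start < duration:
            physics_step(num_bodies)
            steps += 1
        return steps / (time.perf_counter() - start)

    # A heavier scene yields proportionally lower "FPS" on the same CPU.
    print(measure_fps(num_bodies=1000))
    print(measure_fps(num_bodies=2000))
    ```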

    Perhaps the Geekbench authors felt that heavy math workloads and the like are not suitable for small mobile devices and excluded them. However, as Apple pushes its CPUs into the laptop and desktop market, is it time to include heavier benchmarks that reflect typical desktop workloads?

    Other benchmarks:

    - The CB R23 score of the A12X in the Mac Mini devkit: https://i.imgur.com/CdPYv0a.png

    - The CB R23 scores of Zen 2 and Zen 3 CPUs: https://i.imgur.com/V56NPtx.png (Source: ComputerBase.de).

    submitted by /u/tuhdo
    [link] [comments]

    Intel's Graphics Driver Now Sharing ~60% Codebase Between Windows/Linux, 90~100% The Performance

    Posted: 11 Nov 2020 04:15 PM PST

    Gamers Nexus' Research Transparency Issues

    Posted: 11 Nov 2020 05:35 AM PST

    Before starting this essay, I want to ask for patience and open-mindedness about what I'm going to say. There's a lot of tribalism on the Internet, and my goal is not to start a fight or indict anyone.

    At the same time, please take this all with a grain of salt - this is all my opinion, and I'm not here to convince you what's wrong or right. My hope is to encourage discussion and critical thinking in the hardware enthusiast space.

    ------

    With that out of the way, the reason I'm writing this post is that, as a professional researcher, I've noticed that Gamers Nexus videos tend to have detailed coverage in my research areas that is either inaccurate, missing key details, or overstating confidence levels. Most frequently, there's discussion of complex behavior that's pretty close to active R&D, but it's discussed like a "solved" problem with a specific, simple answer.

    The issue there is that a lot of these things don't have widespread knowledge about how they work because the underlying behavior is complicated and the technology is rapidly evolving, so our understanding of them isn't really... nailed down.

    It's not that I think Gamers Nexus shouldn't cover these topics, or shouldn't offer their commentary on the situation. My concern is delivering interpretations with too much certainty. There are a *lot* of issues in the PC hardware space that get very complex, and there are no straightforward answers.

    At least in my areas of expertise, I don't think their research team is doing due diligence in figuring out what the state of the art is, and they need to do more work on expressing how knowledgeable they are about the subject. Often, I worry they are trying to answer questions that are unanswerable with their chosen testing and research methodology.

    ------

    Since this is a pretty nuanced argument, here are some examples of what I'm talking about. Note that this is not an exhaustive list, just a few examples.

    Also, I'm not arguing that my take is unambiguously correct and GN's work is wrong. Just that the level of confidence is not treated as seriously as it should be, and there are sometimes known limitations or conflicting interpretations that never get brought up.

    1. Schlieren Imaging: https://www.youtube.com/watch?v=VVaGRtX80gI - GN did a video using Schlieren imaging to visualize airflow, but that test setup images pressure gradients. In the situation they're showing, the raw video is difficult to directly interpret, and that makes the data they're showing a poor fit for the format. There are analysis tools you can use to transform the data into a clearer representation, but the raw info leads to conclusions that are vague and hard to support. For comparison, Major Hardware has a "Fan Showdown" series using simpler smoke testing, which directly visualizes mass flow. The videos have a clearer demonstration of airflow, and conclusions are more accessible and concrete.
    2. Big-Data Hardware Surveys: https://www.youtube.com/watch?v=uZiAbPH5ChE - In this tech news round-up, there's an offhand comment about how a hardware benchmarking site has inaccurate data because they just survey user systems, and don't control the hardware being tested. That type of "big data" approach specifically works by accepting errors, then collecting a large amount of data and using meta-analysis to separate out a "signal" from background "noise." This is a fairly fundamental approach in both hard and soft scientific fields, including experimental particle physics. That's not to say review sites do this or are good at it, just that their approach could give high-quality results without direct controls (see the sampling sketch after this list).
    3. FPS and Frame Time: https://www.youtube.com/watch?v=W3ehmETMOmw - This video discusses FPS as an average in order to contrast it with frame time plots. The actual approach used for FPS metrics is to treat the value as a time-independent probability distribution, and then report a percentile within that distribution. The averaging behavior they are talking about depends on decisions you make when reporting data, and is not inherent to the concept of FPS. Contrasting FPS with frame time is odd, because the differences come down to reporting methodology. If you make different reporting decisions, you can derive metrics from FPS measurements that fit the general idea of "smooth" gameplay. One quick example is the amount of time between FPS dips (see the FPS sketch after this list).
    4. Error Bars - This concern doesn't have a video attached to it, and is more general. GN frequently reports questionable error bars and remarks on test significance with insufficient data. Due to silicon lottery, some chips will perform better than others, and there is guaranteed population sampling error. With only a single chip, reporting error bars on performance numbers and suggesting there's a finite performance difference is a flawed statistical approach. That's because the data is sampled from specific pieces of hardware, but the goal is to show the relative performance of whole populations (see the error-bar sketch after this list).
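    To make example 2 concrete, here is a toy sampling sketch (all numbers invented) of how aggregating many error-prone submissions can still produce a precise estimate: the standard error of the mean shrinks with the square root of the sample size, even when no individual result is trustworthy.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical: a GPU's "true" score, measured on 20,000 uncontrolled
    # user systems, each adding its own large random error.
    true_score = 100.0
    reported = true_score + rng.normal(0.0, 25.0, size=20_000)

    # No single submission is trustworthy, but the aggregate can be:
    print("one user's result:", reported[0])       # may be far off
    print("aggregate estimate:", reported.mean())  # close to 100
    print("std. error of mean:", reported.std() / np.sqrt(len(reported)))
    ```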
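    For example 3, here is an FPS sketch (hypothetical frame-time data, not taken from any review) showing FPS treated as a distribution with percentile reporting, plus one derived "smoothness" metric, the time between FPS dips:

    ```python
    import numpy as np

    # Hypothetical frame times (ms) from a single benchmark run.
    frame_times_ms = np.array([16.6, 16.8, 17.0, 16.5, 40.2, 16.7, 16.9,
                               33.1, 16.6, 16.8, 16.7, 17.1, 16.5, 16.9])

    # Instantaneous FPS per frame, treated as samples from a
    # time-independent distribution rather than a per-second average.
    fps = 1000.0 / frame_times_ms

    # Percentile reporting: "1% low" style numbers are just low
    # percentiles of this distribution.
    print("median FPS:", np.percentile(fps, 50))
    print("1st-percentile FPS:", np.percentile(fps, 1))

    # A derived smoothness metric: elapsed time between dips below 30 FPS.
    dip_idx = np.where(fps < 30.0)[0]
    elapsed_ms = np.cumsum(frame_times_ms)
    print("time between FPS dips (ms):", np.diff(elapsed_ms[dip_idx]))
    ```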
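    And for example 4, an error-bar sketch (simulated populations with invented variances) of why tight run-to-run error bars on one chip per model say little about the product populations:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical: two CPU models with identical average performance but
    # a few percent of unit-to-unit "silicon lottery" variation.
    pop_a = rng.normal(100.0, 2.0, size=10_000)
    pop_b = rng.normal(100.0, 2.0, size=10_000)

    # A reviewer tests one chip of each; run-to-run noise is small, so
    # the per-chip error bars look tiny...
    chip_a, chip_b = pop_a[0], pop_b[0]
    runs_a = chip_a + rng.normal(0.0, 0.3, size=30)
    runs_b = chip_b + rng.normal(0.0, 0.3, size=30)
    print(f"chip A: {runs_a.mean():.1f} +/- {runs_a.std():.1f}")
    print(f"chip B: {runs_b.mean():.1f} +/- {runs_b.std():.1f}")

    # ...yet the gap between the two chips mostly reflects which units
    # were sampled, not a real difference between the populations.
    print(f"population gap: {pop_a.mean() - pop_b.mean():.2f}")
    ```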

    ------

    With those examples, I'll bring my mini-essay to a close. For anyone who got to the end of this, thank you again for your time and patience.

    If you're wondering why I'm bringing this up for Gamers Nexus in particular... well... I'll point to the commentary about error bars. Some of the information they are trying to convey could be considered misinformation, and it potentially gives viewers a false sense of confidence in their results. I'd argue that's a worse situation than the reviewers who present lower-quality data but make the limitations more apparent.

    Again, this is just me bringing up a concern I have with Gamers Nexus' approach to research and publication. They do a lot of high-quality testing, and I'm a fairly avid viewer. It's just... I feel that there are some instances where their coverage misleads viewers, to the detriment of all involved. I think the quality and usefulness of their work could be dramatically improved by working harder to find uncertainty in their information, and to communicate their uncertainty to viewers.

    Feel free to leave a comment, especially if you disagree. Unless this blows up, I'll do my best to engage with as many people as possible.

    ------

    P.S. - This is a re-work of a post I made yesterday on /r/pcmasterrace, since someone suggested I should put it on a more technical subreddit. Sorry if you've seen it in both places.

    Edit (11/11@9pm): Re-worded examples to clarify the specific concerns about the information presented, and some very reasonable confusion about what I meant. Older comments may be about the previous wording, which was probably condensed too much.

    submitted by /u/IPlayAnIslandAndPass
    [link] [comments]

    Intel Icelake Server Die Size & Floorplan Inefficiencies Revealed

    Posted: 12 Nov 2020 01:24 AM PST

    AMD Radeon RX 6800 and RX 6800 XT GPU OpenCL Performance Leaks

    Posted: 11 Nov 2020 01:07 PM PST

    Bluepoint Games explains PS5 SSD and hardware decompression advantages

    Posted: 12 Nov 2020 12:54 AM PST

    PLATYPUS: Software-based Power Side-Channel Attacks on x86

    Posted: 11 Nov 2020 12:26 PM PST

    Phison E18 real-world benchmarks

    Posted: 12 Nov 2020 03:25 AM PST

    [Gamers Nexus] Fractal Meshify 2 Case Review

    Posted: 11 Nov 2020 06:20 AM PST

    [Digital Foundry] Devil May Cry 5 SE: PS5 vs Xbox Series X - The First Next-Gen Performance Face-Off

    Posted: 11 Nov 2020 07:00 AM PST

    Rated support for DDR memory is always lower than the actual best-in-slot, without any "apparent" impact on reliability or longevity... why? Are Intel/AMD shooting themselves in the foot? Can anyone shed some light on this?

    Posted: 12 Nov 2020 12:31 AM PST

    For context, I'm talking about out-of-the-box situations with no overclocking, so factory defaults only. Using the new Ryzen CPUs as an example, AMD's site mentions official support of "up to 3200 MHz".

    This is despite their Infinity Fabric clock (fclk) defaulting to 1800 MHz, which aligns better with a 3600 MHz kit, with no overclocking needed, since the recommendation is a 1:1:1 ratio.
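    For the arithmetic behind that alignment: DDR transfers data on both clock edges, so a "DDR4-3600" kit runs an 1800 MHz memory clock, and a 1:1:1 configuration means fclk = uclk = mclk. A quick worked sketch (clock names as commonly reported by enthusiasts, not an official AMD formula):

    ```python
    def one_to_one_fclk(ddr_rating_mts):
        """DDR transfers twice per clock, so the memory clock (mclk) is
        half the rated transfer rate; 1:1:1 means fclk = uclk = mclk."""
        mclk_mhz = ddr_rating_mts / 2
        return {"mclk_mhz": mclk_mhz, "uclk_mhz": mclk_mhz, "fclk_mhz": mclk_mhz}

    print(one_to_one_fclk(3200))  # 1600 MHz fclk, below the 1800 default
    print(one_to_one_fclk(3600))  # matches the default 1800 MHz fclk
    print(one_to_one_fclk(4000))  # needs the new 2000 MHz fclk support
    ```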

    Their new blog post even mentions adding support for up to a 2000 MHz fclk for 4000 MHz memory kits.

    If AMD wants their CPUs to perform at their best, why not mention support "up to XXX" but also mention a "recommended DDR4-3600", since this would provide the 1:1:1 ratio required for the best out-of-the-box performance, without overclocking? From all reports, this seems perfectly stable and reliable, with no impact on longevity.

    For overclockers, of course, all numbers are guidelines, and they can find their own sweet spot based on their actual chip.

    But for average, out-of-the-box users, why not provide recommendations that point to the 1:1:1 ratio directly? The CPU ships with an 1800 MHz fclk, and DDR4-3600 works reliably and without a glitch... so why shoot yourself in the foot and mention support "up to 3200 MHz" only?

    P.S.: Intel seems to be doing the same thing, so I'm not trying to start a partisan debate. I'm just genuinely curious why the manufacturers are being so conservative with their DDR "support" claims when it's clear their chips can handle much faster memory.

    submitted by /u/Sykoon_Reader
    [link] [comments]

    XPU and Software Update - Intel oneAPI, Data Center Software and Intel Server GPU

    Posted: 11 Nov 2020 10:58 AM PST

    Intel Launches Xe-LP Server GPU: First Product Is H3C’s Quad GPU XG310 For Cloud Gaming

    Posted: 11 Nov 2020 06:39 AM PST

    (HWBuster) El Cheapo Power Supplies Part #2 - be quiet! - Rasurbo - Chieftec

    Posted: 11 Nov 2020 09:10 AM PST

    The M1 chip is going to be Apple's greatest revolution since 2005!

    Posted: 11 Nov 2020 11:42 PM PST
