• Breaking News

    Thursday, January 16, 2020

    Hardware support: x86 CPU Design House Centaur Technology will host an AMA next week, Thursday Jan 23, 2020 @ 12:00pm-3:00pm CST

    x86 CPU Design House Centaur Technology will host an AMA next week, Thursday Jan 23, 2020 @ 12:00pm-3:00pm CST

    Posted: 15 Jan 2020 02:23 AM PST

    Hi from Austin TX!

    Come ask us about how we design x86 CPUs and our new AI coprocessor. Our engineers will be answering your questions.

    Learn more about the industry's first high-performance x86 SoC with server-class CPUs and integrated AI coprocessor technology.

    Technical details disclosed in Linley Group's Microprocessor Report

    We even have a documentary: Rise of the Centaur

    submitted by /u/CentaurHauls95

    Intel's Mitigation For CVE-2019-14615 Graphics Vulnerability Obliterates Gen7 iGPU Performance

    Posted: 15 Jan 2020 07:11 PM PST

    (Anandtech) CES 2020: Clevo & XMG Prepare Notebooks w/ 12-Core AMD Ryzen 9 3000 CPUs

    Posted: 15 Jan 2020 05:58 PM PST

    Vulkan 1.2 Specification Released: Refining For Efficiency & Development Simplicity

    Posted: 15 Jan 2020 07:07 AM PST

    Exclusive: Washington pressures TSMC to make chips in US - Pentagon fears Chinese interference in Taiwan's semiconductor giant

    Posted: 15 Jan 2020 07:53 AM PST

    List of major annual computer hardware-related events in 2020

    Posted: 15 Jan 2020 05:46 PM PST

    Upcoming computer hardware (CPUs, GPUs, monitors, etc.) is often announced or showcased around certain annual events; a list of them is below.

    2020

    January 7-10; CES (Consumer Electronics Show)

    March 16-20; GDC (Game Developers Conference)

    March 22-26; GTC (GPU Technology Conference)

    June 2-6; COMPUTEX (Taipei International Information Technology Show)

    June 9-11; E3 (Electronic Entertainment Expo)

    July 19-23; SIGGRAPH (Special Interest Group on Computer Graphics and Interactive Techniques)

    August; Hot Chips

    August 25-29; gamescom

    September 4-9; IFA

    brand specific

    June; WWDC (Apple Worldwide Developers Conference)

    -----

    major price discounts possible

    November 27; Black Friday

    November 30; Cyber Monday

    submitted by /u/gort11112

    Intel Discontinues Some Cascade Lake Xeon Models, Slashes Pricing on Others

    Posted: 15 Jan 2020 10:37 PM PST

    NVIDIA - Frames Win Games - Community Questions Answered

    Posted: 15 Jan 2020 01:42 PM PST

    Hi, everyone!

    The NVIDIA GeForce team has recently been doing a lot of work exploring the benefits of gaming with higher refresh rates. From looking into the science of gaming on higher hertz monitors, to esports benefits, and even working with Linus, we really wanted to get down to why more frames matter.

    We recently held a live Q&A on the r/NVIDIA subreddit with NVIDIA's GeForce Esports Product Manager, Seth Schneider (/u/coldfire37). Users had the opportunity to ask him anything regarding framerate, refresh rate, monitors, latency, etc., and we thought we'd share some of the answers to those questions.

    Below you'll find a concise and summarized edit of the questions we received and answers Seth provided. If you'd like to view the full Q&A, please visit the original thread.

    Q: "How does NVIDIA's newly released max framerate limiter compare to other in-game framerate limiters (RTSS) in button-to-pixel latency?"

    Seth: Given consistent frame rates, good in-game frame rate limiters (FRLs) naturally have more control of the entire pipeline and thus can produce lower latency than our max framerate limiter or RTSS. This is simply because the GPU is later in the pipeline, and there is only so much control we can have over how the game processes its simulation.

    That said, using an FRL to get lower latency can be a tricky business! Game FPS is not usually consistent. If the uncapped FPS would naturally be much higher than the frame rate limit, capping will produce higher latency, because the capped frame time is longer than what the GPU could deliver. Conversely, if the game becomes GPU-bound by dipping too far below the cap, it will also result in higher latency. This is one of the reasons why we created the Ultra Low Latency mode in the driver. It handles the dynamic nature of games pretty well and provides lower latency in most situations.
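To make the tradeoff concrete, here is a toy latency model. This is my own illustrative sketch with made-up numbers and a fixed queue-depth assumption, not NVIDIA's actual pipeline: running GPU-bound lets earlier stages queue frames of work, while a cap below the GPU's limit keeps the queue empty but stretches each frame to the capped frame time.

```python
from typing import Optional

def frame_time_ms(fps: float) -> float:
    """Time to deliver one frame at a given FPS."""
    return 1000.0 / fps

def latency_ms(uncapped_fps: float, cap_fps: Optional[float],
               queue_depth: int = 3) -> float:
    """Rough button-to-pixel latency: when GPU-bound (no cap, or a cap the
    GPU can't sustain), earlier stages run ahead and queue_depth frames of
    work pile up; a cap below the GPU limit keeps the queue empty but
    stretches each frame to the capped frame time."""
    if cap_fps is None or cap_fps >= uncapped_fps:
        return queue_depth * frame_time_ms(uncapped_fps)  # GPU-bound: queued frames
    return frame_time_ms(cap_fps)  # limiter-bound: no queue, longer frame time

print(latency_ms(200, None))  # GPU-bound at 200 FPS: 3 queued 5 ms frames -> 15.0
print(latency_ms(200, 180))   # cap just under the GPU limit: one ~5.6 ms frame
print(latency_ms(500, 100))   # cap far below a 500 FPS game: 10.0 ms, worse
                              # than the 6.0 ms this game gets uncapped
```

In this model the sweet spot is a cap just under what the GPU can sustain, which matches both halves of the answer above.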

    Q: "Is there a big difference between 144Hz and 180Hz while gaming? I worry that overclocking to 180Hz will shorten the monitor's lifespan."

    Seth: 144Hz is the new baseline for competitive gaming. We have shown benefits going from 144Hz all the way up to 360Hz. Regarding overclocking, we suggest reaching out to your display manufacturer for details.

    Q: "In your opinion, can we expect similar refresh rate capabilities from all major monitor technologies: IPS, TN, LCD, etc?"

    Seth: LCD display techs such as IPS, TN, and VA all can be driven at high Hz. There are 240Hz G-SYNC displays on the market today that use each technology. Where you see the difference however is in pixel response times. Typically TN panels have faster pixel response times than IPS or VA, but trade off some color contrast and image quality.

    Q: "1. I want a tear-free experience. On my 144Hz monitor I have set G-Sync and V-Sync on through NVCP, but, additionally, I have set a frame cap under 144FPS that helps to reduce input lag (info from blurbusters.com). Why is this so and will there be any solution?

    2. I have the ACER XB270H (TN panel). At 144FPS at 144Hz, if I move my mouse fast enough in Battlefield 4, I notice that the picture loses its sharpness. I recorded the gameplay with a camera in slow motion and noticed the objects on the screen ghosting. It looks like motion blur, but I do not have that option turned on.

    3. Does it make sense to wait for the next display technology instead of currently buying a 144+Hz monitor with a TN/IPS/VA panel? At the moment there are high-Hz monitors, but pixel response times aren't very fast. Wouldn't it make more sense to wait for OLED monitors to become more ubiquitous?"

    Seth: Let's break improvement into two parts: perception and actual gamer skill improvement.

    Starting with perception, let's imagine a moving scene. The vast majority of people can perceive the difference between 60, 144, 240, and even 360Hz. At CES, we just showed a demo in concert with BlurBusters where people could see the difference between 240Hz and 360Hz. If you want to check out the demo, here is the link: https://www.testufo.com/framerates-versus

    As far as improvements to gamer skill, we have done research experiments (blind tests, control groups, etc.) that do show a benefit to gamer skill when Hz and FPS increase in certain aiming tasks. We published a paper at SIGGRAPH Asia last year that shows some of these benefits. Here is the study: https://research.nvidia.com/publication/2019-11_Latency-of-30

    We also saw these benefits in the Linus video: https://www.youtube.com/watch?v=OX31kZbAXsA

    In short, based on the research we have conducted so far, there is real improvement.

    Q: "I'd like to see the importance of high-Hz gaming expanded to also cover single-player experiences. Even in games like The Witcher 3 or Metro: Exodus, I prefer a higher (100-120fps) framerate when possible.
    Also, how important is RAM speed for maximizing framerate? Some say you should tweak the sub-timings as well. Is this actually useful?"

    Seth: High Hz and Ultra Low Latency mode benefit single-player games as well! We're always trying to optimize performance while also providing the most stunning visuals, like RTX, and allowing the user to tailor the experience to their personal preference.

    I'm more of an expert on GPUs and Displays, but RAM speed tends to not have a significant impact on framerate in most scenarios. Although, faster memory never hz anyone. :)

    Q: "I know using a frame limiter like the one added in 441.87 helps to keep framerate consistent. How does NVIDIA's framerate limiter differ from utilizing G-Sync and NVCP's V-Sync while using the Ultra Low Latency mode?"

    Seth: A frame rate limiter only sets an upper bound on FPS, so if the game's workload increases such that it drops below that target, frames will start to queue up and increase latency. Ultra Low Latency mode dynamically adapts to these situations to minimize that queuing.

    Q: "At what point with FPS does one start to see diminishing returns? Is there a point when the amount of frames per second becomes redundant or is there always room for MORE FPS?"

    Seth: Great question. Currently we have not found an upper limit, but we have linked gamer skill with system latency, which will hit diminishing returns at some point. As of today, we are still seeing improvements even up at the 360FPS/360Hz range. Our expert research team is very interested in this question though! We will surely publish a formal paper once we determine where this limit is. Likely, this threshold will be different for each person.

    TLDR: There are always moar frames. :)

    Q: "What do you think is the next, big thing in display technology? After Hz, resolution, OLED, G-SYNC/Adaptive Sync - what's next?"

    Seth: If we take a trip down memory lane back to pre-2013, display technology was pretty stagnant: basically all 1080p 60Hz displays. Since G-SYNC arrived on the scene, we have been pushing the display industry to create higher resolutions (77-inch BFGD LG TVs), higher Hz (like our 360Hz tech we just announced at CES), panel and backlight technologies such as OLED and mini-LED that provide incredible contrast, and VRR/overdrive technologies to improve image quality and reduce ghosting. In the future, we will keep improving on these vectors by providing these breakthrough technologies to gamers.

    Check out our slow motion video that visualizes different FPS/Hz values to get an idea of how display technology has progressed over the years: https://www.youtube.com/watch?v=uJxxCgKa0mU

    As a competitive gamer that used to play on a 60Hz display, these technology breakthroughs have certainly changed the way I compete.

    Q: "Why does maxing out your GPU SIGNIFICANTLY increase input lag? Battlenonsense made a video about it. I think a lot of us would like to know more about this."

    Seth: The GPU is late in the rendering pipeline so if it becomes the bottleneck, everything that comes before it has the option to queue up work and 'run ahead'. This is good for maximum throughput but adds to latency. That's why adding frame rate limits (either static or dynamic) earlier in the pipeline will minimize these queues and reduce latency. If your GPU is not maxed out that means the bottleneck is likely much earlier in the application and once the game is ready to render it will run through the rest of the pipeline without waiting in any queues.
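The run-ahead effect can be reproduced with a toy producer/consumer simulation. This is my own sketch with illustrative numbers and an assumed fixed queue depth, not a model of any real driver: the CPU submits frames at one rate, the GPU drains them at another, and frames wait whenever the GPU is the bottleneck.

```python
def render_queue_latency(cpu_fps: float, gpu_fps: float,
                         frames: int = 1000, max_queue: int = 3) -> float:
    """Average time (ms) a frame waits for the GPU in a toy pipeline: the CPU
    submits frames at cpu_fps, the GPU renders them at gpu_fps, and at most
    max_queue frames can be buffered ahead of the GPU."""
    submit_dt = 1000.0 / cpu_fps   # ms between CPU submissions
    render_dt = 1000.0 / gpu_fps   # ms the GPU needs per frame
    gpu_free = 0.0                 # when the GPU finishes its current frame
    t = 0.0                        # when the next frame is submitted
    waits = []
    for _ in range(frames):
        start = max(t, gpu_free)   # the frame waits if the GPU is still busy
        waits.append(start - t)
        gpu_free = start + render_dt
        # Back-pressure: once the queue is full, the CPU cannot run further ahead.
        t = max(t + submit_dt, gpu_free - max_queue * render_dt)
    return sum(waits) / frames

# GPU maxed out (CPU could do 300 FPS, GPU only 100): frames pile up and wait.
print(render_queue_latency(300, 100))
# GPU has headroom (CPU 100 FPS, GPU 300): frames render immediately.
print(render_queue_latency(100, 300))
```

With the GPU as the bottleneck, the per-frame wait converges to the full queue depth times the GPU frame time, which is exactly the latency a frame rate limiter earlier in the pipeline avoids.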

    Q: "What made you do this project?"

    Seth: I grew up as a competitive gamer. It's actually what got me into computer engineering and computer science. NVIDIA is full of gamers who want to make products that actually make a difference for the thing they love - gaming.

    For me, I have over 2k hours in CS:GO, so researching how tech can improve gamer skill is extremely interesting to me. It's been fun hearing from the community on our work with Linus, Shroud, and n0thing while we explore this topic. This is just the beginning of our work with frameswingames and I'm looking forward to more awesome breakthroughs like 360Hz.

    submitted by /u/SurooshX

    Intel retook semiconductor top spot from Samsung in 2019

    Posted: 15 Jan 2020 12:29 PM PST

    [Buildzoid/Actually Hardcore Overclocking] MSI Creator TRX40 transient response and LLC settings.

    Posted: 15 Jan 2020 10:36 AM PST

    A fine host for a Raspberry Pi: The Register rakes a talon over the NexDock 2

    Posted: 16 Jan 2020 01:09 AM PST

    Apple’s rack-mountable Mac Pro is now available

    Posted: 15 Jan 2020 04:16 AM PST

    When are we going to see rackmount chassis sold for the custom building crowd?

    Posted: 15 Jan 2020 05:56 PM PST

    You could have something like a 5U form factor with three 120mm fans blowing air through the case, but with a dedicated air intake area so that air is drawn through it and into fan intakes where you would bolt water-cooling radiators.

    It could be a removable sheet-metal panel that you mount radiators and fans to. The sheet metal would isolate airflow and give you a position to mount the radiators.

    That extra 1U would be the intake for dual 480mm radiators, so you could build two loops. Each loop would get fresh, cool air, versus bolting two radiators together sandwich-style, which would pass hot air from the first radiator into the second.

    submitted by /u/The_toast_of_Reddit

    USB flash drive cooling for a stable high speed

    Posted: 15 Jan 2020 10:35 AM PST

    Hey, I've noticed that most USB flash drives promoted with high read and write speeds actually slow down after a few seconds, which means the advertised speed only holds when you transfer small files. What makes them slow down? Is it the controller throttling the speed so the drive does not overheat, or is there another reason? Would the speed stay high if the drive were cooled? I was thinking of building a cooler/heatsink for a USB drive, but I wanted to hear the opinion of the experts in this subreddit first. Let me know what you think.
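One common cause besides heat, offered here as an assumption rather than something the post confirms, is a fast write cache (often pseudo-SLC) that absorbs the first few gigabytes, after which writes fall to the much slower native flash speed. A toy calculation with made-up numbers shows why the average speed collapses on large transfers:

```python
# Illustrative model of a drive with a fast write cache; all numbers are
# invented for the example, not measurements of any real drive.

def transfer_time_s(size_gb: float, cache_gb: float = 4.0,
                    cached_mb_s: float = 400.0, native_mb_s: float = 60.0) -> float:
    """Seconds to write size_gb: fast while the cache lasts, slow afterwards."""
    fast_gb = min(size_gb, cache_gb)        # portion absorbed by the cache
    slow_gb = size_gb - fast_gb             # portion written at native speed
    return fast_gb * 1024 / cached_mb_s + slow_gb * 1024 / native_mb_s

def average_mb_s(size_gb: float) -> float:
    """Effective average throughput over the whole transfer."""
    return size_gb * 1024 / transfer_time_s(size_gb)

print(average_mb_s(1))    # small file: the full "advertised" 400 MB/s
print(average_mb_s(32))   # large transfer: the slow phase dominates
```

If this is the mechanism on a given drive, cooling would not restore the burst speed, since the slowdown is the cache filling up rather than thermal throttling; only a benchmark that separates the two (speed vs. temperature over time) would tell them apart.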

    submitted by /u/gLr007Bm

    Everything I Know About SSDs 2019

    Posted: 15 Jan 2020 05:43 AM PST

    Intel: Tiger Lake to bring "double-digit performance gains" over Ice Lake

    Posted: 15 Jan 2020 11:41 AM PST

    Logitech’s new split Ergo K860 keyboard expands its ergonomic accessory lineup

    Posted: 15 Jan 2020 06:18 AM PST

    How are power-saving techniques implemented in CPUs, SSDs, and HDDs?

    Posted: 15 Jan 2020 07:07 AM PST

    Hello there. I've never thought about the topic, and I actually have no idea how this is implemented at a low level.

    1. How does the CPU save power? If I read `/proc/cpuinfo` from time to time, it reports different rates, say 250MHz, 1500MHz, 4700MHz. Suppose the CPU clock rate is currently 250MHz: what happens when it decides to increase the power consumption and the clock rate? How does it know when to do that? And how does it know when to lower the consumption (and thus the speed)? Say I am about to perform an assembly operation, `mov ax, 0`. What (if anything) makes the CPU increase the rate?
    2. I have a SATA HDD which, when not used for a while, needs a few seconds before it can perform any operation. How does this work? Does it just have an idle timer and spin down when it expires? Is it the same as in the days of CD drives, when any read first required the disc to spin up before it responded? The mechanics should be the same, right?
    3. How do SSD power-saving techniques differ from the HDD's?
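    On the first question: in general Linux behavior, a kernel cpufreq governor (e.g. "ondemand" or "schedutil") periodically samples CPU utilization and picks a clock, so no single instruction like `mov ax, 0` raises the clock directly; sustained load does, by pushing the sampled utilization up. A minimal sketch of reading the clocks the way the post does, by parsing the `cpu MHz` lines of `/proc/cpuinfo` (the sample text below is illustrative; on a real Linux system you would read the actual file):

```python
# Sample /proc/cpuinfo excerpt; a real system would use
# open("/proc/cpuinfo").read() instead.
SAMPLE = """\
processor\t: 0
cpu MHz\t\t: 800.024
processor\t: 1
cpu MHz\t\t: 4200.101
"""

def core_clocks_mhz(cpuinfo_text: str):
    """Return the current clock of each core listed in /proc/cpuinfo text."""
    return [float(line.split(":")[1]) for line in cpuinfo_text.splitlines()
            if line.startswith("cpu MHz")]

print(core_clocks_mhz(SAMPLE))  # [800.024, 4200.101]
```

    Polling this in a loop under varying load would show the governor stepping the clocks up and down, which is the behavior the question describes.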
    submitted by /u/vityafx
