• Breaking News

    Wednesday, June 16, 2021

    Hardware support: 700,000 GPUs Shipped to Miners in the First Quarter of 2021

    700,000 GPUs Shipped to Miners in the First Quarter of 2021

    Posted: 15 Jun 2021 12:59 PM PDT

    Starlink dishes go into “thermal shutdown” once they hit 122° Fahrenheit

    Posted: 15 Jun 2021 02:24 PM PDT

    [VideoCardz] AMD AM5 motherboards to arrive in Q2 2022, Raptor Lake Z790 series in Q3 2022

    Posted: 16 Jun 2021 12:31 AM PDT

    I’m trying to understand some things about the creation process of ICs in computer architecture

    Posted: 16 Jun 2021 12:40 AM PDT

    I have been programming for a while now and I started to get this urge that I need to understand how modern computers work. I've read a lot about how transistors work and have been building some small breadboard simulations with logic gates. I get the idea of larger-scale circuits with maybe dozens to hundreds of transistors, but my brain is exploding trying to understand the "billions of transistors" in microchips. So I have some questions I hope someone would be glad to answer and discuss.

    1. When the transistors and logic gates are made on the wafers, do they use some kind of map to order them in a specific way to create something specific, like an ALU? I know the theory of how to build an ALU on a larger scale, but I'm having trouble understanding how all these parts connect on the super tiny chip.

    2. Does a wafer use different designs depending on what type of part is going to be made, like RAM, ROM, a hard drive, or a processor? Or do these all use the same/similar parts of the wafer and are then programmed to behave properly?

    3. How is the hardware programmed to understand low-level instructions from the input? I mean, it's not hard to understand with a couple of physical buttons, but with a full keyboard, is that hard-printed somehow into the processor? Like when you press an 'A' in a terminal, does that send a specific binary input that's stored?

    4. Any good book recommendations on computer architecture that someone at a very beginner level in electronics can mostly understand?
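    Regarding question 3, a minimal sketch of the idea (assuming a US keyboard layout and ASCII/UTF-8 text encoding): the keyboard hardware sends a scancode for the key, the operating system maps it to a character code, and that code is what gets stored as a binary pattern.

    ```python
    # In ASCII/UTF-8, the character 'A' is encoded as the single byte 0x41
    # (decimal 65). The processor never "knows" about letters; it only ever
    # moves these numeric bit patterns around.
    code = ord('A')             # character -> integer code point
    bits = format(code, '08b')  # the 8-bit binary pattern actually stored

    print(code, hex(code), bits)  # 65 0x41 01000001
    ```

    The mapping from physical key to scancode to character is done in software (keyboard controller firmware plus the OS keyboard driver), not hard-printed into the processor itself.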

    All answers are welcome (short and summarized or longer), and of course pick a question if you don't want to answer them all :)

    Thanks for helping out!

    submitted by /u/cajmorgans

    Doom runs on IKEA smart light bulb

    Posted: 15 Jun 2021 06:10 AM PDT

    According to Igor, 2-tile Ponte Vecchio consumes up to 600W; is 4-tile DOA?

    Posted: 15 Jun 2021 10:06 PM PDT

    https://www.igorslab.de/en/everything-super-intels-ponte-vecchio-supercomputer-gpu-with-super-power-consumption-needs-a-super-cooler/

    Now I can save several k on a 10k A100 for this beast. Winter near the mountains doesn't seem like a pain in the ass anymore.

    All jokes aside, the last time I heard, only a 3-tile version had been demonstrated. What's the chance of Raja pulling one on us here?

    submitted by /u/Lurk_a_long_time

    Nvidia won’t explain the mysterious absence of its RTX 3070 Ti GPU

    Posted: 15 Jun 2021 05:20 AM PDT

    Crypto-mining’s half a billion dollar impact on AIB sales

    Posted: 15 Jun 2021 10:28 AM PDT

    From Russia with love - Warning about fake websites with Red BIOS Editor and MorePowerTool! | igor'sLAB

    Posted: 15 Jun 2021 11:19 PM PDT

    The Ultimate 12L PC Case? Dan C4-SFX Review

    Posted: 15 Jun 2021 07:24 AM PDT

    What is the point of AMD's upcoming 5700G compared to the 5700GE?

    Posted: 15 Jun 2021 07:19 AM PDT

    AMD has announced the 5700G and 5700GE processors. As best I can tell, they appear to have identical specs (cores, threads, boost clock, cache, GPU cores, memory support, etc.), apart from the GE having a lower base clock. This lets it draw less power under sustained load, but it looks like it'll have the same performance when boosting, plus they're both unlocked processors anyway.

    Is there some situation where reducing power consumption by having the lower base clock is disadvantageous? Or do we need to wait for more info from AMD to see if there are some other differences between these CPUs?

    submitted by /u/kryptopeg

    Samsung’s first LPDDR5 uMCP enables flagship experiences on low-cost phones

    Posted: 15 Jun 2021 03:54 AM PDT

    Are Gaming Laptops Now Better Value Than Desktops? Radeon RX 6800M vs 6700 XT Benchmark

    Posted: 15 Jun 2021 05:38 AM PDT

    Nvidia/AMD/Microsoft - who will make the first true Apple M1 competitor?

    Posted: 15 Jun 2021 10:56 AM PDT

    ... in performance as well as performance per Watt.

    Nvidia already has Tegra X1 working in the Nintendo Switch, and has unveiled a high-performance ARM server CPU. With their ARM acquisition, it's not unlikely that they could jump into the CPU market.

    AMD is working on their own big.LITTLE CPU. They also have some history with ARM (see: the ARM-based Opteron A1100).

    Microsoft is working on ARM CPUs for their Azure servers. MS is new to chip design but can potentially achieve hardware-software vertical integration like Apple did.

    Intel has their own low-power x86 CPUs which could potentially compete with M1 in the future.

    Qualcomm... also exists I guess. Their laptop performance has been disappointing so far.

    submitted by /u/jumpy-town
