Hardware support: [VideoCardz] AMD Radeon RX 6800 overclocked to average 2.5 GHz
- [VideoCardz] AMD Radeon RX 6800 overclocked to average 2.5 GHz
- From Ryzen 5 1600X to 5600X - The UPGRADE Path for AMD Zen
- [Gamers Nexus + Dremel] Sony PlayStation 5 Dual-Sense Controller Tear-Down & Disassembly
- New X-NAND Tech Detailed: SLC Speed at QLC Capacity and Pricing
- Why are Ryzen 8 core processors more expensive per core than 6 or 12?
- Watch Dogs Legion - Xbox Series X/S Ray Tracing vs PC RTX - Features, Quality, Performance + More
- [TechSpot] Ryzen 5000 Memory Performance Guide
- [Louis Rossmann] Apple watching & logging EVERY APP YOU OPEN with new OS.
- Apple M1 GFXbench results
- Affinity Photo test (by dev): M1 vs 2019 27" iMac (6 core 3.7GHz + desktop 580X)
- Why is there an inverse relationship between architecture width and clock speed?
- bputil manpage - configuring Secure Boot settings on Apple Silicon Macs
- Amazon moving from nVidia to a custom (home grown?) chip
- [VideoCardz] AMD Radeon RX 6800 Basemark results leak out
- Apple M1 Geekbench5 OpenCL score
- SBC for RISC-V!
- Next board power-only GPU?
- (VideoCardz.com) ASRock planning custom Radeon RX 6900/6800 XT: Taichi, Phantom, Challenger and OC Formula series?
- EVGA launches GeForce RTX 3090 and RTX 3080 FTW3/XC3 Hybrid and Hydro Copper series
[VideoCardz] AMD Radeon RX 6800 overclocked to average 2.5 GHz Posted: 13 Nov 2020 02:07 PM PST
From Ryzen 5 1600X to 5600X - The UPGRADE Path for AMD Zen Posted: 13 Nov 2020 10:55 AM PST
[Gamers Nexus + Dremel] Sony PlayStation 5 Dual-Sense Controller Tear-Down & Disassembly Posted: 13 Nov 2020 11:49 PM PST
New X-NAND Tech Detailed: SLC Speed at QLC Capacity and Pricing Posted: 13 Nov 2020 07:21 PM PST
Why are Ryzen 8 core processors more expensive per core than 6 or 12? Posted: 13 Nov 2020 01:05 PM PST

I was just comparing prices per core on the Ryzen 3000 and 5000 series, and I found it very weird that the midrange processors are the most expensive, while traditionally the top-of-the-line parts should cost more per unit. For example, in Germany the 3600 costs 185€ (31€ per core), the 3600XT 200€ (33€/core), the 3700X 280€ (35€/core), and the 3900X goes for 400€ (33€/core). Why is the midrange model, which is normally the value king in every product line, the worst value here?

It's also weird that the lineup goes 6-8-12 and not 6-9-12 (probably for technical reasons) or 6-10-12. Given that most real-world workloads aren't massively parallelizable, once you get over the psychological hurdle of a core count that is not a power of 2, it seems to me as if AMD is leaving a massive amount of money on the table by incentivising customers to save money and go for 6 cores instead of 8. And I assume most people aren't like me and don't even care whether the number is a power of 2, or an even number, or whatever. Maybe they speculate that people will buy 12 instead, but I doubt the majority will go up rather than down. Given that higher core counts come from chips with fewer defective cores, price should scale superlinearly with core count.

Is this some crazy marketing move? Are they throwing darts to determine prices? What is going on here?

Edit: Thanks for all the great responses, I learned a lot!
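The per-core arithmetic in the post can be checked in a few lines. A minimal sketch using the German retail prices the poster quotes (Nov 2020, EUR); the model names and figures come from the post itself:

```python
# Price-per-core breakdown using the German retail prices quoted in the post.
lineup = {
    "Ryzen 5 3600":   (185, 6),   # price in EUR, core count
    "Ryzen 5 3600XT": (200, 6),
    "Ryzen 7 3700X":  (280, 8),
    "Ryzen 9 3900X":  (400, 12),
}

# EUR per core for each SKU
per_core = {name: price / cores for name, (price, cores) in lineup.items()}

for name, eur in sorted(per_core.items(), key=lambda kv: kv[1]):
    print(f"{name}: {eur:.2f} EUR/core")
```

Running this reproduces the poster's observation: the 8-core 3700X sits at the top of the EUR/core list, while the 6-core 3600 is the cheapest per core.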
Watch Dogs Legion - Xbox Series X/S Ray Tracing vs PC RTX - Features, Quality, Performance + More Posted: 13 Nov 2020 06:22 AM PST
[TechSpot] Ryzen 5000 Memory Performance Guide Posted: 13 Nov 2020 10:07 AM PST
[Louis Rossmann] Apple watching & logging EVERY APP YOU OPEN with new OS. Posted: 14 Nov 2020 12:41 AM PST
Apple M1 GFXbench results Posted: 13 Nov 2020 10:40 AM PST
Affinity Photo test (by dev): M1 vs 2019 27" iMac (6 core 3.7GHz + desktop 580X) Posted: 13 Nov 2020 04:57 AM PST

M1 vs iMac:

Extremely strong CPU performance, with big improvements over the iMac. Available in a $999 fanless laptop (insanity), a MacBook Pro, and a $599 Mac mini.

The M1 demolishes the iMac in the combined score thanks to unified memory, despite lower rasterization performance.

Conclusion: it's as if every known rule and principle no longer applies. A fanless $999 Air will have this power, even if it can't sustain it for long. Remember that no one complains that iPad Pros get uncomfortably hot, and their performance does not seem to drop much. The Air is bigger and can dissipate heat better. Silent. Jesus.

Bonus: on Apple's developer website, you can see Baldur's Gate 3 running natively on ARM (an M1-equipped device) at 1080p ultra settings from 06:48 onwards. Supported GPU features include ray tracing.

I just can't wait to see what they put in the "professional" 13" MBP (the 28W Intel part with 4 Thunderbolt ports) and the 16", not to mention desktops. But this is insane.

Think about it this way: gigantic upgrades. One way of putting it is that Apple took 10th-gen 8-core i9 laptop performance plus half (or more) of the 5600M GPU (a $4000 config), dropped a third of the weight, made it considerably smaller, doubled the battery life, made it completely silent, and is now selling all of that as the $999 MacBook Air. Just think about it for a bit. And this is ignoring what will probably be the most important part of the SoC (the NPU, a big array of accelerators for everything, etc.) beyond the CPU+GPU.
Why is there an inverse relationship between architecture width and clock speed? Posted: 13 Nov 2020 01:34 PM PST

I have heard a lot recently about narrower cores being able to clock higher than wider cores, but why is that? Are wider cores not able to clock as high, or do they just require more power than narrower cores, and are therefore not viable?
bputil manpage - configuring Secure Boot settings on Apple Silicon Macs Posted: 13 Nov 2020 03:10 PM PST
Amazon moving from nVidia to a custom (home grown?) chip Posted: 13 Nov 2020 11:44 PM PST
[VideoCardz] AMD Radeon RX 6800 Basemark results leak out Posted: 14 Nov 2020 01:41 AM PST
Apple M1 Geekbench5 OpenCL score Posted: 13 Nov 2020 04:22 PM PST
SBC for RISC-V! Posted: 13 Nov 2020 08:44 PM PST
Next board power-only GPU? Posted: 13 Nov 2020 02:54 PM PST

I'm wondering if anyone knows when the next GPU that only needs board power will be released? That is, a GPU that doesn't require any additional 6- or 8-pin power connectors. Currently, the GTX 1650 is the best one. Many thanks!
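The constraint behind the question is the PCIe slot's own power budget: the PCI Express CEM specification allows a full-size x16 card to draw up to 75 W from the slot, so any card above that needs an auxiliary connector. A small sketch filtering some contemporary cards by that limit; the board-power figures are approximate and listed only for illustration:

```python
# A PCIe x16 slot can supply up to ~75 W on its own (PCIe CEM spec),
# so cards at or under that budget need no 6/8-pin connector.
SLOT_POWER_W = 75

# Approximate board-power (TDP) figures, for illustration only.
gpus = {
    "GTX 1650":    75,
    "GTX 1050 Ti": 75,
    "GTX 1060":    120,  # needs a 6-pin connector
    "RX 570":      150,  # needs an 8-pin connector
}

slot_powered = [name for name, tdp in gpus.items() if tdp <= SLOT_POWER_W]
print("Board-power-only candidates:", slot_powered)
```

This matches the post: among these cards, the GTX 1650 is the fastest one that stays within the slot's 75 W budget.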
(VideoCardz.com) ASRock planning custom Radeon RX 6900/6800 XT: Taichi, Phantom, Challenger and OC Formula series? Posted: 13 Nov 2020 08:53 AM PST
EVGA launches GeForce RTX 3090 and RTX 3080 FTW3/XC3 Hybrid and Hydro Copper series Posted: 13 Nov 2020 12:17 PM PST
You are subscribed to email updates from /r/hardware: a technology subreddit for computer hardware news, reviews and discussion.