Hardware support
- PC Gamer: "Nvidia expects RTX 2060 12GB stock to 'ramp end of December through January' as retailers remain barren"
- [der8auer] AORUS Z690 Testing and Surprising FPS per Watt Results testing i9-12900K vs Ryzen 5950X
- Display Stream Compression/DSC - How does it work?
- [VideoCardz] NVIDIA GeForce RTX 3090 Ti to launch on January 27th, RTX 3070 Ti on January 11th
- [TPU] Intel Core i9-12900K Alder Lake Tested at Power Limits between 50 W and 241 W
- [HUB] The Best Value Intel Z690 Motherboard: VRM Thermal Benchmark
- [TPU] Intel Z690 Motherboard Costs Explained
- [LTT] RTX 2060 discussion, with shout-out to r/hardware
- VideoCardz: "NVIDIA GeForce RTX 3050 to be announced on January 4th, launches January 27th"
- Put down your pitchforks: Intel's 12th-gen CPUs aren't power hogs | PCWorld
- Copying framebuffer from discrete GPU to iGPU, is this something new to Alder Lake or has this always been possible?
- [LTT] Is M1 Max worth $400 extra?
- Disappointed with Samsung, Qualcomm could shift some Snapdragon 8 Gen 1 production to TSMC
- "Intel Launches Integrated Photonics Research Center"
- Tipster outs 3DMark Time Spy score for upcoming AMD Rembrandt 6000 series APUs: new iGPUs could offer near GeForce GTX 970 performance - NotebookCheck.net News
- XDA Developers: "Benchmarking the Qualcomm Snapdragon 8 Gen 1: Performance expectations from 2022's flagships"
- Samsung to develop 'Pellicle', an essential EUV process product
- Cooling issues with Intel’s Alder Lake - Problems with the LGA-1700 socket and a possible workaround | igor'sLAB
- VideoCardz: "NVIDIA launches GeForce RTX 2060 12GB, a perfect card for crypto miners"
- Server Hardware Super-Cycle 2022, Software Developers Outlook
- XDA Developers: "Windows 11 is significantly slowing down some NVMe SSDs"
- Could raytracing acceleration help digital cameras? Are there other applications that aren't as obvious or haven't been talked about much?
- "AI Chip Innovator MemryX Hires Veteran Semiconductor Leader Keith Kressin as CEO"
- PCMag: "Nvidia: We Expect GPU Supplies to Improve in Second Half of 2022"
PC Gamer: "Nvidia expects RTX 2060 12GB stock to 'ramp end of December through January' as retailers remain barren" Posted: 09 Dec 2021 06:51 AM PST
[der8auer] AORUS Z690 Testing and Surprising FPS per Watt Results testing i9-12900K vs Ryzen 5950X Posted: 09 Dec 2021 06:22 AM PST
Display Stream Compression/DSC - How does it work? Posted: 09 Dec 2021 07:12 AM PST Apologies in advance if this isn't the correct subreddit, but I'm genuinely unsure where else to ask this sort of question. I've seen DSC brought up a lot online as being visually lossless - that the image quality between it and the uncompressed image should be almost impossible to tell apart. But I've had a hard time finding information about how it actually works; what is available online seems rather opaque. I was curious if some people on this subreddit could help clarify it for me. Does it work like ADPCM does for audio, where the delta tables change depending on the previous and future samples in the chain? Or does it work on a completely different principle? As far as I've been able to figure out, the ADPCM approach seems the most straightforward - but I'm hardly one of the IT industry's greatest minds, thinkers, and engineers. I'd love it if someone could shed some light on how Display Stream Compression actually works. And, again, apologies in advance if this is the wrong subreddit - I'd be happy to be directed to the correct one! [link] [comments]
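A note on the comparison drawn above: as I understand the VESA spec, DSC is indeed prediction-based rather than transform-based - it predicts each pixel from already-decoded neighbors (with an indexed color history and a rate-control loop holding the stream to a fixed bit budget) and entropy-codes the prediction error. The snippet below is only a toy DPCM-style sketch of that "predict, then code the residual" idea on a single scanline; it is not the actual DSC algorithm, and all the values in it are made up for illustration.

```python
# Minimal DPCM-style sketch: predict each sample from its left neighbor,
# quantize the prediction error, and reconstruct from the coded residuals.
# This illustrates prediction-based coding, NOT the real VESA DSC algorithm
# (which uses median-adaptive prediction, an indexed color history,
# variable-length entropy coding, and a rate-control loop).

def dpcm_encode(line, step=4):
    """Encode one scanline of 8-bit samples as quantized prediction errors."""
    residuals = []
    prev_reconstructed = 0  # decoder starts from the same known state
    for sample in line:
        prediction = prev_reconstructed          # trivial "left neighbor" predictor
        error = sample - prediction
        q = round(error / step)                  # coarse quantization of the residual
        residuals.append(q)
        prev_reconstructed = max(0, min(255, prediction + q * step))
    return residuals

def dpcm_decode(residuals, step=4):
    """Invert the encoder: rebuild each sample from the previous reconstruction."""
    line = []
    prev_reconstructed = 0
    for q in residuals:
        value = max(0, min(255, prev_reconstructed + q * step))
        line.append(value)
        prev_reconstructed = value
    return line

if __name__ == "__main__":
    original = [12, 14, 15, 18, 120, 122, 121, 119, 60, 58]
    coded = dpcm_encode(original)
    decoded = dpcm_decode(coded)
    print("residuals:", coded)    # small values cluster near 0 -> cheap to entropy-code
    print("decoded:  ", decoded)  # close to, but not exactly, the original ("visually lossless")
```

The decoded line differs slightly from the input, which is the sense in which such schemes are "visually lossless" rather than mathematically lossless.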
[VideoCardz] NVIDIA GeForce RTX 3090 Ti to launch on January 27th, RTX 3070 Ti on January 11th Posted: 08 Dec 2021 10:39 AM PST
[TPU] Intel Core i9-12900K Alder Lake Tested at Power Limits between 50 W and 241 W Posted: 08 Dec 2021 04:05 PM PST
[HUB] The Best Value Intel Z690 Motherboard: VRM Thermal Benchmark Posted: 09 Dec 2021 02:08 AM PST
[TPU] Intel Z690 Motherboard Costs Explained Posted: 09 Dec 2021 07:23 AM PST
[LTT] RTX 2060 discussion, with shout-out to r/hardware Posted: 08 Dec 2021 05:49 AM PST
VideoCardz: "NVIDIA GeForce RTX 3050 to be announced on January 4th, launches January 27th" Posted: 08 Dec 2021 01:57 PM PST
Put down your pitchforks: Intel's 12th-gen CPUs aren't power hogs | PCWorld Posted: 08 Dec 2021 06:16 AM PST
Copying framebuffer from discrete GPU to iGPU, is this something new to Alder Lake or has this always been possible? Posted: 08 Dec 2021 02:17 PM PST The monitor I use for work/FPS is usually connected to my 12700K's iGPU, and I move it to my 3080 when using it for gaming (the 3080's ports are usually occupied by other screens). Today I fired up Halo Infinite but forgot to switch the monitor's cable over to the 3080. Much to my surprise, I was still getting around 100 fps at 1440p Ultra. Task Manager showed my 3080 at 85-90% utilization, with 50% copy utilization, while the iGPU sat at roughly 30% utilization. The 3080 was set as the selected adapter within the game. So it appears the 3080 was rendering the game and the framebuffer was being copied from the 3080 to the iGPU. I had no idea this was possible - you always see stories of people new to PC gaming facepalming after realizing they had their monitor plugged into the motherboard instead of the discrete GPU (although I'm assuming this framebuffer copying is how it must work on gaming laptops). Is this a new capability with Alder Lake, or has this always been possible? [link] [comments]
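For what it's worth, what's described above matches how Optimus-style hybrid graphics has long worked on laptops: the discrete GPU renders into its own memory and the finished frames are copied over PCIe to the GPU that owns the display. Windows has supported this cross-adapter path on desktops for several generations as well, so it does not appear to be specific to Alder Lake, though the exact behavior depends on drivers and the Windows version. The rough Python arithmetic below, with approximate PCIe figures, shows why the per-frame copy is cheap.

```python
# Rough bandwidth estimate for copying a rendered framebuffer from a discrete
# GPU to the iGPU for display. Link-bandwidth figures are approximations.

def copy_bandwidth_gbps(width, height, bytes_per_pixel, fps):
    """Return the GB/s needed to move one finished frame per refresh."""
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes * fps / 1e9

if __name__ == "__main__":
    # 1440p, 8-bit RGBA (4 bytes/pixel), ~100 fps as in the post
    needed = copy_bandwidth_gbps(2560, 1440, 4, 100)
    pcie4_x16 = 32.0   # approx. usable GB/s for a PCIe 4.0 x16 link
    pcie3_x16 = 16.0   # approx. usable GB/s for a PCIe 3.0 x16 link
    print(f"framebuffer copy: ~{needed:.2f} GB/s")
    print(f"PCIe 4.0 x16 headroom: ~{pcie4_x16 - needed:.1f} GB/s left")
    print(f"PCIe 3.0 x16 headroom: ~{pcie3_x16 - needed:.1f} GB/s left")
```

At 1440p and 100 fps the copy needs only about 1.5 GB/s, a small slice of an x16 link, which is consistent with the modest copy-engine utilization Task Manager reported.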
[LTT] Is M1 Max worth $400 extra? Posted: 08 Dec 2021 10:11 AM PST
Disappointed with Samsung, Qualcomm could shift some Snapdragon 8 Gen 1 production to TSMC Posted: 07 Dec 2021 11:10 PM PST
"Intel Launches Integrated Photonics Research Center" Posted: 08 Dec 2021 11:48 AM PST
Tipster outs 3DMark Time Spy score for upcoming AMD Rembrandt 6000 series APUs: new iGPUs could offer near GeForce GTX 970 performance - NotebookCheck.net News Posted: 08 Dec 2021 04:21 AM PST
XDA Developers: "Benchmarking the Qualcomm Snapdragon 8 Gen 1: Performance expectations from 2022's flagships" Posted: 08 Dec 2021 07:17 AM PST
Samsung to develop 'Pellicle', an essential EUV process product Posted: 08 Dec 2021 09:14 AM PST
Cooling issues with Intel’s Alder Lake - Problems with the LGA-1700 socket and a possible workaround | igor'sLAB Posted: 07 Dec 2021 10:17 PM PST
VideoCardz: "NVIDIA launches GeForce RTX 2060 12GB, a perfect card for crypto miners" Posted: 07 Dec 2021 07:41 AM PST |
Server Hardware Super-Cycle 2022, Software Developers Outlook Posted: 08 Dec 2021 05:39 AM PST |
XDA Developers: "Windows 11 is significantly slowing down some NVMe SSDs" Posted: 07 Dec 2021 07:37 AM PST |
Could raytracing acceleration help digital cameras? Are there other applications that aren't as obvious or haven't been talked about much? Posted: 08 Dec 2021 08:47 AM PST I already know that a lot of phones with built-in cameras have additional hardware and software to help them process images and make pictures look better - image processors, media processors, video processors, DSPs, and so on. But here's something I've been thinking about for a while: a digital camera works, basically, because a sensor receives light. All the photons that hit the sensor at a given moment are essentially the input used to make the image. My understanding of raytracing, as a technique, is that it's all about simulating how light and photons move through a scene. So it seems to me that raytracing hardware, software, and the mathematics behind them could potentially be useful for improving camera technology - or maybe for making "3D" images: given an image, the raytracing tech could calculate backwards and estimate all the light used to make that image and everything it bounced off of. I apologize in advance if this is a stupid question. [link] [comments]
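The "calculate backwards from an image" idea above is essentially what graphics research calls inverse rendering, and techniques like differentiable rendering and NeRF-style scene reconstruction do lean on the same ray-geometry math that RT hardware accelerates. As a concrete (if toy) illustration of that underlying primitive, here is a minimal Python ray-sphere intersection test; the scene values are invented for the example and this is not tied to any particular RT API.

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t, i.e. a quadratic
    in t. Ray/geometry intersection like this is the core primitive that GPU RT
    cores accelerate (against triangles and bounding boxes rather than spheres).
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearer of the two roots
    return t if t > 0 else None

if __name__ == "__main__":
    # A ray from the "camera" at the origin, looking down +z, toward a sphere.
    hit = ray_sphere_intersect((0, 0, 0), (0, 0, 1), center=(0, 0, 5), radius=1.0)
    print("hit distance:", hit)            # expected: 4.0 (front surface of the sphere)
```

Tracing rays from the sensor back out into a reconstructed scene is exactly this computation run millions of times, which is why inverse-rendering work benefits from the same acceleration structures games use.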
"AI Chip Innovator MemryX Hires Veteran Semiconductor Leader Keith Kressin as CEO" Posted: 08 Dec 2021 12:08 PM PST |
PCMag: "Nvidia: We Expect GPU Supplies to Improve in Second Half of 2022" Posted: 07 Dec 2021 08:15 AM PST |