Why Tegra NX Rumors Are Ridiculous

With all these rumors floating around about the NX having a Tegra X1 or X2 chip, it's important to address these silly rumors before they blow out of control. My background: I'm an engineer who has worked on individual IP development and top-level RTL for a major Fortune 500 company. I'm going to explain, from a design perspective, why a Tegra X1 or X2 is not going to work for a portable console.

The Tegra X1 is a 15 W part. By comparison, the A9X in the iPad Pro is a 5 W part, so the A9X draws around a third of the power of the Tegra X1. You can argue that some tablets already ship with an X1 chip, but that doesn't apply to a games console, where sustained loads basically guarantee the chip will be running at its full power numbers most of the time. You see this pretty clearly when you game on your smartphone: battery life is significantly shorter than when you aren't gaming. In this scenario the X1 will sit at a sustained 15 W, significantly higher than the A9X, which means a dedicated gaming tablet built around it would have poor battery life. My guess is anywhere from 2-3 hours with the kind of battery we would normally put in a portable device like a tablet.
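
As a rough sanity check on that 2-3 hour guess, here's a minimal back-of-the-envelope sketch in Python. The battery capacities are my own assumptions for a tablet-sized device (not figures from any NX leak), and the 15 W and 5 W figures are the SoC numbers above, ignoring the screen and the rest of the system, which would only make things worse.

# Back-of-the-envelope battery life: hours = capacity (Wh) / sustained draw (W).
# Battery capacities are assumed tablet-class sizes, not leaked NX specs.

def battery_life_hours(capacity_wh, draw_w):
    """Runtime in hours at a constant power draw."""
    return capacity_wh / draw_w

for capacity_wh in (30.0, 40.0, 45.0):                  # assumed tablet-class batteries
    x1_hours  = battery_life_hours(capacity_wh, 15.0)   # Tegra X1 under sustained gaming load
    a9x_hours = battery_life_hours(capacity_wh, 5.0)    # A9X-class 5 W part for comparison
    print(f"{capacity_wh:4.1f} Wh battery: X1 ~{x1_hours:.1f} h, 5 W part ~{a9x_hours:.1f} h")

On tablet-sized batteries, a 15 W SoC lands right in that 2-3 hour range.
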
Well, what about the X2, you ask? Honestly, NVidia's financial reports and conference calls suggest they're not investing in mobile at all, and are instead pointing the Tegra group at automotive, where they're seeing pretty good growth. But putting that aside: the X2 would need roughly a 3x (300%) jump in overall power efficiency over the Tegra X1 to hold the same performance inside the A9X's power numbers (which would put gaming battery life at about 3-6 hours). That is simply not possible for NVidia to achieve with the Tegra X2. While Pascal is a significant leap over Maxwell on the desktop, the X1 already uses a modified version of Maxwell that splits the traditional FP32 CUDA core into two FP16 CUDA cores in order to hit its power numbers; in cases where FP32 is used, those two FP16 cores are combined back into a single FP32 core. What this means is that the performance-per-watt gains Pascal shows over Maxwell in the desktop parts are not going to translate directly from Tegra Maxwell to Tegra Pascal. On top of that, certain IP blocks in the design, e.g. the ARM cores, will have similar power numbers from X1 to X2. And that's before getting into the physical shrink: the Tegra X2 only moves from 20nm to 16nm, so we won't see the huge perf/watt gains we saw going from desktop Maxwell (28nm) to desktop Pascal (16nm).
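
To make that 300% figure concrete, the arithmetic only needs the 15 W and 5 W numbers above, with performance simply normalized to the X1. Here it is as a tiny Python sketch.

# How much perf/W would a hypothetical mobile-focused X2 need in order to deliver
# Tegra X1-class performance inside an A9X-class 5 W budget?
# Performance is normalized to X1 = 1.0; the wattages are the figures above.

x1_power_w     = 15.0   # Tegra X1 sustained power
target_power_w = 5.0    # A9X-class power budget
x1_perf        = 1.0    # normalized performance

x1_perf_per_watt       = x1_perf / x1_power_w        # ~0.067 (normalized perf per watt)
required_perf_per_watt = x1_perf / target_power_w    # 0.200

print(f"Required perf/W jump: {required_perf_per_watt / x1_perf_per_watt:.1f}x")  # prints 3.0x
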
My optimistic guess is that if NVidia pursues a "low power" design with the X2, we'll see a part with roughly 30% more performance at 10 W (still 2x what's needed for a mobile part). In reality, though, we're probably looking at an X2 that either gives up on mobile power targets entirely, since the money driver for Tegra is automotive and heavy industry (according to NVidia's financial reports), or, if they're still pretending to target mobile devices, a part at 12 W (a 20% power reduction) that delivers a modest 10-20% performance jump (probably around 15%). Either way, recent Tegra parts are not in a power envelope that can easily be used for mobile gaming.
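
For what it's worth, here's a rough side-by-side of those guesses, in the same style as the earlier sketches. Every performance and wattage number is the speculation from this post, normalized to X1 = 1.0, and the 30 Wh battery is just an assumed tablet-class capacity.

# Speculative X2 scenarios vs. the X1 baseline. All numbers are guesses from
# the paragraphs above, not specs; perf is normalized to Tegra X1 = 1.0.

BATTERY_WH = 30.0  # assumed tablet-class battery, not a leaked figure

scenarios = [
    ("Tegra X1 baseline",       15.0, 1.00),
    ("Optimistic low-power X2", 10.0, 1.30),   # ~30% more performance at 10 W
    ("Mobile-targeted X2",      12.0, 1.15),   # ~15% more performance at 12 W
]

for name, power_w, perf in scenarios:
    perf_per_watt = perf / power_w
    hours = BATTERY_WH / power_w
    print(f"{name:24s} {power_w:4.1f} W   perf/W {perf_per_watt:.3f}   ~{hours:.1f} h on {BATTERY_WH:.0f} Wh")

Even the optimistic 10 W case only gets you to about 3 hours on a tablet-sized battery, which is the whole problem.
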