You're talking about a different GPU than I am. Nobody here is talking about the AMD Radeon part (which is rubbish anyway, crappy "modern programming"). The AMD Radeon side means 160 programmable shaders. The Wii U uses 32 TEVs, the same kind of units as the Wii and GameCube (which both had 16), and that part is an ATI Radeon, sir. Nobody is talking about programmable shaders (god, how I hate that invention lmao). And no, of course you cannot use 2 GB of RAM for that ATI graphics part. It has to be compatible with an ATI Radeon if it says "ATI Radeon" on the Wii's package and if it uses ATI Radeon shaders, which it does; you can google this. The Wii's ATI Radeon could only use 3 MB of framebuffer memory, because the resolution was only 720 x 576 pixels (SD). The Wii U one uses 32 MB because it renders at up to native 720p for these TEVs, which means you need a bigger graphics buffer: 3 MB is not enough to store three 1280 x 720 images for triple buffering (Shin'en once stated their game needs about 16 MB for three such images; a rough calculation is sketched below). It's a hard-wired thing. You cannot use the slow DDR3 RAM for the TEVs/GX "register combiners"; the standard RAM for Wii/GameCube/GX stuff can only be used to run the CPU side of things, you know, the program itself. The graphics just run through the triple-buffered graphics buffer. That's why you simply don't need gigabytes of RAM. Maybe you misunderstood me.

And dude? Years ago, when people talked about the Wii, it was assumed that it used some proprietary PCI protocol to drive its ATI Radeon's 16 TEV units. The Wii U has twice as many, and something was changed or added, which is why you need that tiny little 8-bit CPU to translate when you want to run original Wii games or use that GX protocol. It's done with that 8-bit CPU, and it happens automatically once the machine recognizes Wii-compatible code. Oh, and dude? The assumption on the NeoGAF forum was always that the Wii's and GameCube's ATI GPUs were simple 1:1 Radeons, nothing else. Very early ATI Radeons meant register combiners: no pixel shaders, no vertex shaders, no shaders at all tbh. Which is why I said: look it up, google it, and you'll see there aren't many ATI Radeons with 32 MB of memory. That's why I told him it could be an ATI Radeon 7000 (or 8500), one very early ATI Radeon from GameCube days. But like I said, that's just an assumption. And there is a good reason Nintendo went with eDRAM here: normal old ATI Radeons used old, slow RAM (DDR1, DDR2 or even SDRAM), while Nintendo uses much faster and far more reliable RAM (with a custom protocol, though).

By the way: why do you always say things like "don't spread misinformation"? Why don't you read, then understand what somebody is trying to tell you, and then answer? Reading, then understanding, then answering. In your case an answer wasn't even necessary, lol. I don't even read many replies here, since it's YouTube after all. Oh, and dude? I wrote my own custom drivers years ago. It's not as hard as you make it out to be; it depends on which lines already work and are compatible (or whether there is a driver for a compatible chipset in the first place). In my case, sure, I only took a finished driver and rewrote some lines to make it compatible, but that is the definition of "custom". In the Wii U's case, since the GX GPU part is an ATI Radeon after all, it might very well be just a few changed lines as well.
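To put rough numbers on the triple-buffering point, here is a minimal sketch in C. The bytes-per-pixel figures are my assumptions (a 2-byte-per-pixel SD framebuffer format versus a 4-byte 32-bit colour buffer at 720p); the hardware's actual formats may differ.

#include <stdio.h>

/* Rough framebuffer-size check for the triple-buffering claim above.
 * Pixel sizes are assumptions: 2 bytes/pixel for an SD-style framebuffer,
 * 4 bytes/pixel for a 32-bit 720p colour buffer. */
static unsigned long buffer_bytes(unsigned w, unsigned h,
                                  unsigned bytes_per_pixel,
                                  unsigned num_buffers)
{
    return (unsigned long)w * h * bytes_per_pixel * num_buffers;
}

int main(void)
{
    /* Wii-era SD case: 720x576, 2 bytes/pixel, triple buffered */
    unsigned long sd = buffer_bytes(720, 576, 2, 3);
    /* Wii U-era 720p case: 1280x720, 4 bytes/pixel, triple buffered */
    unsigned long hd = buffer_bytes(1280, 720, 4, 3);

    printf("SD (720x576, 2 Bpp, 3 buffers): %.2f MB\n", sd / (1024.0 * 1024.0));
    printf("HD (1280x720, 4 Bpp, 3 buffers): %.2f MB\n", hd / (1024.0 * 1024.0));
    return 0;
}

Three SD buffers come out around 2.4 MB, which fits inside a ~3 MB budget, while three 32-bit 720p colour buffers are about 10.5 MB, which lands in the same ballpark as the quoted ~16 MB once you add a depth buffer.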
You'll never know if you don't try. My guess is that the Wii U's ATI Radeon (not the AMD part, but the old fixed-function shader stuff) just uses some embedded PCI- or AGP-like interface. That would mean they simply took that stuff from GameCube/Wii days and brought it over to another chipset, so they might have used the SAME protocol for GameCube, Wii and Wii U. If that's the case (which is easy to find out), then it's also easy to find out exactly which protocol they used. I mean, some hackers looked at the Wii U's PowerPC and found out it uses a ~600x interface for CPU communication instead of the old ~60x interface the Wii used, and they found out via NeoGAF that it has six USB 2.0 interfaces in total. To find out whether the Wii U's GX interface matches a PCI/AGP one, you just have to look at the lines on the die (which is exactly what they did with the USB thing) and count them, or look up how PCI/AGP worked and what voltages were needed (a rough sketch of that approach follows below). See? Why would Nintendo use a 100% custom protocol? Trust me, they didn't. Nobody sane would. They just used PCI/AGP for GameCube and Wii all those years, which is the most logical and cheapest way to do it. That also keeps the console cheap, since developing your OWN custom protocol costs a ton of money, which many companies shy away from, and we both know how cheap Nintendo can sometimes be. Using PCI or AGP in GameCube days makes sense: it was very cheap. The only difference is that the Wii U isn't a PC and thus has no expansion slots, and that you can easily count those lines, since it isn't built to let you swap the graphics chip. But the protocol might be 1:1 the same: the GPU is just hard-wired on the board, while the chipset layout might be the same, the lines the same, and the voltage the same. And I reckon there's a high chance that even standard drivers would work, maybe with a few lines replaced or changed.
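As a rough illustration of the "count the lines, check the voltages" approach described above, here is a small C sketch. The signal counts and voltages are approximate values from the public PCI 2.x and AGP specifications (roughly 50 signal lines for 32-bit PCI at 5 V or 3.3 V signaling; around 70 for AGP at 3.3 V, 1.5 V or 0.8 V depending on generation); the tolerances and example measurements are my assumptions, and a real identification would need the actual pinout.

#include <stdio.h>

/* Very rough bus-guessing helper for the "count the traces, check the
 * voltage" method described above. Signal counts are approximate values
 * from the public specs; the thresholds are assumptions. */
static const char *guess_bus(unsigned signal_lines, double signalling_volts)
{
    /* 32-bit PCI: ~50 signal lines (AD[31:0], C/BE#, FRAME#, IRDY#, ...),
     * 5 V or 3.3 V signaling */
    if (signal_lines >= 45 && signal_lines <= 55 &&
        (signalling_volts > 4.5 ||
         (signalling_volts > 3.0 && signalling_volts < 3.6)))
        return "looks like 32-bit PCI";

    /* AGP adds sideband/strobe signals on top of the PCI set (~70 lines),
     * 3.3 V (AGP 1.0), 1.5 V (AGP 2.0) or 0.8 V (AGP 3.0) signaling */
    if (signal_lines >= 60 && signal_lines <= 80 && signalling_volts < 3.6)
        return "looks like AGP";

    return "does not obviously match PCI or AGP";
}

int main(void)
{
    /* Hypothetical measurements, not real Wii U die data */
    printf("%s\n", guess_bus(50, 3.3));
    printf("%s\n", guess_bus(72, 1.5));
    printf("%s\n", guess_bus(128, 1.2));
    return 0;
}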