Proprietary hardware for consoles was ass, actually

OCASM

Member
Custom hardware that gets you better performance at a lower price = Good.
Exotic hardware that makes the lives of devs hell = Bad.
 

winjer

Member
PS5 is Linux-based or something. The chips are custom to a degree too.
So obviously it's not one button press away, but for sure it's quite simple... yet they had to rebuild GoW 2018 from scratch when porting it from PS4... so maybe not easy

Not Linux. It's based on FreeBSD.
 

UnNamed

18+ Member
People keep saying videogames used to have exotic hardware and "boooh, now they're just PCs", but since 1972 many consoles have actually had pretty standard chipsets.

The Atari 2600 had a MOS 6507, a cut-down 6502, the same CPU family used in the NES, Apple II, Lynx and 30 or more other systems.
The Megadrive had a 68000, also used in the Amiga, Neo Geo, ST, X68000 and so on, not counting many arcade boards.
Do we even need to talk about the Z80?
The PSX used a MIPS R3000-family CPU, the same family we also find in the N64 and in various workstations and arcade boards.
The GameCube used an IBM PowerPC CPU from a line that had been around for about a decade in many devices, some of them even flown on spacecraft.
The Xbox had a Pentium III CPU.
The Dreamcast and Saturn used off-the-shelf Hitachi SH chips, not custom ones.

The video chips are a different story, but only those.

The only CPUs created specifically for games were the Emotion Engine, Cell and Xenon.
 

MarkMe2525

Member
It was out of necessity. There was no other way they could build a console with "PC" parts at an affordable price point.

It would be neat to see a modern stab at a truly custom chipset. I don't know what that would look like.
 

simpatico

Member
I like the Nintendo cartridge idea, where a game maker could choose to add actual hardware components into the cartridge to improve the graphics, like the Super FX chip. People say it's not practical now, but it was probably even less practical then. I would love to see that kind of wild ambition return to the gaming hardware milieu.
 

MrA

Member
This premise also ignores how custom computers were in the late 70s, 80s and early 90s. Most computers had custom chips, and even a PC would have mountains of custom hardware if you upgraded its graphics and sound capabilities.
Even into the 2000s, until OpenGL and DirectX really matured, PCs were heavily proprietary. Good luck running a Glide game without a Voodoo card.
Proprietary hardware back then was a must to specialize and get results. Now it's unnecessary: just look at the Steam Deck, an off-the-shelf APU, $400 and plenty of power.
The PS5 and XSX are delivering amazing visuals at a reasonable price, even the XSS (well, outside of jaded hardcore fanboys that will never be happy).
The Switch 2 is likely to deliver PS4 visuals on a tablet. Why customize when the off-the-shelf stuff is this good?
 

calistan

Member
Given that everything is homogenised and there’s barely any proprietary hardware now, it’s kind of bizarre that people still argue over which console is the best.

It used to be that you could instantly identify a machine from a single screenshot. They had a particular look and feel, and that's why people became attached to them. Now that that's all been taken away, the only difference is marketing.
 

Clear

Member
I think some people are overstating the importance of where the CPU and GPU come from. Yes, those components are familiar from the PC market, but the way the system is constructed as a whole is the key thing.

The PS5's I/O integration is pretty radical, and it's a huge part of how the console can trade blows with the Series X despite that system being more powerful by PC-style metrics.

For obvious reasons this sort of innovation can only really occur in the console space, even if AMD is the one providing the parts.
 