Well, indeed, consoles are as-is, while PCs need some thought put into the purchased parts (or complete models), because a PC can be meant for many different uses: office work, design, video editing, bitcoin mining (lol), or video games. Then, on top of deciding what you want it for, you have to decide whether you want a desktop, laptop, home theater setup, or tablet form, and then look for the best options within your chosen budget. I don’t see what any of that has to do with claiming gaming support for an OS is lesser than on consoles.
Depending on the budget, I don’t think a PC gamer necessarily has to upgrade sooner than he’d have to buy a newer console. Even then, you save a lot of money on cheaper PC games, which are commonly at least ~$10 cheaper by default, and that’s before considering the crazy number of sales happening digitally and in (mostly online retail) stores, sales which aren’t as hard on the publishers since they don’t need to pay such high platform fees, if any.
I think buying new parts a couple of years after a console generation has started is a good rough guide. It’s best to wait on consoles anyway: for them to mature as platforms, to avoid early-revision hardware, software, and service problems, to build a decent library, to see which system gets the most support you enjoy, and so on. By that time you also get a good idea of the kind of PC hardware you’d need to keep up with them throughout the generation, if not exceed them easily enough to last another couple of years into the next generation, until you once again buy something up to date. Even if you have to settle for a less than optimal experience on certain later games by lowering the visual quality or dealing with lower frame rates (it’s not like you don’t deal with that on consoles; it’s kinda crazy that some companies have been hyping full HD for two generations now, yet the systems still fall short of that expectation too often to be comfortable with it, while frame rates get the short end of the stick way too often), it’s easily made up for by the rest of the platform’s benefits. Not to mention you can revisit those games when you do upgrade and experience them at their best, without needing to buy some HD remaster that fixes them up (if one is even made to give consoles that choice; actual remakes are a different matter, of course).
Consoles have their strong points and PC does too. Many things PC gets as an open platform, and not just hardware-wise, are incorporated into later consoles (like online gaming back in the day). Hardware-wise, things like the Oculus Rift are possible thanks to the platform being so open and not preset in its capabilities; it’s great to see what people experiment with and how they shape the future. For a more conventional example, the graphical features and power of the PS4 and Xbox One GPUs have been present in PC graphics cards for years. So even if few games have utilized them all at once in the way you’d expect from your average cinematic blockbuster, we have seen more than mere glimpses of what’s possible, like how Battlefield 3 could easily run with 64 players, large maps, and 60fps at 1080p on a half-decent gaming PC from ~3 years ago, while Battlefield 4 on the new consoles only approaches that now (and iirc still doesn’t hit 1080p even on PS4).
If a console generation lasts longer, then on PC you’ll be able to put off upgrading further too. Or you may end up upgrading regardless, because you bought a fancy new 4K monitor, or want to dabble with 3D gaming and/or the Oculus Rift, or 120Hz (and the frame rate to match), or whatever else requires much beefier specs than the standard goal of 1080p/60fps but has great benefits for those willing. Anyway, I don’t know if the first parties prolonged the last gen on purpose and wish to repeat that, or if it was simply a side effect of the machines not becoming profitable until way too late, making it uncomfortable to replace them before making up for it with more sales. Besides, for all I know they’d love to go for the Apple model of providing a new console every year, letting the whales shower them with money while maintaining compatibility for the less well-off users who will still only upgrade once every 5 years (but who’d end up with less than ideal experiences much more often than they do these days, when hardware capability is universal for a time). Or they could end up turning games into a service with fees up the wazoo and the hardware provided and replaced with something better for “free” (paid for tenfold by the subscriptions), or the hardware could become just a streaming box with irrelevant computational power, or some other nightmare scenario.
Who knows; we’ll just have to wait and see. But I think we’ve got at least a couple of traditional generations ahead, if not several more, before things have the potential to really be shaken up by paradigm shifts, beyond experiments that aren’t meant to completely replace consoles just yet (like Sony’s experiments with streaming, or how VR is making steps on PC). I believe conventional hardware still has a ton of room for more power that games can eventually utilize. This generation’s leap isn’t so great only because the consoles chose technically outdated, basically off-the-shelf parts to keep costs lower, rather than over-the-top customized technology a la Cell, which means recent PC games have already impressed people and a similar level of quality appears less astounding. It’s definitely not because we’ve already reached the point of diminishing returns.
Anyway, on topic: consoles have definitely lost a lot of their ease of use, while PC gaming has been improving on its issues (not to mention that fewer and fewer people are PC-illiterate, as PCs grow more and more convenient, if not necessary, for everyday life).