StarDestroyer.Net BBS

Get your fill of sci-fi, science, and mockery of stupid people



It is currently 2015-09-04 03:54am (All times are UTC - 5 hours [ DST ])

Board index » Non-Fiction » Gaming, Electronics and Computers



Will this be the shortest console generation?

Moderators: Stofsk, Thanas, PeZook, Keevan_Colton


Arthur_Tuxedo
PostPosted: 2015-02-07 03:20pm 

Sith Acolyte


Joined: 2002-07-23 03:28am
Posts: 5361
Location: San Francisco, California
http://www.forbes.com/sites/johnarcher/ ... e-round-2/

This article was written a few months ago, but I stumbled on it and thought it was worth consideration. The author argues that by sticking to the $299 price point at the expense of UHD capability, the current consoles have shot themselves in the foot and will be irrelevant in 2-3 years. I don't disagree about the last bit, but I think their hand was forced. 4K-capable consoles would have run at least $600, and that price point was a huge mistake for the PS3, which didn't gain any real traction against the X360 until the price came down to $400. Some commentators have opined that the 8th-generation consoles should have launched in 2010 or 2011, with plenty of horsepower for modern Full HD gaming, paving the way for timely 4K replacements in 2016. But that ignores that we were in the teeth of the worst recession in living memory, and people don't exactly beat feet to get the latest and greatest electronic toys in those circumstances.

So it's clear that MS and Sony couldn't really have made any choice other than the one they did make, but also that the current consoles will be hopelessly behind the tech curve by 2017 at the latest, when 4K res will have taken over the high-end and enthusiast segments and VR headsets will be in the second generation. If they decide to release new consoles at that time, the Xbone and PS4 will only have been current a little more than 3 years, which will make this the shortest console generation, by my quick'n'dirty reckoning. Counting only consoles that sold well as beginning a generation, no generation to date has lasted less than 5 years:

Magnavox Odyssey 1972-1977: 5 years
Atari VCS 1977-1983: 6 years
Nintendo Famicom 1983-1989: 6 years
SEGA Genesis 1989-1994: 5 years
Sony PSX 1994-1999: 5 years
SEGA Dreamcast 1999-2005: 6 years
Microsoft Xbox 360 2005-2013: 8 years

Even if you count the same company replacing its own (successful) console, the only comparably short generations are SEGA's Master System -> Genesis (1986-1989) and Saturn -> Dreamcast (1995-1999).
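That quick'n'dirty reckoning is easy to sanity-check in a few lines of Python. The years are taken straight from the list above (they're this thread's reckoning of when each generation began and was superseded, not official launch dates):

```python
# Generation spans as listed above: (console, start year, year superseded).
generations = [
    ("Magnavox Odyssey", 1972, 1977),
    ("Atari VCS", 1977, 1983),
    ("Nintendo Famicom", 1983, 1989),
    ("SEGA Genesis", 1989, 1994),
    ("Sony PSX", 1994, 1999),
    ("SEGA Dreamcast", 1999, 2005),
    ("Microsoft Xbox 360", 2005, 2013),
]

for name, start, end in generations:
    print(f"{name}: {end - start} years")

shortest = min(end - start for _, start, end in generations)
print("Shortest generation to date:", shortest)      # 5 years

# A 2013 launch replaced in 2017 would be only 4 years:
print("Hypothetical PS4/Xbone span:", 2017 - 2013)   # 4 years
```

So by this counting, a 2017 refresh really would be the shortest generation on record.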

Precedented or not, what choice will they have? Affordable 4K gaming simply doesn't exist in console tech, but it will in 2-3 years, and if they don't update their consoles in that time they'll be in a position where the latest smartphones might be able to exceed their consoles, especially if it becomes easier to mirror a second screen onto a TV or monitor.

It seems like the only option is to release new consoles around 2017, but perhaps make them backward compatible? They could even split the platform, where new PS4 games could continue to be developed by indie devs and studios on smaller budgets that would run on either console, and AAA titles would be PS5-only. A new PS4 could be had for $149, while a PS5 commands $399 or $499.

What would you do if you were an exec in MS or Sony's games division?

The Vortex Empire
PostPosted: 2015-02-07 04:18pm 

Jedi Master


Joined: 2006-12-11 10:44pm
Posts: 1450
Location: Rhode Island
I wouldn't release a new console generation at all; the concept is obsolete when you can just plug a PC into the TV and get even more functionality.

Mr Bean
PostPosted: 2015-02-07 04:26pm 

Lord of Irony


Joined: 2002-07-04 08:36am
Posts: 21333
Arthur_Tuxedo wrote:

What would you do if you were an exec in MS or Sony's games division?

I would have taken the advice I gave two years ago and launched the PS4/Xbox One this holiday season in 2015, when I could go to AMD, who need the business rather badly, and tell them to give me a four-to-six-core custom Kaveri part with an equivalent Tonga GPU (both parts released last year), because console 4K is quite possible if you mate a decent GPU with a decent CPU and a decent chunk of RAM. That's a ten-year gap and a good enough set of hardware to last another ten years.

DaveJB
PostPosted: 2015-02-08 09:50am 

Jedi Council Member


Joined: 2003-10-06 05:37pm
Posts: 1862
Location: Leeds, UK
The lack of 4K support isn't necessarily a deal-breaker by itself, considering that 1080p has been the mainstream TV resolution since 2009 or so, yet the vast majority of PS3/360 games are 720p at best, and even the Wii hung in there pretty nicely for a long time with only SD graphics.

The big issue is going to be if we get to the point where 4K TVs are mainstream and affordable, but the PS4 and XB1 (though it seems to be a bigger problem for the latter) are still only managing to churn out 720P graphics. I think if we're in that situation, and the developers are still having serious problems working with the CPU set-up on the two consoles, this will be a short generation and the companies will put more of an emphasis on raw power afterwards. If nothing else, Nintendo are gonna have to do that since the "weak console with innovative stuff" strategy crashed and burned spectacularly this time around.

Starglider
PostPosted: 2015-02-08 01:12pm 

Miles Dyson


Joined: 2007-04-05 09:44pm
Posts: 8005
Location: Isle of Dogs
4K? Really? Of all the things he could complain about, he picks 4K? Now I personally love my 4K monitor for getting actual work done, but I have to admit that for most people 4K TV (vs 1080p) is a pretty trivial luxury. We just had the longest console generation of not-even-real-720p (upscaling), usually-30-FPS. For TVs, 1080p is good enough; more pixels is not the primary, secondary, or even tertiary challenge for AAA gaming right now. The only thing likely to change this would be mass adoption of living room VR; good VR requires at least 60 Hz, preferably 120 Hz, and seriously benefits from at least 1440p.
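The gap Starglider describes can be put in rough numbers: pixel throughput scales with resolution times refresh rate. A back-of-envelope sketch (the 90 Hz figure is just a midpoint between the 60 Hz floor and 120 Hz preference he names, not any particular headset's spec):

```python
# Pixels per second = width * height * refresh rate.
def throughput(width, height, hz):
    return width * height * hz

console_now = throughput(1280, 720, 30)    # upscaled "720p" at 30 FPS
tv_target   = throughput(1920, 1080, 60)   # solid 1080p at 60 FPS
vr_floor    = throughput(2560, 1440, 90)   # 1440p at a VR-friendly 90 Hz

print(f"1080p60 vs 720p30: {tv_target / console_now:.1f}x")   # 4.5x
print(f"1440p90 vs 720p30: {vr_floor / console_now:.1f}x")    # 12.0x
```

In other words, just hitting solid 1080p60 is a 4.5x jump over what last generation usually delivered, and VR-grade output is an order of magnitude.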

Zaune
PostPosted: 2015-02-08 01:41pm 

Sith Marauder


Joined: 2010-06-21 11:05am
Posts: 4467
Location: In Transit
More broadly, the graphics arms-race is going to peak sooner or later because it doesn't actually sell games by itself. Proper gaming journalists have been banging this drum since at least the mid-90s; visual spectacle < fun and innovative gameplay.

Jub
PostPosted: 2015-02-08 02:40pm 

Jedi Council Member


Joined: 2012-08-06 07:58pm
Posts: 1775
Location: British Columbia, Canada
Zaune wrote:
More broadly, the graphics arms-race is going to peak sooner or later because it doesn't actually sell games by itself. Proper gaming journalists have been banging this drum since at least the mid-90s; visual spectacle < fun and innovative gameplay.


Yes, but that's not the only thing that can be done with additional horsepower. Physics interactions, larger area loads without pop-in or new loading screens, and more scripted events running at the same time (be they NPCs in the world or other triggers) can all benefit from better CPUs/GPUs and more/faster RAM in consoles. Plus, consoles still haven't caught up to gaming PCs in terms of texture quality, resolution, frame rate, anti-aliasing, filtering, and other areas affected by raw computing power, and PCs have yet to meaningfully hit any such peak and likely won't even once 4K resolutions become common. The only exceptions to this rule come up where games have been ported or simultaneously released on consoles and then brought over to PC, looking like ass next to games designed to run on a moderately powerful PC.

Even in these cases the difference between console and PC games is noticeable. Just look at a cross-platform game like Far Cry 4 on a console and then look at it on a PC and you'll see a massive difference in graphical quality. It's a difference game critics are starting to notice, given all the lies like '30 fps gives games a more cinematic feel' and 'you can't see the difference between 30 fps and 60 fps with the human eye' that games companies are spewing these days. While it's true that graphics aren't the be-all and end-all of gaming, it's no secret that AAA games go for great screenshots, box art, and gameplay videos (often run on hardware far above what a console has; see Watch Dogs for an example) to sell copies, and that, to a significant extent, graphics still sell.

Mr Bean
PostPosted: 2015-02-08 02:52pm 

Lord of Irony


Joined: 2002-07-04 08:36am
Posts: 21333
Extra power isn't just raw graphics; it also shapes the kind of games you make. Many side stories came out of that generation about how feature X or Y had to be removed because the consoles' memory limitations just could not handle it, or how ability Z only ran on PC versions because the consoles couldn't keep everything running smoothly in the background.

Already Ubisoft is leaking all sorts of stories about how the PS4/Xbox One's super-fast memory fills up: it's not 8 gigs of system memory plus x gigs of GPU memory, it's all shared, and using too many different textures (like you might find in a French street) drags games to a crawl as the GPU fights the CPU for memory space. I'm sure some of that is them not finding the balance yet, but to be blunt: even if you don't like the fact that "graphics sell", it remains true that "features sell", and having to cut a third of what you wanted to do from your game because the hardware can't handle it has an impact, even if it's not a financial one (witness Xbox 360 Minecraft).

TheFeniX
PostPosted: 2015-02-08 03:06pm 

Sith Devotee


Joined: 2003-06-26 04:24pm
Posts: 2753
Location: Texas
Starglider wrote:
4K? Really? Of all the things he could complain about, he picks 4K? Now I personally love my 4K monitor for getting actual work done, but I have to admit that for most people 4K TV (vs 1080p) is a pretty trivial luxury. We just had the longest console generation of not-even-real-720p (upscaling), usually-30-FPS. For TVs, 1080p is good enough; more pixels is not the primary, secondary, or even tertiary challenge for AAA gaming right now. The only thing likely to change this would be mass adoption of living room VR; good VR requires at least 60 Hz, preferably 120 Hz, and seriously benefits from at least 1440p.
The thing is, 4K price points are dropping ridiculously fast. Within a year, 4K went from a bad joke for me to "this is probably what I'll be buying when and if my TV explodes." When I was checking out TVs around March 2014, you were looking at anywhere from $3000-$4000, not including the $900 converter box. You can now find some 55" 4K TVs (Samsung, even) for $1000. That's only going to continue to drop. The problem, of course, is getting 4K signals when most HD TV stations are lucky to be pushing 720p, or 1080i at best.

But there's going to come a point where a 1080p TV will be as hard to find as a 480i fullscreen. And that seems to be a lot shorter timeframe than we saw with the adoption of HD over SD.

Arthur_Tuxedo
PostPosted: 2015-02-09 03:37pm 

Sith Acolyte


Joined: 2002-07-23 03:28am
Posts: 5361
Location: San Francisco, California
Now that I think about it, another question might be "Will this be the LAST console generation?" Once PCs (maybe in a Steambox-style case) can be connected wirelessly to both TVs and gamepads with ease, and more casual / less graphically intense games can be played on a smartphone that can also easily stream to an HDTV and connect to a gamepad, where does a dedicated console fit? The Xbone and PS4's media capabilities are built right into the TV these days, and a Roku box is much cheaper and friendlier. Will the whole idea of a game console be obsolete when it's time to refresh the lineup?

Elheru Aran
PostPosted: 2015-02-09 03:59pm 

Emperor's Hand


Joined: 2004-03-04 02:15am
Posts: 8528
Location: Georgia
I don't think this will be the last console generation. There will be more, out of sheer inertia and nostalgia if nothing else. Maybe not a whole lot more, certainly...

TheFeniX
PostPosted: 2015-02-09 04:17pm 

Sith Devotee


Joined: 2003-06-26 04:24pm
Posts: 2753
Location: Texas
Nintendo isn't going anywhere. They're making way too much money, even though they haven't adopted HD to the extent everyone else has (or says they have). Pretty sure, even with the weak start of the Wii U, they're still making money hand over fist with their first-party titles. And that's without even counting their handheld market.

Sony and Microsoft have way too much invested to back off. Subscription money alone is a lion's share they don't want to lose, not to mention what they likely make off advertising and the user metrics you know they're collecting. Giving that up and letting Valve pick up all the money on the table? Not happening.

There's still a market for dedicated consoles, even if they are branching out more and more into multimedia. Just like mobile gamers want to click an icon and play Angry Birds, console players like inserting a disc or selecting an icon on a game they've downloaded and just play it. To them, the console is the PC they're streaming to their TV. It's as technically complicated as they want and there's still a lot of pushback against PC gaming, even though a majority of the complaints haven't existed in over a decade. They don't want any middle-man BS, even as it gets easier.

Valve wants to release a SteamBox. It's a console: just a cut-down computer with a gimped OS. They already sell boxes like this everywhere, but those lack brand-name recognition, and people just like to be sure their Xbox controller works with their Xbox and plays their Xbox version of CoD. Substitute Sony for the other side of the coin.

Elheru Aran
PostPosted: 2015-02-09 04:23pm 

Emperor's Hand


Joined: 2004-03-04 02:15am
Posts: 8528
Location: Georgia
I understand a lot of Ninty's popularity is because the Wii is an easy party game console. You don't have a small bunch of guys off in the corner playing CoD; with the Wii even Grandma can put in bowling or some easy dancing shit, or Junior can play Mario Kart with Uncle Art and Aunt Mavis. It doesn't take much skill to use. The controls are a little unconventional, but if you can point and click with a mouse or a TV remote, you can figure them out in two seconds. It's your technologically illiterate relative's game console.

So yeah, it's a system that coins money for them. They're not going to bother changing the system or even thinking about giving it up anytime soon because they found their niche and they're wringing it for all it's worth.

bilateralrope
PostPosted: 2015-02-09 04:54pm 

Sith Devotee


Joined: 2005-06-25 06:50pm
Posts: 2673
Location: New Zealand
Arthur_Tuxedo wrote:
Now that I think about it, another question might be "Will this be the LAST console generation?" Once PCs (maybe in a Steambox-style case) can be connected wirelessly to both TVs and gamepads with ease, and more casual / less graphically intense games can be played on a smartphone that can also easily stream to an HDTV and connect to a gamepad, where does a dedicated console fit? The Xbone and PS4's media capabilities are built right into the TV these days, and a Roku box is much cheaper and friendlier. Will the whole idea of a game console be obsolete when it's time to refresh the lineup?

In previous console generations, the hardware was sold at a loss and games sold at the same price as their PC versions, making consoles a more cost-effective gaming option. That advantage faded for people willing to wait for Steam sales late last generation, then vanished completely this generation through a combination of selling hardware at a profit, a subscription for multiplayer*, and selling new games at a higher price than the PC version**.

I see no reason why the next console generation can't go back to being more cost effective than PC. Even if they keep the multiplayer subscriptions.

But the Steambox can't pull that off because Valve aren't going to lock it down. So, like any other piece of PC hardware, there is nothing stopping people loading it up with programs that Valve doesn't see a cent from. Meaning that if it's sold at a loss, people will do that because it's a cheap way to get useful hardware.

*I know that the subscription comes with 'free' games, which does compare favourably to Steam sales if I like every single game the subscription delivers. But if there are too many I don't like, it's more expensive again.

**Let's take Dying Light as an example, using the prices I'd pay:
- PS4/XBONE physical: $NZ 199.99
- PC physical: $NZ 98.00
- Steam: $US 71.99, which is $NZ 94.20 according to my bank's foreign exchange calculator for Travellers Cheques / Drafts (no CC options, so I picked the most expensive conversion rate).

I don't know how to check digital prices on the PS4/XBONE stores.
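For what it's worth, those Dying Light prices make the console premium easy to quantify. A quick sketch (figures as listed above; the USD→NZD rate is just the one implied by the quoted Steam conversion, roughly 1.31):

```python
# Prices as quoted in the footnote above.
ps4_physical_nzd = 199.99
pc_physical_nzd = 98.00
steam_usd = 71.99

# Rate implied by the quoted $NZ 94.20 conversion (~1.31 NZD per USD).
usd_to_nzd = 94.20 / 71.99

steam_nzd = steam_usd * usd_to_nzd
print(f"Steam price: ${steam_nzd:.2f} NZ")                                      # $94.20 NZ
print(f"Console premium over PC physical: {ps4_physical_nzd / pc_physical_nzd:.2f}x")  # 2.04x
print(f"Console premium over Steam: {ps4_physical_nzd / steam_nzd:.2f}x")       # 2.12x
```

So at these prices the console copy costs roughly double either PC option.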

TheFeniX wrote:
console players like inserting a disc or selecting an icon on a game they've downloaded and just play it.

PC gaming is mostly at the select-an-icon point already. All it really needs is to automatically set each game's graphics settings to something appropriate for your hardware.
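As a sketch of what "automatically set the graphics settings" could look like: map a couple of reported hardware numbers to a quality preset. The tiers and thresholds below are entirely made up for illustration; real tools like GeForce Experience use per-game benchmark databases rather than rules of thumb like this.

```python
# Hypothetical auto-detect: map reported VRAM and CPU core count to a preset.
# Thresholds are invented for illustration only.
def pick_preset(vram_gb: float, cpu_cores: int) -> str:
    if vram_gb >= 8 and cpu_cores >= 8:
        return "ultra"
    if vram_gb >= 4 and cpu_cores >= 4:
        return "high"
    if vram_gb >= 2:
        return "medium"
    return "low"

print(pick_preset(vram_gb=4, cpu_cores=4))  # high
print(pick_preset(vram_gb=1, cpu_cores=8))  # low
```

The point is only that a sane default is a lookup, not a PhD; the hard part in practice is knowing how each specific game scales.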

TheFeniX
PostPosted: 2015-02-09 05:33pm 

Sith Devotee


Joined: 2003-06-26 04:24pm
Posts: 2753
Location: Texas
bilateralrope wrote:
TheFeniX wrote:
console players like inserting a disc or selecting an icon on a game they've downloaded and just play it.
PC gaming is mostly at the select an icon point. All it really needs is to automatically set the graphics settings in a game to something appropriate for your hardware.
There's a few things going against this:
1. The perception that PCs require a PhD to install and play games decently is still pretty strong among dedicated console players. They also believe that a rig capable of playing a console game with better graphics and framerate runs into the thousands of dollars.

2. "Mostly." You're talking about people who want to plug a box in, go through NEXT>NEXT>FINISH, pop in a disc or click an icon, and be playing their game with usable default settings. The advantage of proprietary systems is that you don't have to hit them with a hammer to make everything play nice.

3. EA and other publishers don't fight with Sony or MS on their systems like they do with PC distribution. For just one example, Burnout: Paradise seamlessly integrates with my XBLive account. Picking it up for $5 on Steam means I now also need an Origin account username/password to remember. Same with GFWL, even though MS said they'd patch it all out. In the PC realm, every publisher wants a slice of the pie and will ensure their authentication systems get in there somehow, because they are assholes, even with already heavily DRMed software like Steam.

Vendetta
PostPosted: 2015-02-09 05:54pm 

Emperor's Hand


Joined: 2002-07-07 04:57pm
Posts: 9709
Location: Sheffield, UK
Arthur_Tuxedo wrote:
Now that I think about it, another question might be "Will this be the LAST console generation"? Once PCs (maybe in a Steambox-style case) can be connected wirelessly to both TV's and gamepads with ease, and more casual / less graphically intense games can be played on a smartphone that can also easily stream to an HDTV and connect to a gamepad, where does a dedicated console fit? The Xbone and PS4's media capabilities are built right into the TV these days, and a Roku box is much cheaper and friendlier. Will the whole idea of a game console be obsolete when it's time to refresh the lineup?


People have been saying that for the last two generations.

PCs are unlikely ever to take the living room spot of consoles, no matter how easy the user interface becomes, because the PC is a fundamentally variable platform. When you buy a console game, you know reliably not just that it will work on your console but that it will work the way it was intended to (or at least as well as the developers were able to make it work). The framerate will be the same for you as for everyone else, the graphics will be the same, the audio will be the same, and so on. You'll be playing the game the way it was designed, and you don't have to fuck about in an options menu for every single game to figure out how to make it work acceptably on your particular hardware.

And if you make a fixed target platform like a steambox and make games work in a specific way on that hardware so the user doesn't have to piss about configuring it, congratulations you just made a console.

bilateralrope
PostPosted: 2015-02-09 05:57pm 

Sith Devotee


Joined: 2005-06-25 06:50pm
Posts: 2673
Location: New Zealand
Quote:
1. The perception that PCs require a PhD to install and play games decently is still pretty strong among dedicated console players. They also believe that a rig capable of playing a console game with better graphics and framerate runs into the thousands of dollars.

That perception is a problem. Unless consoles become cheaper again.

Quote:
2. "Mostly." You're talking about people who want to plug a box in, go through NEXT>NEXT>FINISH, pop in a disc or click an icon, and be playing their game with usable default settings. The advantage of proprietary systems is that you don't have to hit them with a hammer to make everything play nice.

I do get a playable game on default settings every time. Well, I assume that running at the wrong resolution for your screen counts as playable, since console users seem happy playing at resolutions other than their TV's native resolution. Most games even default to my monitor's native resolution; it's only a few that I have to adjust.

Quote:
3. EA and other publishers don't fight with Sony or MS on their systems like they do with PC distribution. For just one example, Burnout: Paradise seamlessly integrates with my XBLive account. Picking it up for $5 on Steam means I now also need an Origin account username/password to remember. Same with GFWL, even though MS said they'd patch it all out. In the PC realm, every publisher wants a slice of the pie and will ensure their authentication systems get in there somehow, because they are assholes, even with already heavily DRMed software like Steam.

Yes, that's a problem that needs fixing.

As for why they don't fight Sony or MS: fighting them isn't an option. At one point it cost $40,000 to patch console games, so developers had a choice: pay the fee or don't patch the game. While that fee is gone, I don't see anything to suggest that the relationship between publisher and Sony/MS has changed. Sony/MS set the rules, and those rules get followed if you want to put your game on their platforms.

Arthur_Tuxedo
PostPosted: 2015-02-09 07:00pm 

Sith Acolyte


Joined: 2002-07-23 03:28am
Posts: 5361
Location: San Francisco, California
Vendetta wrote:
Arthur_Tuxedo wrote:
Now that I think about it, another question might be "Will this be the LAST console generation?" Once PCs (maybe in a Steambox-style case) can be connected wirelessly to both TVs and gamepads with ease, and more casual / less graphically intense games can be played on a smartphone that can also easily stream to an HDTV and connect to a gamepad, where does a dedicated console fit? The Xbone and PS4's media capabilities are built right into the TV these days, and a Roku box is much cheaper and friendlier. Will the whole idea of a game console be obsolete when it's time to refresh the lineup?


People have been saying that for the last two generations.

PCs are unlikely ever to take the living room spot of consoles, no matter how easy the user interface becomes, because the PC is a fundamentally variable platform. When you buy a console game, you know reliably not just that it will work on your console but that it will work the way it was intended to (or at least as well as the developers were able to make it work). The framerate will be the same for you as for everyone else, the graphics will be the same, the audio will be the same, and so on. You'll be playing the game the way it was designed, and you don't have to fuck about in an options menu for every single game to figure out how to make it work acceptably on your particular hardware.

Yes, but you can also say all of those things about the iPhone, and that's rapidly approaching graphical and computational parity with current consoles. With an easy, Bluetooth-style standard to throw video onto an HDTV and a Bluetooth gamepad, you're really not missing anything from the console experience, and with carrier upgrade programs like T-Mobile's Jump and Sprint's One Up, it becomes easy and inexpensive to upgrade every year.

PCs will take a bit longer to get there, but it's very possible to envision a PC that connects wirelessly to an HDTV (perhaps even from a different room) and a gamepad, where turning on the gamepad causes Windows 10 or its successor to switch to a Metro-style tile interface listing games to play, with graphics settings defaulted to an optimized state a la Nvidia's GeForce Experience, which I use today and which works very well. It will take a few years to come together seamlessly, but the technology to do it exists today, and I can't see what advantage a console would offer over that setup.

Quote:
And if you make a fixed target platform like a steambox and make games work in a specific way on that hardware so the user doesn't have to piss about configuring it, congratulations you just made a console.

I know. I guess what I'm saying is not necessarily that consoles will die, but that PCs will become consoles and vice versa on the hardcore end of the spectrum, while the casual gamer space gets gobbled up by smartphones and tablets, leaving no room in the market for a dedicated console.

TheFeniX
PostPosted: 2015-02-09 07:03pm 

Sith Devotee


Joined: 2003-06-26 04:24pm
Posts: 2753
Location: Texas
bilateralrope wrote:
I do get a playable game on default settings every time. Well, I assume that running at the wrong resolution for your screen counts as playable, since console users seem happy playing at resolutions other than their TV's native resolution. Most games even default to my monitor's native resolution; it's only a few that I have to adjust.
You actually understand the term "native resolution." The class of people who argue so strongly against PC gaming do not. They don't understand why 60 FPS is better than 30 FPS; that's "nerd shit" they don't care about, which is fine. But this is also why the PC has never made advances into the living room, even though the technology has existed in many forms for years. It's as simple now as it's ever been, but there's still a large stigma attached to it.

Quote:
As for why they don't fight Sony or MS, that's because fighting them isn't an option.
::snip::
Sony/MS set rules, those rules get followed if you want to put your game on their platforms.
Exactly, they have that kind of buying power. For Valve to get near that, they'd have to have their own proprietary platform. So then we'd just have another console manufacturer on the market. And any cost-competitive console Valve sells at retail is just going to be another console that a mid-range PC can kick the shit out of.

Joun_Lord
PostPosted: 2015-02-10 02:03am 

Padawan Learner


Joined: 2014-09-27 01:40am
Posts: 428
Location: West by Golly Virginia
I doubt this will be the last console generation, just because console gamers still want the ability to choose when they play games. The Xbone's original always-online mode (which was totally built into the console, and there was totally no way the Xbone could work offline) did not go over too well. Console gamers, for now, want to be able to just plug their shit into a TV and play it: in the middle of nowhere where the high-speed interwebs is barely above dial-up, in the snowy mountains of Alaska where a moose goes faster than the web if they even get it, and at some FOB in Afghanistan where signals are spotty at best.

But that want to be "unplugged", so to speak, is probably just inertia from gamers being used to playing offline, plus the lack of reliable internet coverage, download caps errywhere, and people's love of used games. Younger gamers might feel differently, I dunno, and with efforts by Google and our own gubmint to expand coverage and up speeds, the complaints people have about being online might shrink or disappear. But for now, people want "dumb" consoles that they can just plug in and game on with no other hassle.

For me personally, I'd prefer consoles to be stretched out as long as possible. Not because I play consoles; it's because it helps keep PC gaming specs from shooting up to stupid extremes. I liked being able to play modern games reliably on a computer from 2007 (with upgrades, of course) up until relatively recently.

Maybe it's because I'm a cheap-ass, but it's also because I don't care about having supercomputer nose-hair-rendered 1080fuck graphics. If the graphics are serviceable, I don't give a shit. Seems a lot of other people feel that way too, given the popularity of Source games and Bethesda games that look decent but nothing special.

Hell, for some people graphics don't seem to matter at all, looking at shit like Minecraft and some MMOs and multiplayer games I've seen, some of which look like they'd be at home on an NES console as old as I am.

Jub
PostPosted: 2015-02-10 02:36am 

Jedi Council Member


Joined: 2012-08-06 07:58pm
Posts: 1775
Location: British Columbia, Canada
Joun_Lord wrote:
For me personally, I'd prefer consoles to be stretched out as long as possible. Not because I play consoles; it's because it helps keep PC gaming specs from shooting up to stupid extremes. I liked being able to play modern games reliably on a computer from 2007 (with upgrades, of course) up until relatively recently.

Maybe it's because I'm a cheap-ass, but it's also because I don't care about having supercomputer nose-hair-rendered 1080fuck graphics. If the graphics are serviceable, I don't give a shit. Seems a lot of other people feel that way too, given the popularity of Source games and Bethesda games that look decent but nothing special.

Hell, for some people graphics don't seem to matter at all, looking at shit like Minecraft and some MMOs and multiplayer games I've seen, some of which look like they'd be at home on an NES console as old as I am.


So you're saying everybody should suffer subpar graphics, fewer physics interactions, and not get to see what better hardware can do, because cheap-asses like you don't want to spend money on computer upgrades and want to keep using a shit box from nearly a decade ago for a few more years?

Hardware improvements aren't just about graphics either. Pushing consumer-grade hardware generates the money for better chips, which hastens the point where we hit the limits of silicon and have to figure out the next step. Frankly, I want computers to start running away again and becoming obsolete in months like they did in the 2000s, because look how far that got us in a short span. I want the future, and I don't just want it when it comes; I want it faster.

Arthur_Tuxedo
PostPosted: 2015-02-10 03:20am 

Sith Acolyte


Joined: 2002-07-23 03:28am
Posts: 5361
Location: San Francisco, California
I've never understood the "graphics don't matter, gameplay does" mantra. Graphics are part of gameplay and of the experience, and whole new types of gameplay become available as technology advances. Skyrim isn't just a prettier Daggerfall. It presented a world and its characters in a level of detail that never would have been possible back then, and is therefore a qualitatively different and better experience. The fact that (for example) the latest CoD doesn't offer anything substantial over the first Modern Warfare doesn't indict the entire concept of technical progress.

DaveJB
PostPosted: 2015-02-10 06:22am 

Jedi Council Member


Joined: 2003-10-06 05:37pm
Posts: 1862
Location: Leeds, UK
I don't think anyone claims that graphics are completely irrelevant, just that they can only go so far in covering up a game's other flaws. To use an example, Doom 3 was a more technically advanced game than Half-Life 2, yet the latter is remembered as a landmark for the FPS genre, while if the former's remembered at all it's mostly as that game where you stumble around in the dark trying to shoot stuff.

Quote:
Frankly, I want computers to start running away again and becoming obsolete in months like they did in the 2000s, because look how far that got us in a short span. I want the future, and I don't just want it when it comes, I want it faster.

Not likely to happen anytime soon, I'm afraid. For one thing, technology advanced so quickly in those days mostly because of the intense competition between Intel and AMD. Nowadays AMD offer hardly any competition on the CPU front, so Intel have no reason to push the envelope the way they were doing back then. And for another thing, thermal limitations are going to restrict any huge leaps in performance unless the next generation goes back to having separate CPUs and graphics chips (which, by all indications, ranks second only to "Nintendo becomes the only console manufacturer in town" on the list of things developers don't want).

Purple
PostPosted: 2015-02-10 07:25am 

Sith Marauder


Joined: 2010-04-20 08:31am
Posts: 3751
Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.
DaveJB wrote:
I don't think anyone claims that graphics are completely irrelevant, just that they can only go so far in covering up a game's other flaws. To use an example, Doom 3 was a more technically advanced game than Half-Life 2, yet the latter is remembered as a landmark for the FPS genre, while if the former's remembered at all it's mostly as that game where you stumble around in the dark trying to shoot stuff.

This. Simply put, every video game is made on a budget, and not just one of development money and time but also of the hardware it's supposed to run on. If you know you have X GHz of processor power and Y GB of memory to work with, you can either fill that up with fancy graphics and high-detail textures OR use it for other, non-graphical stuff. And what gaming history has proven in this regard is that going too far into the "fancy graphics" side of things while ignoring the rest makes for a shitty game, whereas going too far into the "other stuff" still produces good results if that other stuff has been done well. You might need to change the paradigm of what you're trying to achieve, but it can at least end up being a good game, where a super-advanced graphics demonstrator that exists just to show off how awesome your graphics are cannot.

So, simply put, what we mean when we say that is that a good game with bad graphics is far more likely to be enjoyed than a bad game with good graphics.

Joun_Lord
PostPosted: 2015-02-10 04:05pm 

Padawan Learner


Joined: 2014-09-27 01:40am
Posts: 428
Location: West by Golly Virginia
Jub wrote:
Joun_Lord wrote:
For me personally, I'd prefer consoles to be stretched out as long as possible. Not because I play consoles, but because it helps prevent PC gaming specs from shooting up to stupid extremes. I liked being able to play modern games reliably on a computer from 2007 (with upgrades, of course) up until relatively recently.

Maybe it's because I'm a cheap-ass, but it's also because I don't care about having super computer nose hair rendered 1080fuck graphics. If the graphics are serviceable, I don't give a shit. Seems a lot of other people feel that way too, given the popularity of Source games and Bethesda games that look decent but nothing special.

Hell, for some people graphics don't seem to matter at all, judging by shit like Minecraft and some MMOs and multiplayer games I've seen, some of which look like they'd be at home on a NES console as old as I am.


So you're saying everybody should suffer subpar graphics, fewer physics interactions, and never get to see what better hardware can do, because cheap asses like you don't want to spend money on computer upgrades and want to keep using a shit box from nearly a decade ago for a few more years?

Hardware improvements aren't just about graphics either. Pushing consumer-grade hardware up generates money for better chips to come out, which hastens the point where we hit the limits of silicon and have to figure out the next step. Frankly, I want computers to start running away again and becoming obsolete in months like they did in the 2000s, because look how far that got us in a short span. I want the future, and I don't just want it when it comes, I want it faster.


No, I'm saying devs should use what they've got to its fullest before upgrading to the latest nose hair rendering mega-card of DOOM. Devs shouldn't use stupidly high requirements in place of properly coding their games to run on something that isn't a superduperpoopercomputer.

Switching from dual cores to quadzillamega cores with limited upgrades in the gaming experience and graphics just seems foolish. Take Saints Row 3 compared to 4: there isn't much of a difference between the games in graphics or anything else, but the 4th game requires a quad core. Or take its earlier iteration, Saints Row 2, which came out around the same time as GTA 4; both had high requirements for their terrible PC ports. GTA 4 was badly coded, but on computers that could run it, it actually looked good. SR2 looked like assified ass with a healthy side helping of ass, with terrible draw distances and physics, yet some of SR2's minimum requirements were higher than GTA Quadruple's.

Plus, again, there's the money reason for not wanting a computer obsolete almost as soon as you open it. For people who have to work for a living, pay bills, buy stuff to shovel down their throats, and tacos, spending up to a grand a year on a new computer is not something most people can or want to do.

And it would hurt PC gaming if only rich-ass motherfuckers, or children whining until their rich-ass mommies and daddies buy them a new puter, could play the latest whiz-bang graphical failure. One of the benefits of PC gaming is the price. Sure, you might have to drop more on a gaming PC than you would on an Xboner or Playwithyourselfstation Quadruple, but the computer lasts longer and can be upgraded without buying a whole new system.

Arthur_Tuxedo wrote:
I've never understood the "graphics don't matter, gameplay does" mantra. Graphics are part of gameplay and of the experience, and whole new types of gameplay become available as technology advances. Skyrim isn't just a prettier Daggerfall. It presented a world and its characters in a level of detail that never would have been possible back then, and is therefore a qualitatively different and better experience. The fact that (for example) the latest CoD doesn't offer anything substantial over the first Modern Warfare doesn't indict the entire concept of technical progress.


Graphics are important, sometimes more so depending on the game or gamer, but they are, as you said, part of the whole experience.

Some devs seem to want to make the latest, shiniest graphics that render every single blade of hair, forgetting that this isn't a painting people want to stare at and spin wild theories about what it means; it's something people want to play.

Playing something that is super pretty but has terrible gameplay and no new concepts is not fun. A game that is pretty enough to get the job done without turning your machine into the Sahara desert, and that has interesting gameplay and some new shit to do, is considered by many to be the superior option.
