The incredibly controversial chromatic aberration effect in Assassin’s Creed Mirage is being turned off for all players, with an option to turn it back on.
Glad there isn’t even one picture to show what the effect is. Are they talking about lens flare or something similar but different?
It is more or less a color halo outlining everything. It was supposed to simulate the subtle visual distortion of older camera lenses, but… who the hell even wants that?
I will never understand what asshole thought adding chromatic aberration into EVERY GAME EVER was a good idea.
Probably someone that also likes CBT and the show Big Brother.
Every gimmick for verisimilitude gets abused to hell and back. We just gloss over the ones that are less frustrating to the goal of… lookin’ at stuff.
Destiny and Warframe are awash in gold because physically-based shaders made metals look super good. Ambient occlusion was egregious after Crysis, but games without a little bit feel weird now, and even the Wii got coerced into doing it efficiently. HDR tone-mapping was part of the brown-and-bloom era, but it’s still here and you’d never think twice about it.
Lens flare is more common than ever, but much better than its goofy line-of-sprites roots in the 90s, because you blur the whole screen and flip it. It doesn’t have to be blinding to be obvious and… aesthetic.
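In case anyone wants to poke at the flip trick, here’s a minimal numpy sketch of the idea under my own assumptions (a float RGB frame in [0, 1], a hard bright-pass threshold, a lazy repeated box blur) - not how any particular engine actually does it:

```python
import numpy as np

def pseudo_lens_flare(img, threshold=0.8, blur_passes=4):
    """Crude screen-space lens flare: keep only the bright parts, mirror
    them through the screen centre, and blur the result.
    `img` is assumed to be a float HxWx3 array in [0, 1]."""
    # Bright-pass: anything below the threshold contributes nothing.
    bright = np.clip(img - threshold, 0.0, None)
    # "Flip it": a 180-degree rotation mirrors highlights through the centre,
    # roughly where the ghosts of a real lens tend to land.
    ghost = bright[::-1, ::-1, :]
    # Cheap blur: a few box-filter passes (edges wrap, fine for a sketch).
    for _ in range(blur_passes):
        ghost = (np.roll(ghost, 1, axis=0) + np.roll(ghost, -1, axis=0) +
                 np.roll(ghost, 1, axis=1) + np.roll(ghost, -1, axis=1) +
                 ghost) / 5.0
    # Composite the flare back over the original frame.
    return np.clip(img + ghost, 0.0, 1.0)
```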
“God rays” and participating media / volumetric fog have been admirably restrained, considering how stupidly pretty they look, and the fact PS3 launch titles figured out you can just do it badly and blur. Downright awful sampling works so long as it’s different awful sampling from nearby pixels. Even Quake 3 did some sparse approximations on the CPU. I guess thick fog is just undesirable to developers, now that it’s not disguising tiny worlds or keeping framerates tolerable.
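The “different awful sampling” bit is basically dithered raymarching. Here’s a toy sketch in the spirit of the old screen-space light-shaft trick (my own simplification, assuming an occlusion buffer where 1 = light visible and a light position in pixel coordinates): each pixel marches a handful of samples toward the light, and a per-pixel random offset turns the banding into noise that neighbouring pixels hide for each other.

```python
import numpy as np

def god_rays(occlusion, light_xy, samples=8, decay=0.95, seed=0):
    """Screen-space light shafts with very few samples per pixel.
    The per-pixel jitter is the whole trick: every pixel gets a
    *different* bad sample pattern, so banding reads as faint noise."""
    h, w = occlusion.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    jitter = np.random.default_rng(seed).random((h, w), dtype=np.float32)
    # Step vector from each pixel toward the light's screen position.
    dx = (light_xy[0] - xs) / samples
    dy = (light_xy[1] - ys) / samples
    result = np.zeros((h, w), dtype=np.float32)
    weight = 1.0
    for i in range(samples):
        t = i + jitter  # jittered position along the ray
        sx = np.clip((xs + dx * t).astype(int), 0, w - 1)
        sy = np.clip((ys + dy * t).astype(int), 0, h - 1)
        result += occlusion[sy, sx] * weight
        weight *= decay  # samples farther from the pixel count for less
    return result / samples
```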
Unfortunately we can probably expect shakycam to take off after Unrecord. That game does a ton to look shockingly realistic, but a lot of companies will overdo about half of its tricks, and not understand why their playtesters have such queasy stomachs.
They added it because they can and I am told that it was easy to do.
There’s nothing wrong with cock and ball torture.
As an amateur astronomer with a strong eyeglass prescription, chromatic aberration is the bane of my existence. I get why they try to simulate a camera, but the more I can avoid the pitfalls of cheap, low-quality lenses, the better. I already have two of them on my face all the time.
Same, I even pay hundreds of dollars out of pocket to have glass lenses cut, because I legitimately don’t understand how people deal with the chromatic distortion and starburst effects that come with high-index plastic.
I mean, I do get it - people just don’t know any better. What I don’t get is why a literal doctor of optometry will look at you like you’ve got three heads when you start asking about the superior optical properties of glass.
Like bloom in the 7th gen, it was the style at the time. Someone had the shitty idea that the “camera” in games should mimic cameras (bad ones), I guess some exec liked it, and it spread across all the AAA games.
I guess now we’re going back on that, like we did with the “brown and grey = realism” fad.
I do see bloom and light halos/rays at night thanks to an astigmatism
I don’t understand this need to make games look like they’re filmed on a camera; it kills the immersion for me. I’m playing a fantasy game, and I don’t want it to look like I’m watching a shitty video. Some games also do this thing where going from dark to light makes the screen go super white so you can barely see for a second. Cameras do that very noticeably, but eyeballs don’t.
The only games that do the whole “blinded by the sun” thing worth a damn are the Fallout games, and it makes sense because you’ve never seen the sun in your life.
Everywhere else it needs to piss off.
Been playing Dead Island 2, and every time you go from inside to daylight the whole screen essentially goes white, and I hate it.
It’s especially annoying when racing games do that shit.
What’s really fucking stupid is that they’ll make it take forever for your “sight” to adjust when going from dark to light, but driving into a tunnel gives you near-instant perfect night vision. That’s not how vision or cameras work. Not only that, but you can always see further into tunnels than you possibly could in real life.
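For what it’s worth, most engines just ease an “adapted” luminance toward the frame’s measured luminance with two separate rates, one for getting brighter and one for getting darker, so the backwards tunnel behaviour is literally two constants somebody tuned. A rough sketch with made-up numbers:

```python
import math

def adapt_luminance(adapted, measured, dt, speed_up=3.0, speed_down=0.6):
    """Ease the viewer's adapted luminance toward the scene's measured
    average luminance. Separate brighten/darken rates are what produce
    'instant night vision in tunnels, ten-second whiteout on the way out'
    when they're tuned backwards from how eyes actually behave."""
    rate = speed_up if measured > adapted else speed_down
    # Frame-rate-independent exponential smoothing.
    return measured + (adapted - measured) * math.exp(-rate * dt)
```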
Eyeballs aren’t as bad as cameras, but they definitely also do that.
You’re still using your eyeballs!
The correct way of implementing chromatic aberration would be like the one on the “corrected” side. There is still some, but it really is subtle.
Anyway, I don’t think games are a good target for chromatic aberration. It’s really meant for photorealistic scenes, mainly photorealistic renders, which have a sort of uncanny-valley look without it.
But once again - it looks stupid if your scene is not photo-realistic in the first place.
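If anyone wants to see how little there is to it: the usual implementation just samples the red and blue channels at slightly scaled coordinates, so the fringing grows toward the screen edges. A quick numpy sketch of a “subtle” version (the strength value is only my guess at what stays tasteful):

```python
import numpy as np

def chromatic_aberration(img, strength=0.0015):
    """Radial chromatic aberration: sample the red and blue channels at
    slightly different radial scales than green, so colour fringes grow
    toward the screen edges. `img` is a float HxWx3 array; a strength
    around 0.001-0.002 stays in 'barely noticeable' territory."""
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    out = img.copy()
    for channel, scale in ((0, 1.0 + strength), (2, 1.0 - strength)):
        # Sample this channel at coordinates scaled about the centre.
        sx = np.clip(((xs - cx) * scale + cx).astype(int), 0, w - 1)
        sy = np.clip(((ys - cy) * scale + cy).astype(int), 0, h - 1)
        out[..., channel] = img[sy, sx, channel]
    return out
```

Crank the strength up by an order of magnitude and you get the smeared-screenshot look everyone is complaining about.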
Is it supposed to highlight the “mirage-ness” of it all or something?
It is supposed to mimic low-quality cameras. Chromatic aberration occurs because different colors of light focus at slightly different distances from the lens. This is the same effect that causes prisms to “split” white light into its component colors, i.e., the angle at which light is bent depends on its wavelength/color. Newer, more expensive cameras have various means of either correcting for or avoiding the problem.
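To put a rough number on “different colors focus at different distances”: a quick back-of-the-envelope with Snell’s law and an approximate Cauchy dispersion model for a BK7-like crown glass (the coefficients are the usual textbook approximations, so take the exact figures loosely):

```python
import math

# Approximate Cauchy dispersion for a BK7-like crown glass:
# n(lambda) ~= A + B / lambda^2, with lambda in micrometres.
A, B = 1.5046, 0.00420

def refraction_angle(wavelength_um, incidence_deg=45.0):
    """Angle of refraction entering the glass from air, via Snell's law."""
    n = A + B / wavelength_um ** 2
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

blue = refraction_angle(0.45)  # ~450 nm
red = refraction_angle(0.65)   # ~650 nm
print(f"blue bends to {blue:.2f} deg, red to {red:.2f} deg")
# The difference is only a fraction of a degree, but across a whole lens
# stack and a sensor it lands red and blue on slightly different pixels,
# which is exactly the colour fringe you see.
```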
Putting on my “that guy” hat here…
The quality has nothing to do with it. Even very high end lenses can exhibit chromatic aberration under certain circumstances. Have a look at any sports broadcast. Once you see it, you can’t stop seeing it, and the lenses on those cameras are decidedly NOT low quality. Or low priced. https://www.bhphotovideo.com/c/product/1314025-REG/canon_uj86x9_3b_p01_dss_uhd_digisuper_86_broadcast.html
That said, even low-end lenses from the past decade or so have far less chromatic aberration than top-tier glass from decades back. I have an old Canon telephoto that produces crazy color fringes on anything and everything if I’m not careful, but my new cheapass Lumix zoom only does so in pretty extreme situations.
It’s definitely a good time to be a photography nerd.
It took me way too long to spot the difference.
In games it’s usually way less subtle