93143 wrote:
rainwarrior wrote:
The hard part about refraction in games is how you render/determine what's "under" the surface of the water. If the surface of the water was flat and still, you could render the entire scene upside down under the water, and use that as your reflected version, or an offset lookup to that for your refracted version.
...I can't tell if you actually know what refraction is. It's got nothing to do with reflection; it's just the distortion of the image of what's actually under the water - rocks, weeds, fish, sunken chests and so forth - due to the bending of light being transmitted up through the surface in accordance with Snell's Law.
Sorry, I conflated the two things a little when I said "upside down", but refraction and reflection are physically tied together, and usually both are needed together for a simulation of water.
So, the implementation I was talking about has two lookup textures, one for reflection, one for refraction. The refraction texture is the surface below the water (possibly just all the opaque stuff in the scene that's rendered and saved off to a texture before you start doing translucent stuff like water on top), and the reflection texture is a separate version of the scene rendered upside down, flipped through the plane of the water (ignoring its perturbations).
So, for the refraction component, you take the view direction vs. the surface normal of the water, and you displace the texture lookup based on that. It's not accurate to the actual angles, but you at least get a continuous effect where a shallower viewing angle creates a stronger distortion.
The reflection component is the same idea but with the reflected view normal, and looking up into that upside down scene reflection texture instead. The two results are blended, the refracted light coming from under the surface, and the reflected light bouncing off it. The blend might be altered based on viewing angle, depending on how you want to simulate this.
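As a concrete sketch of that lookup-and-blend idea, here's the logic in Python-as-pseudo-shader form. The `strength` constant and the Schlick F0 value are illustrative choices, not from any particular engine, and the UV math is the same "not accurate to the actual angles, but continuous" fake described above:

```python
def water_shade(view_dir, normal, base_uv, strength=0.05):
    """Fake refraction/reflection UV offsets for a water surface.

    view_dir, normal: unit 3-vectors (x, y, z), y up.
    Returns (refract_uv, reflect_uv, reflect_weight).
    Constants (strength, Schlick F0) are illustrative, not canonical.
    """
    # Displace the refraction lookup along the normal's horizontal
    # components, scaled by how glancing the view is. Not Snell-accurate,
    # but a shallower viewing angle gives a stronger distortion.
    cos_theta = abs(view_dir[0]*normal[0] + view_dir[1]*normal[1]
                    + view_dir[2]*normal[2])
    glancing = 1.0 - cos_theta
    refract_uv = (base_uv[0] + normal[0] * strength * (1.0 + glancing),
                  base_uv[1] + normal[2] * strength * (1.0 + glancing))

    # The reflection lookup indexes the upside-down scene texture
    # (hence the flipped v coordinate), perturbed the same way.
    reflect_uv = (base_uv[0] + normal[0] * strength,
                  1.0 - base_uv[1] + normal[2] * strength)

    # Schlick's approximation blends the two: more reflected light at
    # glancing angles. F0 ~= 0.02 for an air/water interface.
    f0 = 0.02
    reflect_weight = f0 + (1.0 - f0) * glancing**5
    return refract_uv, reflect_uv, reflect_weight
```

Looking straight down at a flat patch of water (`view_dir` opposite `normal`), the offsets vanish and the reflection weight collapses to F0, which matches the intuition that still water seen from above is mostly transparent.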
What I was saying is that this particular way of faking water refraction (and reflection) is prone to error at the edges of the water especially, which is why you might want to have some vertex weight or something to fade the strength of the effect out at the edges. There are many other ways to simulate both reflection and refraction, though. This is just one thing I've used and seen used in several places.
Reflections are often done with cube maps (or other kind of environment map) where some static approximation of the scene, or often just the sky, is used in place of the actual reflected scene. This does have the potential to simulate how reflections change drastically according to viewing angle, so it's often pretty effective at simulating the feel of reflection. You can use the same technique for refraction, but it tends to be a bit less applicable/convincing.
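The lookup direction for that cube/environment map is just the standard reflection of the view vector about the surface normal, r = v - 2(v·n)n; a minimal sketch:

```python
def reflect(view, normal):
    """Reflect the (unit) view direction about the surface normal:
    r = v - 2 (v . n) n. The result is used to index a cube map
    or other environment map."""
    d = view[0]*normal[0] + view[1]*normal[1] + view[2]*normal[2]
    return tuple(v - 2.0 * d * n for v, n in zip(view, normal))
```

Because the normal varies per pixel (or per vertex) across a rippled surface, the reflected direction, and hence the sampled environment color, changes drastically with viewing angle, which is what sells the effect.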
93143 wrote:
And I've seen people fake the distortion of that transmitted image due to surface disturbances. What I've never seen is modelling of the average effect, the one that's still there even if the water is completely still. If you aim at a fish with a spear in a video game, you will hit it, and that's not physically accurate.
Well, I don't know which hypothetical fish-spearing game you're referring to. The pursuit of some aspect of realism and accessible gameplay are often at odds, so I'm not sure the incentive is there to make a properly refracted fish in a lot of games to begin with, even if it were feasible? A game like Fishing Planet might be a good place to go looking for this kind of thing.
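For reference, the still-water effect being described falls straight out of Snell's law; a small worked sketch (taking n ≈ 1.33 for water) of why the fish isn't where it appears:

```python
import math

N_WATER = 1.33  # refractive index of water (approximate)

def apparent_depth(true_depth, n=N_WATER):
    """Looking straight down, a submerged object appears at
    true_depth / n -- roughly 75% of its real depth for water.
    This is the small-angle (paraxial) limit of Snell's law."""
    return true_depth / n

def refraction_angle(incidence_deg, n=N_WATER):
    """Angle (in water, degrees) of a ray entering from air at the
    given incidence angle, from Snell's law: sin(i) = n * sin(r)."""
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))
```

So a fish one meter down looks like it's about 75 cm down, and a ray hitting the surface at 45 degrees bends to roughly 32 degrees inside the water. A game that renders the fish at its true position (as most do) is skipping exactly this correction.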
I think the one big thing that's hard to solve without raytracing is just the uneven surface of the water. If you have an object that is partially in the water and partially out, it's very hard to make that edge match up properly when you're relying on a flat plane approximation underneath, and it often results in seeing "inside" a 3D object that's being cut off or displaced and leaving a hole where it crosses the water. If you can control the player's viewpoint so that you can get away with just normal mapping instead of actually having a non-flat water geometry, there's a lot you can get away with, though.
93143 wrote:
rainwarrior wrote:
The problem is, refraction and water requires a continuous variation of the surface, i.e. every different angle requires a different viewpoint on that reflection. You can't get all of that from one upside-down viewpoint, you'd need a different view from each point on the curved surface. No-go. In general, the technique is to render the upside-down scene once (and save to a texture) and then use the angle of refraction to warp the lookup to that texture.
I've actually tried to figure out how to do this on the Nintendo 64, to get somewhat realistic-looking water reflections. Unfortunately I don't think you can alter the position of a texture read based on the value of the previous texture read, possibly because the texture filter is downstream of the texturing unit so the latter doesn't actually have access to the filtered local value. There are other possible methods, but nothing quite as neat and easy has occurred to me yet...
I don't know quite what you've got available on the N64, but you can simulate both refractions and reflections with vertex effects. This is something vertex shaders can be quite good at on modern GPUs, and even without a GPU to do the grunt work it might be pretty reasonable on the CPU for the right number of vertices.
For refraction, distorting the shape under a planar surface according to the viewing angle is pretty straightforward. Splitting it at the surface might not be quite as easy (though you can just let there be some error on edges that cross the surface). Dealing with a non-planar surface for the water becomes much, much tougher though. (Again, would be trivial for a raytracer...) Clipping planes and multiple passes can help. You can apply other "watery" distortions to the vertices to simulate some wobbly refraction too (similar to how an SNES game might put a sine offset on background scanlines underwater), but that's getting away from accuracy and more toward just simulating the feel of it.
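A minimal sketch of that kind of wobbly vertex distortion, in Python; the amplitude and frequency constants are arbitrary tuning values, not from any real game:

```python
import math

def wobble_underwater(vertices, time, water_y=0.0, amp=0.1, freq=4.0):
    """Apply a sine offset to vertices below the water plane to fake
    wobbly refraction (akin to an SNES per-scanline sine offset).
    Vertices are (x, y, z) tuples, y up; returns a new list.
    amp/freq are illustrative tuning constants."""
    out = []
    for (x, y, z) in vertices:
        if y < water_y:
            # Phase varies with depth so the distortion isn't uniform,
            # and with time so it animates.
            x = x + amp * math.sin(freq * y + time)
        out.append((x, y, z))
    return out
```

On an N64-class machine without vertex shaders, this per-vertex offset is the kind of thing that could plausibly run on the CPU for a modest vertex count, as suggested above.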
Similarly for reflections, you can flip the scene upside down and render it translucently. That can make a perfect looking reflection in a flat plane, at least. The same wobbly vertex modulations can apply to this too.
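Flipping the scene through a horizontal water plane is just a mirror of the y coordinate; a minimal sketch:

```python
def reflect_through_plane(vertices, water_y=0.0):
    """Mirror scene vertices through the horizontal plane y = water_y,
    producing the upside-down copy used for a planar reflection.
    Note the mirroring flips triangle winding order, so back-face
    culling has to be inverted when rendering the reflected pass."""
    return [(x, 2.0 * water_y - y, z) for (x, y, z) in vertices]
```

Rendering that mirrored copy translucently under the water plane gives the perfect flat-plane reflection described above; the same wobbly vertex modulation can then be layered on top.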
Actually, with a modern GPU it might even be pretty feasible to actually raytrace/raymarch some water surface in a shader. It wouldn't combine with the usual raster surfaces everything else is made of, but if the water can kinda be its own self-contained little procedural world it might work. Or if your game is fully about that kind of stuff, like that Voxel Quest experiment, it could be pretty straightforward to implement very accurate refraction.