{"id":91275,"date":"2018-06-22T13:33:42","date_gmt":"2018-06-22T13:33:42","guid":{"rendered":"http:\/\/www.sickgaming.net\/blog\/2018\/06\/22\/creating-toon-water-for-the-web-part-2\/"},"modified":"2018-06-22T13:33:42","modified_gmt":"2018-06-22T13:33:42","slug":"creating-toon-water-for-the-web-part-2","status":"publish","type":"post","link":"https:\/\/sickgaming.net\/blog\/2018\/06\/22\/creating-toon-water-for-the-web-part-2\/","title":{"rendered":"Creating Toon Water for the Web: Part 2"},"content":{"rendered":"<p>Welcome back to this three-part series on creating stylized toon water in PlayCanvas\u00a0using vertex shaders.\u00a0In <a href=\"http:\/\/gamedevelopment.tutsplus.com\/tutorials\/creating-toon-water-for-the-web-part-1--cms-30447\" rel=\"external noopener noreferrer\" target=\"_blank\">Part 1<\/a>, we covered setting up our environment and water surface. This part will cover applying buoyancy to objects, adding water lines to the surface, and creating the foam lines with the depth buffer around the edges of objects that intersect the surface.\u00a0<\/p>\n<p>I made some small changes to my scene to make it look a little nicer. You can customize your scene however you like, but what I did was:<\/p>\n<ul>\n<li>Added the lighthouse and the octopus models.<\/li>\n<li>Added a ground plane with color <code class=\"inline\">#FFA457<\/code>.\u00a0<\/li>\n<li>Added a clear color for the camera of <code class=\"inline\">#6CC8FF<\/code>.<\/li>\n<li>Added an ambient color to the scene of <code class=\"inline\">#FFC480<\/code> (you can find this in the scene settings). 
\n<\/li>\n<\/ul>\n<p>Below is what my starting point now looks like.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"The scene now includes an octopus and a lighthouse\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Initial_Part_2.png\"><\/figure>\n<h2>Buoyancy\u00a0<\/h2>\n<p>The most straightforward way to create buoyancy is just to create a script that will push objects up and down. Create a new script called <strong>Buoyancy.js <\/strong>and set its initialize method to:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">Buoyancy.prototype.initialize = function() { this.initialPosition = this.entity.getPosition().clone(); this.initialRotation = this.entity.getEulerAngles().clone(); \/\/ The initial time is set to a random value so that if \/\/ this script is attached to multiple objects they won't \/\/ all move the same way this.time = Math.random() * 2 * Math.PI;\n};<\/pre>\n<p>Now, in the update, we increment the time, then move and rotate the object:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">Buoyancy.prototype.update = function(dt) { this.time += 0.1; \/\/ Move the object up and down var pos = this.entity.getPosition().clone(); pos.y = this.initialPosition.y + Math.cos(this.time) * 0.07; this.entity.setPosition(pos.x,pos.y,pos.z); \/\/ Rotate the object slightly var rot = this.entity.getEulerAngles().clone(); rot.x = this.initialRotation.x + Math.cos(this.time * 0.25) * 1; rot.z = this.initialRotation.z + Math.sin(this.time * 0.5) * 2; this.entity.setLocalEulerAngles(rot.x,rot.y,rot.z);\n};<\/pre>\n<p>Apply this script to your boat and watch it bob up and down in the water! You can apply this script to several objects (including the camera\u2014try it)! <\/p>\n<h2>Texturing the Surface <br \/>\n<\/h2>\n<p>Right now, the only way you can see the waves is by looking at the edges of the water surface. 
Adding a texture helps make motion on the surface more visible and is a cheap way to simulate reflections and caustics. <\/p>\n<p>You can find a caustics texture online or make your own. <a href=\"https:\/\/i.imgur.com\/079WEyx.png\" rel=\"external noopener noreferrer\" target=\"_blank\">Here&#8217;s one I drew<\/a> in Gimp that you can freely use. Any texture will work as long as it can be tiled seamlessly. <\/p>\n<p>Once you&#8217;ve found a texture you like, drag it into your project&#8217;s asset window. We need to reference this texture in our Water.js script, so create an attribute for it:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">Water.attributes.add('surfaceTexture', { type: 'asset', assetType: 'texture', title: 'Surface Texture'\n});<\/pre>\n<p>And then assign it in the editor:<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"The water texture is added to the water script\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Assign_Water_Texture.png\"><\/figure>\n<p>Now we need to pass it to our shader. Go to <strong>Water.js<\/strong> and set a new parameter in the <code class=\"inline\">CreateWaterMaterial<\/code> function:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">material.setParameter('uSurfaceTexture',this.surfaceTexture.resource);<\/pre>\n<p>Now go into <strong>Water.frag <\/strong>and declare our new uniform:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">uniform sampler2D uSurfaceTexture;<\/pre>\n<p>We&#8217;re almost there. To render the texture onto the plane, we need to know where each pixel is along the mesh. This means we need to pass some data from the vertex shader to the fragment shader.<\/p>\n<h3>Varying Variables <br \/>\n<\/h3>\n<p>A <em>varying<\/em><strong> <\/strong>variable allows you to pass data from the vertex shader to the fragment shader. 
This is the third type of special variable you can have in a shader (the other two being <em>uniform<\/em><strong> <\/strong>and <em>attribute<\/em>). It is defined for each vertex and is accessible by each pixel. Since there are a lot more pixels than vertices, the value is interpolated between vertices (this is where the name &#8220;varying&#8221; comes from\u2014it varies between the values you give it).<\/p>\n<p>To try this out, declare a new variable in <strong>Water.vert<\/strong> as a varying:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">varying vec3 ScreenPosition;<\/pre>\n<p>And then set it to <code class=\"inline\">gl_Position<\/code> after it&#8217;s been computed:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">ScreenPosition = gl_Position.xyz;<\/pre>\n<p>Now go back to <strong>Water.frag<\/strong> and declare the same variable. There&#8217;s no way to print debug output from within a shader, but we can use color to visually debug. Here&#8217;s one way to do this:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">uniform sampler2D uSurfaceTexture;\nvarying vec3 ScreenPosition; void main(void)\n{ vec4 color = vec4(0.0,0.7,1.0,0.5); \/\/ Testing out our new varying variable color = vec4(vec3(ScreenPosition.x),1.0); gl_FragColor = color;\n}<\/pre>\n<p>The plane should now look black and white, where the line separating them is where <code class=\"inline\">ScreenPosition.x<\/code> is 0. Color values only go from 0 to 1, but the values in <code class=\"inline\">ScreenPosition<\/code> can be outside this range. They get automatically clamped, so if you&#8217;re seeing black, that could be 0, or negative. <\/p>\n<p>What we&#8217;ve just done is pass the screen position of every vertex to every pixel. You can see that the line separating the black and white sides is always going to be in the center of the screen, regardless of where the surface actually is in the world. 
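<\/p>
<p>The interpolation the GPU performs for a varying is a weighted (barycentric) average of the values at the three vertices of the triangle containing the pixel. Below is a conceptual sketch in plain JavaScript of what the rasterizer does for each pixel. The <code class=\"inline\">interpolateVarying<\/code> function is purely illustrative, not part of PlayCanvas, and you never write this code yourself:<\/p>

```javascript
// Conceptual model of how a "varying" reaches the fragment shader:
// a pixel inside a triangle receives a weighted average of the value
// at the triangle's three vertices (the weights are the pixel's
// barycentric coordinates, which always sum to 1).
function interpolateVarying(vertexValues, weights) {
  return vertexValues[0] * weights[0] +
         vertexValues[1] * weights[1] +
         vertexValues[2] * weights[2];
}

// A pixel sitting exactly on vertex 0 gets that vertex's value
console.log(interpolateVarying([0.0, 1.0, 0.5], [1, 0, 0])); // 0
// A pixel at the triangle's center gets the average of all three
console.log(interpolateVarying([0.0, 1.0, 0.5], [1 / 3, 1 / 3, 1 / 3])); // about 0.5
```

<p>This is why a value set per vertex produces smooth gradients across a face, like the black-to-white gradient we just rendered. 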
<\/p>\n<blockquote><p>\n<em>Challenge #1: Create a new varying variable to pass the world position instead of the screen position. Visualize it in the same way as we did above. If the color doesn&#8217;t change with the camera, then you&#8217;ve done this correctly. <\/em>\n<\/p><\/blockquote>\n<h3>Using UVs\u00a0<\/h3>\n<p>The <a href=\"https:\/\/en.wikipedia.org\/wiki\/UV_mapping\" rel=\"external noopener noreferrer\" target=\"_blank\">UV<\/a>s are the 2D coordinates for each vertex along the mesh, normalized from 0 to 1. This is exactly what we need to sample the texture onto the plane correctly, and it should already be set up from the previous part. <\/p>\n<p>Declare a new attribute in <strong>Water.vert<\/strong> (this name comes from the shader definition in Water.js):<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">attribute vec2 aUv0;<\/pre>\n<p>And all we need to do is pass it to the fragment shader, so just create a varying and set it to the attribute:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">\/\/ In Water.vert\n\/\/ We declare this along with our other variables at the top\nvarying vec2 vUv0; \/\/ ..\n\/\/ Down in the main function, we store the value of the attribute \/\/ in the varying so that the frag shader can access it vUv0 = aUv0;\n<\/pre>\n<p>Now we declare the same varying in the fragment shader. To verify it works, we can visualize it as before, so that Water.frag now looks like:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">uniform sampler2D uSurfaceTexture;\nvarying vec2 vUv0; void main(void)\n{ vec4 color = vec4(0.0,0.7,1.0,0.5); \/\/ Confirming UV's color = vec4(vec3(vUv0.x),1.0); gl_FragColor = color;\n}<\/pre>\n<p>And you should see a gradient, confirming that we have a value of 0 at one end and 1 at the other. 
Now, to actually sample our texture, all we have to do is:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">color = texture2D(uSurfaceTexture,vUv0);<\/pre>\n<p>And you should see the texture on the surface:<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Caustics texture is applied to the water surface\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Texture_On_Surface.png\"><\/figure>\n<h3>Stylizing the Texture<\/h3>\n<p>Instead of just setting the texture as our new color, let&#8217;s combine it with the blue we had:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">uniform sampler2D uSurfaceTexture;\nvarying vec2 vUv0; void main(void)\n{ vec4 color = vec4(0.0,0.7,1.0,0.5); vec4 WaterLines = texture2D(uSurfaceTexture,vUv0); color.rgba += WaterLines.r; gl_FragColor = color;\n}<\/pre>\n<p>This works because the color of the texture is black (0) everywhere except for the water lines. By adding it, we don&#8217;t change the original blue color except for the places where there are lines, where it becomes brighter.\u00a0<\/p>\n<p>This isn&#8217;t the only way to combine the colors, though. <\/p>\n<blockquote><p><em>Challenge #2: Can you combine the colors in a way to get the subtler effect shown below?<\/em><\/p><\/blockquote>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Water lines applied to the surface with a more subtle color\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Subtle_Water_Lines.png\"><\/figure>\n<h3>Moving the Texture<\/h3>\n<p>As a final effect, we want the lines to move along the surface so it doesn&#8217;t look so static. To do this, we use the fact that any value given to the <code class=\"inline\">texture2D<\/code> function outside the 0 to 1 range will wrap around (such that 1.5 and 2.5 both become 0.5). 
So we can increment our position by the time uniform variable we already set up and multiply the position to either increase or decrease the density of the lines in our surface, making our final frag shader look like this:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">uniform sampler2D uSurfaceTexture;\nuniform float uTime;\nvarying vec2 vUv0; void main(void)\n{ vec4 color = vec4(0.0,0.7,1.0,0.5); vec2 pos = vUv0; \/\/ Multiplying by a number greater than 1 causes the \/\/ texture to repeat more often pos *= 2.0; \/\/ Displacing the whole texture so it moves along the surface pos.y += uTime * 0.02; vec4 WaterLines = texture2D(uSurfaceTexture,pos); color.rgba += WaterLines.r; gl_FragColor = color;\n}<\/pre>\n<h2>Foam Lines &amp; the Depth Buffer<\/h2>\n<p>Rendering foam lines around objects in water makes it far easier to see how objects are immersed and where they cut the surface. It also makes our water look a lot more believable. To do this, we need to somehow figure out where the edges are on each object, and do this efficiently. <\/p>\n<h3>The Trick<br \/>\n<\/h3>\n<p>What we want is to be able to tell, given a pixel on the surface of the water, whether it&#8217;s close to an object. If so, we can color it as foam. There&#8217;s no straightforward way to do this (that I know of). So to figure this out, we&#8217;re going to use a helpful problem-solving technique: come up with an example we know the answer to, and see if we can generalize it.\u00a0<\/p>\n<p>Consider the view below. <\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Lighthouse in water\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Foam_Example_A.png\"><\/figure>\n<p>Which pixels should be part of the foam? 
We know it should look something like this:<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Lighthouse in water with foam \" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Foam_Example_A_Filled.png\"><\/figure>\n<p>So let&#8217;s think about two specific pixels. I&#8217;ve marked two with stars below. The black one is in the foam. The red one is not. How can we tell them apart inside a shader?<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Lighthouse in water with two marked pixels\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Foam_Example_A_Marked.png\"><\/figure>\n<p>What we know is that even though those two pixels are close together in screen space (both are rendered right on top of the lighthouse body), they&#8217;re actually far apart in world space. We can verify this by looking at the same scene from a different angle, as shown below.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Viewing the lighthouse from above\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Foam_Example_B_Marked.png\"><\/figure>\n<p>Notice that the red star isn&#8217;t on top of the lighthouse body as it appeared, but the black star actually is. We can tell them apart using the distance to the camera, commonly referred to as &#8220;depth&#8221;, where a depth of 1 means it&#8217;s very close to the camera and a depth of 0 means it&#8217;s very far.\u00a0 But it&#8217;s not just a matter of the absolute world distance, or depth, to the camera. It&#8217;s the depth <em>compared to the pixel behind<\/em>. <\/p>\n<p>Look back to the first view. Let&#8217;s say the lighthouse body has a depth value of 0.5. The black star&#8217;s depth would be very close to 0.5. So it and the pixel behind it have similar depth values. 
The red star, on the other hand, would have a much larger depth, because it would be closer to the camera, say 0.7. And yet the pixel behind it, still on the lighthouse, has a depth value of 0.5, so there&#8217;s a bigger difference there. <\/p>\n<p>This is the trick. <em>When the depth of the pixel on the water surface is close enough to the depth of the pixel it&#8217;s drawn on top of, we&#8217;re pretty close to the edge of something<\/em>, and\u00a0we can render it as foam.\u00a0 <\/p>\n<p>So we need more information than is available in any given pixel. We somehow need to know the depth of the pixel that it&#8217;s about to be drawn on top of. This is where the depth buffer comes in. <\/p>\n<h3>The Depth Buffer<br \/>\n<\/h3>\n<p>You can think of a buffer, or a framebuffer, as just an off-screen render target, or a texture. You would want to render off-screen\u00a0when you&#8217;re trying to read data back, <a href=\"https:\/\/gamedevelopment.tutsplus.com\/tutorials\/how-to-write-a-smoke-shader--cms-25587\" rel=\"external noopener noreferrer\" target=\"_blank\">a technique that this smoke effect employs.<\/a><\/p>\n<p>The depth buffer is a special render target that holds information about the depth values at each pixel. Remember that the value in <code class=\"inline\">gl_Position<\/code> computed in the vertex shader was a screen space value, but it also had a third coordinate, a Z value. This Z value is used to compute the depth which is written to the depth buffer.\u00a0<\/p>\n<p>The purpose of the depth buffer is to draw our scene correctly, without the need to sort objects back to front. Every pixel that is about to be drawn first consults the depth buffer. If its depth value is greater than the value in the buffer, it is drawn, and its own value overwrites the one in the buffer. Otherwise, it is discarded (because it means another object is in front of it). 
<\/p>\n<p>You can actually turn off depth testing to see how things would look without it. You can try this in Water.js:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">material.depthTest = false;<\/pre>\n<p>You&#8217;ll\u00a0see how the water will always be rendered on top, even if it is behind opaque objects. <\/p>\n<h3>Visualizing the Depth Buffer<br \/>\n<\/h3>\n<p>Let&#8217;s add a way to visualize the depth buffer for debugging purposes. Create a new script called <strong>DepthVisualize.js<\/strong>. Attach this to your camera.\u00a0<\/p>\n<p>All we have to do to get access to the depth buffer in PlayCanvas is to say:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">this.entity.camera.camera.requestDepthMap();\n<\/pre>\n<p>This will then automatically inject a uniform into all of our shaders that we can use by declaring it as:<\/p>\n<pre class=\"brush: plain noskimlinks noskimwords\">uniform sampler2D uDepthMap;<\/pre>\n<p>Below is a sample script that requests the depth map and renders it on top of our scene. 
It&#8217;s set up for hot-reloading.\u00a0 <\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">var DepthVisualize = pc.createScript('depthVisualize'); \/\/ initialize code called once per entity\nDepthVisualize.prototype.initialize = function() { this.entity.camera.camera.requestDepthMap(); this.antiCacheCount = 0; \/\/ To prevent the engine from caching our shader so we can live-update it this.SetupDepthViz();\n}; DepthVisualize.prototype.SetupDepthViz = function(){ var device = this.app.graphicsDevice; var chunks = pc.shaderChunks; this.fs = ''; this.fs += 'varying vec2 vUv0;'; this.fs += 'uniform sampler2D uDepthMap;'; this.fs += ''; this.fs += 'float unpackFloat(vec4 rgbaDepth) {'; this.fs += ' const vec4 bitShift = vec4(1.0 \/ (256.0 * 256.0 * 256.0), 1.0 \/ (256.0 * 256.0), 1.0 \/ 256.0, 1.0);'; this.fs += ' float depth = dot(rgbaDepth, bitShift);'; this.fs += ' return depth;'; this.fs += '}'; this.fs += ''; this.fs += 'void main(void) {'; this.fs += ' float depth = unpackFloat(texture2D(uDepthMap, vUv0)) * 30.0; '; this.fs += ' gl_FragColor = vec4(vec3(depth),1.0);'; this.fs += '}'; this.shader = chunks.createShaderFromCode(device, chunks.fullscreenQuadVS, this.fs, \"renderDepth\" + this.antiCacheCount); this.antiCacheCount ++; \/\/ We manually create a draw call to render the depth map on top of everything this.command = new pc.Command(pc.LAYER_FX, pc.BLEND_NONE, function () { pc.drawQuadWithShader(device, null, this.shader); }.bind(this)); this.command.isDepthViz = true; \/\/ Just mark it so we can remove it later this.app.scene.drawCalls.push(this.command);\n}; \/\/ update code called every frame\nDepthVisualize.prototype.update = function(dt) { }; \/\/ swap method called for script hot-reloading\n\/\/ inherit your script state here\nDepthVisualize.prototype.swap = function(old) { this.antiCacheCount = old.antiCacheCount; \/\/ Remove the depth viz draw call for(var i=0;i&lt;this.app.scene.drawCalls.length;i++){ 
if(this.app.scene.drawCalls[i].isDepthViz){ this.app.scene.drawCalls.splice(i,1); break; } } \/\/ Recreate it this.SetupDepthViz();\n}; \/\/ to learn more about script anatomy, please read:\n\/\/ http:\/\/developer.playcanvas.com\/en\/user-manual\/scripting\/<\/pre>\n<p>Try copying that in, and comment\/uncomment the line <code class=\"inline\">this.app.scene.drawCalls.push(this.command);<\/code> to toggle the depth rendering. It should look something like the image below.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Boat and lighthouse scene rendered as a depth map\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Depth_Map.png\"><\/figure>\n<blockquote><p>\n<em>Challenge #3: The water surface is not drawn into the depth buffer. The PlayCanvas engine does this intentionally. Can you figure out why? What&#8217;s special about the water material? To put it another way, based on our depth checking rules, what would happen if the water pixels did write to the depth buffer?<\/em>\n<\/p><\/blockquote>\n<p><em>Hint: There is one line you can change in Water.js that will cause the water to be written to the depth buffer. <\/em><\/p>\n<p>Another thing to notice is that I multiply the depth value by 30 in the embedded shader in the initialize function. This is just to be able to see it clearly, because otherwise the range of values is too small to see as shades of color. 
<\/p>\n<h3>Implementing the Trick<\/h3>\n<p>The PlayCanvas engine includes a bunch of helper functions to work with depth values, but at the time of writing they aren&#8217;t released into production, so we&#8217;re just going to set these up ourselves.<\/p>\n<p>Define the following uniforms to <strong>Water.frag<\/strong>:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">\/\/ These uniforms are all injected automatically by PlayCanvas\nuniform sampler2D uDepthMap;\nuniform vec4 uScreenSize;\nuniform mat4 matrix_view;\n\/\/ We have to set this one up ourselves\nuniform vec4 camera_params;<\/pre>\n<p>Define these helper functions above the main function:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">#ifdef GL2 float linearizeDepth(float z) { z = z * 2.0 - 1.0; return 1.0 \/ (camera_params.z * z + camera_params.w); }\n#else #ifndef UNPACKFLOAT #define UNPACKFLOAT float unpackFloat(vec4 rgbaDepth) { const vec4 bitShift = vec4(1.0 \/ (256.0 * 256.0 * 256.0), 1.0 \/ (256.0 * 256.0), 1.0 \/ 256.0, 1.0); return dot(rgbaDepth, bitShift); } #endif\n#endif float getLinearScreenDepth(vec2 uv) { #ifdef GL2 return linearizeDepth(texture2D(uDepthMap, uv).r) * camera_params.y; #else return unpackFloat(texture2D(uDepthMap, uv)) * camera_params.y; #endif\n} float getLinearDepth(vec3 pos) { return -(matrix_view * vec4(pos, 1.0)).z;\n} float getLinearScreenDepth() { vec2 uv = gl_FragCoord.xy * uScreenSize.zw; return getLinearScreenDepth(uv);\n}<\/pre>\n<p>Pass some information about the camera to the shader in <strong>Water.js<\/strong>. 
Put this where you pass other uniforms like uTime:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">if(!this.camera){ this.camera = this.app.root.findByName(\"Camera\").camera;\n}\nvar camera = this.camera; var n = camera.nearClip;\nvar f = camera.farClip;\nvar camera_params = [ 1\/f, f, (1-f \/ n) \/ 2, (1 + f \/ n) \/ 2\n]; material.setParameter('camera_params', camera_params);<\/pre>\n<p>Finally, we need the world position for each pixel in our frag shader. We need to get this from the vertex shader. So define a varying in <strong>Water.frag<\/strong>:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">varying vec3 WorldPosition;<\/pre>\n<p>Define the same varying in <strong>Water.vert<\/strong>. Then set it to the distorted position in the vertex shader, so the full code would look like:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">attribute vec3 aPosition;\nattribute vec2 aUv0; varying vec2 vUv0;\nvarying vec3 WorldPosition; uniform mat4 matrix_model;\nuniform mat4 matrix_viewProjection; uniform float uTime; void main(void)\n{ vUv0 = aUv0; vec3 pos = aPosition; pos.y += cos(pos.z*5.0+uTime) * 0.1 * sin(pos.x * 5.0 + uTime); gl_Position = matrix_viewProjection * matrix_model * vec4(pos, 1.0); WorldPosition = pos;\n}\n<\/pre>\n<h3>Actually Implementing the Trick<br \/>\n<\/h3>\n<p>Now we&#8217;re finally ready to implement the technique described at the beginning of this section. We want to compare the depth of the pixel we&#8217;re at to the depth of the pixel behind it. The pixel we&#8217;re at comes from the world position, and the pixel behind comes from the screen position. So grab these two depths:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">float worldDepth = getLinearDepth(WorldPosition);\nfloat screenDepth = getLinearScreenDepth();<\/pre>\n<blockquote><p>\n<em>Challenge #4: One of these values will never be greater than the other (assuming depthTest = true). Can you deduce which? 
<\/em>\n<\/p><\/blockquote>\n<p>We know the foam is going to be where the distance between these two values is small. So let&#8217;s render that difference at each pixel. Put this at the bottom of your shader (and make sure the depth visualization script from the previous section is turned off):<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">color = vec4(vec3(screenDepth - worldDepth),1.0);\ngl_FragColor = color;<\/pre>\n<p>It should look something like this:<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"A rendering of the depth difference at each pixel \" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30485\/image\/Depth_Difference.png\"><\/figure>\n<p>This correctly picks out the edges of any object immersed in water in real time! You can of course scale this difference we&#8217;re rendering to make the foam look thicker\/thinner. <\/p>\n<p>There are now a lot of ways in which\u00a0you can combine this output with the water surface color to get nice-looking foam lines. 
You could keep it as a gradient, use it to sample from another texture, or set it to a specific color if the difference is less than or equal to some threshold.<\/p>\n<p>My favorite look is setting it to a color similar to that of the static water lines, so my final main function looks like this:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">void main(void)\n{ vec4 color = vec4(0.0,0.7,1.0,0.5); vec2 pos = vUv0 * 2.0; pos.y += uTime * 0.02; vec4 WaterLines = texture2D(uSurfaceTexture,pos); color.rgba += WaterLines.r * 0.1; float worldDepth = getLinearDepth(WorldPosition); float screenDepth = getLinearScreenDepth(); float foamLine = clamp((screenDepth - worldDepth),0.0,1.0) ; if(foamLine &lt; 0.7){ color.rgba += 0.2; } gl_FragColor = color;\n}<\/pre>\n<h2>Summary<\/h2>\n<p>We created buoyancy on objects floating in the water, we gave our surface a moving texture to simulate caustics, and we saw how we could use the depth buffer to create dynamic foam lines. <\/p>\n<p>To finish this up, the next and final part will introduce post-process effects and how to use them to create the underwater distortion effect. <\/p>\n<h2>Source Code<br \/>\n<\/h2>\n<p>You can find the <a href=\"https:\/\/playcanvas.com\/project\/533435\/overview\/toon-water--tuts-tutorial\" rel=\"external noopener noreferrer\" target=\"_blank\">finished hosted PlayCanvas project here<\/a>. 
A <a href=\"https:\/\/github.com\/OmarShehata\/tutsplus-toon-water\" rel=\"external noopener noreferrer\" target=\"_blank\">Three.js port is also available in this repository<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Welcome back to this three-part series on creating stylized toon water in PlayCanvas\u00a0using vertex shaders.\u00a0In Part 1, we covered setting up our environment and water surface. 
This part will cover applying buoyancy to objects, adding water lines to the surface, and creating the foam lines with the depth buffer around the edges of objects that [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3],"tags":[],"class_list":["post-91275","post","type-post","status-publish","format-standard","hentry","category-tutorials"],"_links":{"self":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/91275","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/comments?post=91275"}],"version-history":[{"count":0,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/91275\/revisions"}],"wp:attachment":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/media?parent=91275"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/categories?post=91275"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/tags?post=91275"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}