{"id":91254,"date":"2018-06-25T13:33:43","date_gmt":"2018-06-25T13:33:43","guid":{"rendered":"http:\/\/www.sickgaming.net\/blog\/2018\/06\/25\/creating-toon-water-for-the-web-part-3\/"},"modified":"2018-06-25T13:33:43","modified_gmt":"2018-06-25T13:33:43","slug":"creating-toon-water-for-the-web-part-3","status":"publish","type":"post","link":"https:\/\/sickgaming.net\/blog\/2018\/06\/25\/creating-toon-water-for-the-web-part-3\/","title":{"rendered":"Creating Toon Water for the Web: Part 3"},"content":{"rendered":"<p>Welcome back to this three-part series on creating stylized toon water in PlayCanvas using vertex shaders.\u00a0In <a href=\"http:\/\/gamedevelopment.tutsplus.com\/tutorials\/creating-toon-water-for-the-web-part-2--cms-30485\" rel=\"external noopener noreferrer\" target=\"_blank\">Part 2<\/a> we covered buoyancy &amp; foam lines. In this final part, we&#8217;re going to apply the underwater distortion as a post-process effect. <\/p>\n<h2>Refraction &amp; Post-Process Effects<\/h2>\n<p>Our goal is to visually communicate the refraction of light through water. We&#8217;ve already covered <a href=\"https:\/\/gamedevelopment.tutsplus.com\/tutorials\/using-displacement-shaders-to-create-an-underwater-effect--cms-27191\" rel=\"external noopener noreferrer\" target=\"_blank\">how to create this sort of distortion in a fragment shader in a previous tutorial<\/a> for a 2D scene. The only difference here is that we&#8217;ll need to figure out which area of the screen is underwater and only apply the distortion there.\u00a0<\/p>\n<h3>Post-Processing<\/h3>\n<p>In general, a post-process effect is anything applied to the whole scene after it is rendered, such as a colored tint or an <a href=\"https:\/\/www.shadertoy.com\/view\/Ms23DR\" rel=\"external noopener noreferrer\" target=\"_blank\">old CRT screen effect<\/a>. 
Instead of rendering your scene directly to the screen, you first render it to a buffer or texture, and then render that to the screen, passing it through a custom shader.<\/p>\n<p>In PlayCanvas, you can set up a post-process effect by creating a new script. Call it <strong>Refraction.js<\/strong>, and copy this template to start with:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">\/\/--------------- POST EFFECT DEFINITION------------------------\/\/\npc.extend(pc, function () {\n    \/\/ Constructor - Creates an instance of our post effect\n    var RefractionPostEffect = function (graphicsDevice, vs, fs) {\n        var fragmentShader = \"precision \" + graphicsDevice.precision + \" float;\\n\";\n        fragmentShader = fragmentShader + fs;\n        \/\/ this is the shader definition for our effect\n        this.shader = new pc.Shader(graphicsDevice, {\n            attributes: { aPosition: pc.SEMANTIC_POSITION },\n            vshader: vs,\n            fshader: fragmentShader\n        });\n    };\n\n    \/\/ Our effect must derive from pc.PostEffect\n    RefractionPostEffect = pc.inherits(RefractionPostEffect, pc.PostEffect);\n\n    RefractionPostEffect.prototype = pc.extend(RefractionPostEffect.prototype, {\n        \/\/ Every post effect must implement the render method which\n        \/\/ sets any parameters that the shader might require and\n        \/\/ also renders the effect on the screen\n        render: function (inputTarget, outputTarget, rect) {\n            var device = this.device;\n            var scope = device.scope;\n            \/\/ Set the input render target to the shader. This is the image rendered from our camera\n            scope.resolve(\"uColorBuffer\").setValue(inputTarget.colorBuffer);\n            \/\/ Draw a full screen quad on the output target. In this case the output target is the screen.\n            \/\/ Drawing a full screen quad will run the shader that we defined above\n            pc.drawFullscreenQuad(device, outputTarget, this.vertexBuffer, this.shader, rect);\n        }\n    });\n\n    return { RefractionPostEffect: RefractionPostEffect };\n}());\n\n\/\/--------------- SCRIPT DEFINITION------------------------\/\/\nvar Refraction = pc.createScript('refraction');\n\nRefraction.attributes.add('vs', {\n    type: 'asset',\n    assetType: 'shader',\n    title: 'Vertex Shader'\n});\n\nRefraction.attributes.add('fs', {\n    type: 'asset',\n    assetType: 'shader',\n    title: 'Fragment Shader'\n});\n\n\/\/ initialize code called once per entity\nRefraction.prototype.initialize = function() {\n    var effect = new pc.RefractionPostEffect(this.app.graphicsDevice, this.vs.resource, this.fs.resource);\n\n    \/\/ add the effect to the camera's postEffects queue\n    var queue = this.entity.camera.postEffects;\n    queue.addEffect(effect);\n    this.effect = effect;\n\n    \/\/ Save the current shaders for hot reload\n    this.savedVS = this.vs.resource;\n    this.savedFS = this.fs.resource;\n};\n\nRefraction.prototype.update = function(){\n    if(this.savedFS != this.fs.resource || this.savedVS != this.vs.resource){\n        this.swap(this);\n    }\n};\n\nRefraction.prototype.swap = function(old){\n    this.entity.camera.postEffects.removeEffect(old.effect);\n    this.initialize();\n};<\/pre>\n<p>This is just like a normal script, but we define a <code class=\"inline\">RefractionPostEffect<\/code> class that can be applied to the camera. This needs a vertex and a fragment shader to render. 
The attributes are already set up, so let&#8217;s create <strong>Refraction.frag<\/strong> with this content:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">precision highp float;\n\nuniform sampler2D uColorBuffer;\nvarying vec2 vUv0;\n\nvoid main() {\n    vec4 color = texture2D(uColorBuffer, vUv0);\n    gl_FragColor = color;\n}\n<\/pre>\n<p>And <strong>Refraction.vert<\/strong> with a basic vertex shader:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">attribute vec2 aPosition;\nvarying vec2 vUv0;\n\nvoid main(void)\n{\n    gl_Position = vec4(aPosition, 0.0, 1.0);\n    vUv0 = (aPosition.xy + 1.0) * 0.5;\n}\n<\/pre>\n<p>Now attach the <strong>Refraction.js<\/strong> script to the camera, and assign the shaders to the appropriate attributes. When you launch the game, you should see the scene exactly as it was before. This is a blank post effect that simply re-renders the scene. To verify that this is working, try giving the scene a red tint. <\/p>\n<p>In Refraction.frag, instead of simply returning the color, try setting the red component to 1.0, which should look like the image below.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Scene rendered with a red tint \" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30487\/image\/Red_Tint.png\"><\/figure>\n<h3>Distortion Shader<\/h3>\n<p>We need to add a time uniform for the animated distortion, so go ahead and create one in Refraction.js, inside the constructor for the post effect:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">var RefractionPostEffect = function (graphicsDevice, vs, fs) {\n    var fragmentShader = \"precision \" + graphicsDevice.precision + \" float;\\n\";\n    fragmentShader = fragmentShader + fs;\n    \/\/ this is the shader definition for our effect\n    this.shader = new pc.Shader(graphicsDevice, {\n        attributes: { aPosition: pc.SEMANTIC_POSITION },\n        vshader: vs,\n        fshader: fragmentShader\n    });\n    \/\/ &gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt; Initialize the time here\n    this.time = 0; 
};<\/pre>\n<p>Now, inside the render function, we pass it to the shader and increment it:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">RefractionPostEffect.prototype = pc.extend(RefractionPostEffect.prototype, {\n    \/\/ Every post effect must implement the render method which\n    \/\/ sets any parameters that the shader might require and\n    \/\/ also renders the effect on the screen\n    render: function (inputTarget, outputTarget, rect) {\n        var device = this.device;\n        var scope = device.scope;\n        \/\/ Set the input render target to the shader. This is the image rendered from our camera\n        scope.resolve(\"uColorBuffer\").setValue(inputTarget.colorBuffer);\n        \/\/\/ &gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt;&gt; Pass the time uniform here\n        scope.resolve(\"uTime\").setValue(this.time);\n        this.time += 0.1;\n        \/\/ Draw a full screen quad on the output target. In this case the output target is the screen.\n        \/\/ Drawing a full screen quad will run the shader that we defined above\n        pc.drawFullscreenQuad(device, outputTarget, this.vertexBuffer, this.shader, rect);\n    }\n});<\/pre>\n<p>Now we can use the same shader code from the water distortion tutorial, making our full fragment shader look like this:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">precision highp float;\n\nuniform sampler2D uColorBuffer;\nuniform float uTime;\n\nvarying vec2 vUv0;\n\nvoid main() {\n    vec2 pos = vUv0;\n    float X = pos.x * 15.0 + uTime * 0.5;\n    float Y = pos.y * 15.0 + uTime * 0.5;\n    pos.y += cos(X + Y) * 0.01 * cos(Y);\n    pos.x += sin(X - Y) * 0.01 * sin(Y);\n    vec4 color = texture2D(uColorBuffer, pos);\n    gl_FragColor = color;\n}\n<\/pre>\n<p>If it all worked out, everything should now look as if it&#8217;s underwater, as below.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Underwater distortion applied to the whole scene \" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30487\/image\/Screen_Distortion_Loop_Optimized.gif\"><\/figure>\n<blockquote><p>\n<em>Challenge #1: Make the 
distortion only apply to the bottom half of the screen. <\/em>\n<\/p><\/blockquote>\n<h3>Camera Masks<\/h3>\n<p>We&#8217;re almost there. All we need to do now is to apply this distortion effect just on the underwater part of the screen. The most straightforward way I&#8217;ve found to do this is to re-render the scene with the water surface drawn as solid white, as shown below.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Water surface rendered as a solid white to act as a mask\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30487\/image\/Water_Mask.png\"><\/figure>\n<p>This would be rendered to a texture that would act as a mask. We would then pass this texture to our refraction shader, which would only distort a pixel in the final image if the corresponding pixel in the mask is white. <\/p>\n<p>Let&#8217;s add a boolean attribute to the water surface so it knows whether it&#8217;s being used as a mask. Add this to Water.js:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">Water.attributes.add('isMask', { type: 'boolean', title: \"Is Mask?\" });<\/pre>\n<p>We can then pass it to the shader with <code class=\"inline\">material.setParameter('isMask', this.isMask);<\/code> as usual. Then declare it in Water.frag and set the color to white if it&#8217;s true.<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">\/\/ Declare the new uniform at the top\nuniform bool isMask;\n\n\/\/ At the end of the main function, override the color to be white\n\/\/ if the mask is true\nif(isMask){\n    color = vec4(1.0);\n}<\/pre>\n<p>Confirm that this works by toggling the &#8220;Is Mask?&#8221; property in the editor and relaunching the game. It should look white, as in the earlier image. <\/p>\n<p>Now, to re-render the scene, we need a second camera. Create a new camera in the editor and call it <strong>CameraMask<\/strong>. Duplicate the Water entity in the editor as well, and call it <strong>WaterMask<\/strong>. 
Make sure &#8220;Is Mask?&#8221; is false for the Water entity but true for the WaterMask. <\/p>\n<p>To tell the new camera to render to a texture instead of the screen, create a new script called <strong>CameraMask.js<\/strong> and attach it to the new camera. We create a <a href=\"https:\/\/developer.playcanvas.com\/en\/api\/pc.RenderTarget.html\" rel=\"external noopener noreferrer\" target=\"_blank\">RenderTarget<\/a> to capture this camera&#8217;s output like this:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">\/\/ initialize code called once per entity\nCameraMask.prototype.initialize = function() {\n    \/\/ Create a 512x512x24-bit render target with a depth buffer\n    var colorBuffer = new pc.Texture(this.app.graphicsDevice, {\n        width: 512,\n        height: 512,\n        format: pc.PIXELFORMAT_R8_G8_B8,\n        autoMipmap: true\n    });\n    colorBuffer.minFilter = pc.FILTER_LINEAR;\n    colorBuffer.magFilter = pc.FILTER_LINEAR;\n    var renderTarget = new pc.RenderTarget(this.app.graphicsDevice, colorBuffer, {\n        depth: true\n    });\n    this.entity.camera.renderTarget = renderTarget;\n};<\/pre>\n<p>Now, if you launch, you&#8217;ll see this camera is no longer rendering to the screen. We can grab the output of its render target in <strong>Refraction.js<\/strong> like this:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">Refraction.prototype.initialize = function() {\n    var cameraMask = this.app.root.findByName('CameraMask');\n    var maskBuffer = cameraMask.camera.renderTarget.colorBuffer;\n\n    var effect = new pc.RefractionPostEffect(this.app.graphicsDevice, this.vs.resource, this.fs.resource, maskBuffer);\n\n    \/\/ ...\n    \/\/ The rest of this function is the same as before\n};<\/pre>\n<p>Notice that I pass this mask texture as an argument to the post effect constructor. 
We need to create a reference to it in our constructor, so it looks like this:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">\/\/\/\/ Added an extra argument on the line below\nvar RefractionPostEffect = function (graphicsDevice, vs, fs, buffer) {\n    var fragmentShader = \"precision \" + graphicsDevice.precision + \" float;\\n\";\n    fragmentShader = fragmentShader + fs;\n    \/\/ this is the shader definition for our effect\n    this.shader = new pc.Shader(graphicsDevice, {\n        attributes: { aPosition: pc.SEMANTIC_POSITION },\n        vshader: vs,\n        fshader: fragmentShader\n    });\n    this.time = 0;\n    \/\/\/\/ &lt;&lt;&lt;&lt;&lt;&lt;&lt;&lt;&lt;&lt;&lt;&lt;&lt; Saving the buffer here\n    this.buffer = buffer;\n};<\/pre>\n<p>Finally, in the render function, pass the buffer to our shader with:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">scope.resolve(\"uMaskBuffer\").setValue(this.buffer); <\/pre>\n<p>As for verifying that this all works, I&#8217;ll leave that as a challenge. <\/p>\n<blockquote><p>Challenge #2: Render the uMaskBuffer to the screen to confirm it is the output of the second camera. \n<\/p><\/blockquote>\n<p>One thing to be aware of is that the render target is set up in the initialize of CameraMask.js, and it needs to be ready by the time Refraction.js is called. If the scripts run the other way around, you&#8217;ll get an error. 
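If you want a clearer failure mode than a crash, you could guard the lookup before using it. This is a sketch of a hypothetical helper (the name `getMaskBuffer` is mine, not part of the tutorial or the PlayCanvas API), assuming the same entity/camera/renderTarget chain used above:

```javascript
// Hypothetical helper (not in the tutorial): return the mask camera's color
// buffer if its render target is ready, or null so the caller can bail out
// with a readable warning instead of a crash.
function getMaskBuffer(cameraMaskEntity) {
    if (!cameraMaskEntity || !cameraMaskEntity.camera || !cameraMaskEntity.camera.renderTarget) {
        console.warn('CameraMask render target not ready -- check the entity order in the editor');
        return null;
    }
    return cameraMaskEntity.camera.renderTarget.colorBuffer;
}

// Usage sketch inside Refraction.prototype.initialize:
// var maskBuffer = getMaskBuffer(this.app.root.findByName('CameraMask'));
// if (!maskBuffer) return; // CameraMask.js hasn't run yet
```

This only changes how the error surfaces; the real fix is still the entity ordering described next.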
To make sure they run in the right order, drag the CameraMask to the top of the entity list in the editor, as shown below.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"PlayCanvas editor with CameraMask at top of entity list\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30487\/image\/Entity_Order.png\"><\/figure>\n<p>The second camera should always be looking at the same view as the original one, so let&#8217;s make it always follow its position and rotation in the update of CameraMask.js:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">CameraMask.prototype.update = function(dt) {\n    var pos = this.CameraToFollow.getPosition();\n    var rot = this.CameraToFollow.getRotation();\n    this.entity.setPosition(pos.x, pos.y, pos.z);\n    this.entity.setRotation(rot);\n};<\/pre>\n<p>And define <code class=\"inline\">CameraToFollow<\/code> in the initialize:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">this.CameraToFollow = this.app.root.findByName('Camera');<\/pre>\n<h3>Culling Masks<\/h3>\n<p>Both cameras are currently rendering the same thing. We want the mask camera to render everything except the real water, and we want the real camera to render everything except the mask water.<\/p>\n<p>To do this, we can use the camera&#8217;s culling bit mask. This works similarly to <a href=\"https:\/\/www.aurelienribon.com\/post\/2011-07-box2d-tutorial-collision-filtering\/\" rel=\"external noopener noreferrer\" target=\"_blank\">collision masks<\/a> if you&#8217;ve ever used those. An object will be rendered only if the result of a bitwise <code class=\"inline\">AND<\/code> between its mask and the camera&#8217;s culling mask is non-zero; if the result is 0, the object is culled (not rendered). <\/p>\n<p>Let&#8217;s say the Water will have bit 2 set, and WaterMask will have bit 3. Then the real camera needs to have all bits set except for 3, and the mask camera needs to have all bits set except for 2. 
An easy way to say &#8220;all bits except N&#8221; is to do:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">~(1 &lt;&lt; N) &gt;&gt;&gt; 0<\/pre>\n<p>You can <a href=\"https:\/\/developer.mozilla.org\/en-US\/docs\/Web\/JavaScript\/Reference\/Operators\/Bitwise_Operators\" rel=\"external noopener noreferrer\" target=\"_blank\">read more about bitwise operators here<\/a>. <\/p>\n<p>To set up the camera culling masks, we can put this inside <strong>CameraMask.js<\/strong>&#8217;s initialize at the bottom:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">\/\/ Set all bits except for 2\nthis.entity.camera.camera.cullingMask &amp;= ~(1 &lt;&lt; 2) &gt;&gt;&gt; 0;\n\n\/\/ Set all bits except for 3\nthis.CameraToFollow.camera.camera.cullingMask &amp;= ~(1 &lt;&lt; 3) &gt;&gt;&gt; 0;\n\n\/\/ If you want to print out this bit mask, try:\n\/\/ console.log((this.CameraToFollow.camera.camera.cullingMask &gt;&gt;&gt; 0).toString(2));<\/pre>\n<p>Now, in Water.js, set the Water mesh&#8217;s mask on bit 2, and the mask version of it on bit 3:<\/p>\n<pre class=\"brush: javascript noskimlinks noskimwords\">\/\/ Put this at the bottom of the initialize of Water.js\n\/\/ Set the culling masks\nvar bit = this.isMask ? 3 : 2;\nmeshInstance.mask = 0;\nmeshInstance.mask |= (1 &lt;&lt; bit);<\/pre>\n<p>Now, one view will have the normal water, and the other will have the solid white water. The left half of the image below is the view from the original camera, and the right half is from the mask camera.<\/p>\n<figure class=\"post_image\"><img decoding=\"async\" alt=\"Split view of mask camera and original camera\" src=\"https:\/\/cms-assets.tutsplus.com\/uploads\/users\/728\/posts\/30487\/image\/Mask_View.png\"><\/figure>\n<h3>Applying the Mask<\/h3>\n<p>One final step now! We know the areas underwater are marked with white pixels. 
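As an aside, the culling-mask arithmetic from the previous section can be sanity-checked in plain JavaScript, outside the engine. The helper name `allBitsExcept` is mine, not part of the tutorial or the PlayCanvas API; the bit numbers follow the convention above:

```javascript
// A mesh is drawn only when (camera.cullingMask & meshInstance.mask) is non-zero.
function allBitsExcept(n) {
    return ~(1 << n) >>> 0;
}

var WATER_BIT = 2; // real water mesh
var MASK_BIT = 3;  // solid-white mask mesh

var realCameraMask = allBitsExcept(MASK_BIT);  // real camera ignores the mask water
var maskCameraMask = allBitsExcept(WATER_BIT); // mask camera ignores the real water

console.log((realCameraMask & (1 << WATER_BIT)) !== 0); // true: real camera draws the water
console.log((realCameraMask & (1 << MASK_BIT)) !== 0);  // false: real camera culls the mask
console.log((maskCameraMask & (1 << MASK_BIT)) !== 0);  // true: mask camera draws the mask
```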
We just need to check whether we&#8217;re at a white pixel, and if not, turn off the distortion in <strong>Refraction.frag<\/strong>:<\/p>\n<pre class=\"brush: c noskimlinks noskimwords\">\/\/ Declare the mask buffer at the top of the shader\nuniform sampler2D uMaskBuffer;\n\n\/\/ Check original position as well as new distorted position\nvec4 maskColor = texture2D(uMaskBuffer, pos);\nvec4 maskColor2 = texture2D(uMaskBuffer, vUv0);\n\n\/\/ We're not at a white pixel?\nif(maskColor != vec4(1.0) || maskColor2 != vec4(1.0)){\n    \/\/ Return it back to the original position\n    pos = vUv0;\n}<\/pre>\n<p>And that should do it! <\/p>\n<p><em>One thing to note is that since the texture for the mask is initialized on launch, if you resize the window at runtime, it will no longer match the size of the screen. <\/em><\/p>\n<h3>Anti-Aliasing<\/h3>\n<p>As an optional clean-up step, you might have noticed that edges in the scene now look a little sharp. This is because when we applied our post effect, we lost anti-aliasing.\u00a0<\/p>\n<p>We can apply an additional anti-alias on top of our effect as another post effect. Luckily, there&#8217;s one available in the <a href=\"https:\/\/store.playcanvas.com\" rel=\"external noopener noreferrer\" target=\"_blank\">PlayCanvas store<\/a> we can just use. Go to the <a href=\"https:\/\/store.playcanvas.com\/item\/57\/fxaa-post-effect\" rel=\"external noopener noreferrer\" target=\"_blank\">script asset page<\/a>, click the big green download button, and choose your project from the list that appears. The script will appear in the root of your asset window as <strong>posteffect-fxaa.js<\/strong>. Just attach this to the Camera entity, and your scene should look a little nicer!\u00a0 <\/p>\n<h2>Final Thoughts<\/h2>\n<p>If you&#8217;ve made it this far, give yourself a pat on the back! We covered a lot of techniques in this series. 
You should now be comfortable with vertex shaders, rendering to textures, applying post-processing effects, selectively culling objects, using the depth buffer, and working with blending and transparency. Even though we implemented all of this in PlayCanvas, these are general graphics concepts you&#8217;ll find in some form on whatever platform you end up using.<\/p>\n<p>All these techniques are also applicable to a variety of other effects. One particularly interesting application of vertex shaders I&#8217;ve found is in <a href=\"https:\/\/www.youtube.com\/watch?v=l9NX06mvp2E\" rel=\"external noopener noreferrer\" target=\"_blank\">this talk on the art of Abzu<\/a>,\u00a0where they explain how they used vertex shaders to efficiently animate tens of thousands of fish on screen. <\/p>\n<p>You should now also have a nice water effect you can apply to your games! You could easily customize it now that you&#8217;ve put together every detail yourself. There&#8217;s still a lot more you can do with water (I haven&#8217;t even touched on any sort of reflection). Below are a couple of ideas.<\/p>\n<h4>Noise-Based Waves<\/h4>\n<p>Instead of simply animating the waves with a combination of sines and cosines, you can sample a noise texture to make the waves look a bit more natural and unpredictable. <\/p>\n<h4>Dynamic Foam Trails<\/h4>\n<p>Instead of completely static water lines on the surface, you could draw onto that texture when objects move, to create a dynamic foam trail. There are a lot of ways to go about doing this, so this could be its own project. <\/p>\n<h2>Source Code<\/h2>\n<p>You can find the <a href=\"https:\/\/playcanvas.com\/project\/533435\/overview\/toon-water--tuts-tutorial\" rel=\"external noopener noreferrer\" target=\"_blank\">finished hosted PlayCanvas project here<\/a>. 
A <a href=\"https:\/\/github.com\/OmarShehata\/tutsplus-toon-water\" rel=\"external noopener noreferrer\" target=\"_blank\">Three.js port is also available in this repository<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Welcome back to this three-part series on creating stylized toon water in PlayCanvas using vertex shaders.\u00a0In Part 2 we covered buoyancy &amp; foam lines. In this final part, we&#8217;re going to apply the underwater distortion as a post-process effect. Refraction &amp; Post-Process Effects Our goal is to visually communicate the refraction of light through water. 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3],"tags":[],"class_list":["post-91254","post","type-post","status-publish","format-standard","hentry","category-tutorials"],"_links":{"self":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/91254","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/comments?post=91254"}],"version-history":[{"count":0,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/91254\/revisions"}],"wp:attachment":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/media?parent=91254"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/categories?post=91254"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/tags?post=91254"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}