Demystifying Shaders: Unlocking the Magic Behind Stunning Graphics

Have you ever wondered how video games and animations achieve their breathtaking visuals? The answer lies in shaders, the unsung heroes of computer graphics programming. In this post, we'll delve into the world of shaders, demystify their purpose, and explore the incredible effects they bring to life in virtual worlds.

What are Shaders?

At their core, shaders are small programs that run on the GPU (Graphics Processing Unit) and determine the visual appearance of objects on the screen. They control everything from colors and textures to lighting and shadows. Shaders come in different types, including vertex shaders, fragment shaders, geometry shaders, and compute shaders. Each type serves a specific purpose in the graphics pipeline.

Vertex Shaders

Vertex shaders operate on individual vertices of 3D objects and handle transformations like position, rotation, and scaling. They lay the foundation for object movement and positioning in a virtual scene. Additionally, vertex shaders can calculate lighting effects on a per-vertex basis, adding realism and depth to the visuals.

Fragment Shaders

Fragment shaders, sometimes called pixel shaders, operate on the individual fragments (candidate pixels) of the rendered image. They determine the final color of each pixel, taking into account lighting conditions, textures, material properties, and fine surface detail. With fragment shaders, developers can create mesmerizing effects such as shadows, reflections, refractions, and intricate surface textures.

Geometry Shaders

Geometry shaders process entire primitive shapes, such as triangles, rather than individual vertices or pixels. They have the power to create new geometry or modify existing geometry, enabling advanced techniques like tessellation, particle systems, and procedural generation. Geometry shaders provide an extra layer of flexibility and creativity in graphics programming.

Compute Shaders

Compute shaders are a relatively new addition to the world of shaders. They are designed for general-purpose computations on the GPU, allowing developers to perform complex calculations and simulations. Compute shaders find applications in physics simulations, AI algorithms, image processing, and other computationally intensive tasks.

The Impact of Shaders

Shaders play a pivotal role in transforming simple 3D models into stunning visual experiences. They empower developers to create realistic lighting effects, simulate intricate material properties, and craft immersive virtual environments. Shaders contribute to the overall atmosphere, mood, and realism that captivate players and viewers in video games, movies, and animations.

Embark on a Journey of Shader Creation

In the upcoming section, we'll dive into shader creation, starting with simple shaders and gradually working up to more complex ones. Get ready to unleash your creativity as we guide you through each step and witness firsthand the power and artistry behind shader development.

To embark on this creative journey of shader development, we'll need a few essential tools at our disposal: React, Three.js, React Three Fiber, Drei, glslify, and Leva.

npx create-react-app simple-shader

cd simple-shader

npm install three @react-three/fiber
npm install @react-three/drei
npm install leva
npm install glslify

A Simple Shader

🧠
GitHub repository with the full code implementation.

The code in the repository linked above defines a React component called SimpleShader that uses the @react-three/fiber, @react-three/drei, and leva libraries to create a simple shader effect. Let's break down the code step by step:

Import Statements:

  • The Canvas component is imported from the @react-three/fiber library. It provides the WebGL rendering context for the 3D scene.
  • The Plane and OrbitControls components are imported from @react-three/drei. They represent a 3D plane geometry and a control component for interactive camera movement, respectively.
  • The useControls function is imported from the leva library. It allows for easy creation and management of GUI controls to adjust shader parameters.
  • The THREE object is imported from the three library, which provides the necessary utilities and classes for working with 3D graphics.

Shader Definition:

  • The vertexShader and fragmentShader are imported from separate files. These contain the GLSL code that defines the vertex and fragment shaders, respectively. They control how the geometry and pixels are rendered and manipulated.
  • A useRef hook is imported from React, which is later used to store references to certain values that need to persist across component renders.

Shader Uniforms:

  • The uniforms object defines the shader uniform variables. In this case, it includes:
  • u_colorA and u_colorB: Vector3 uniforms representing two colors used in the shader.
  • u_resolution: Vector2 uniform representing the screen resolution.
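As a sketch, the uniforms object might look like this. The color and resolution values here are placeholders; the real component builds them with THREE.Vector3 and THREE.Vector2 from the three library:

```javascript
// Sketch of the shader uniforms described above. Minimal Vector stand-ins
// are defined here so the sketch runs standalone; the real component uses
// THREE.Vector3 and THREE.Vector2 from the `three` package.
class Vector3 {
  constructor(x, y, z) { this.x = x; this.y = y; this.z = z; }
}
class Vector2 {
  constructor(x, y) { this.x = x; this.y = y; }
}

const uniforms = {
  u_colorA: { value: new Vector3(0.0, 0.5, 1.0) },  // placeholder color
  u_colorB: { value: new Vector3(1.0, 0.0, 0.5) },  // placeholder color
  u_resolution: { value: new Vector2(800, 600) },   // placeholder resolution
};
```

In the browser, u_resolution would typically be initialized from window.innerWidth and window.innerHeight so the shader knows the actual canvas size.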

useControls:

  • The useControls hook is used to create GUI controls that allow interactive adjustments of shader parameters.
  • Two controls are created: one named value representing u_colorA, and another named end representing u_colorB. Each control has an onChange callback that updates the corresponding uniform value when the control is adjusted.

JSX Return:

  • The Canvas component is rendered with additional props, including dpr for device pixel ratio and camera for specifying the camera position.
  • Within the Canvas, a Plane component is rendered, representing a flat rectangular surface in the 3D scene.
  • The Plane is given a shaderMaterial component as its material, which takes the uniforms, vertexShader, and fragmentShader as props. It also sets the side prop to THREE.DoubleSide to render both sides of the plane.
  • The OrbitControls component allows for interactive camera movement in the scene.

Overall, this code demonstrates how to create a basic shader effect using React, @react-three/fiber, and @react-three/drei libraries. The shader is applied to a plane geometry, and the GUI controls provided by leva enable real-time adjustments of the shader's color parameters.

The vertexShader

The vertexShader.js file imports the glslify library, which allows GLSL code to be embedded within JavaScript, and defines the vertex shader. The shader declares a varying variable, transforms vertex positions using the built-in matrices, assigns the texture coordinates to the vUv variable, and writes the transformed position to gl_Position. The shader string is exported as the default export of the module.

Vertex Shader Definition:

  • Inside the shader code, there is a varying variable declaration varying vec2 vUv;. The vUv variable represents the texture coordinates of the vertex and is used to pass data from the vertex shader to the fragment shader.
  • The main() function is the entry point of the vertex shader.
  • Within main(), the vUv variable is assigned the value of the vertex's texture coordinates uv.
  • The modelMatrix, viewMatrix, and projectionMatrix are built-in uniform matrices that transform the vertex position from model space to clip space.
  • The vertex position is transformed by multiplying it with these matrices in the order projectionMatrix * viewMatrix * modelMatrix * position, yielding the final position in clip space.
  • The transformed position is assigned to gl_Position, a built-in output variable that represents the vertex position in clip space.
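Putting those steps together, the vertex shader described above can be sketched as a plain GLSL string (the actual file wraps the string in the glslify tag before default-exporting it):

```javascript
// vertexShader.js (sketch): the GLSL source as a template string.
const vertexShader = /* glsl */ `
  varying vec2 vUv;

  void main() {
    // Pass the texture coordinates through to the fragment shader
    vUv = uv;
    // Transform the vertex from model space to clip space
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
  }
`;
```

In the component, this string is passed to shaderMaterial as its vertexShader prop.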

The fragmentShader

The contents of the fragmentShader.js file consist of an import statement for the glslify library and the definition of a fragment shader. The glslify library enables the use of GLSL code within JavaScript.

Within the fragment shader code:

  1. The varying variable vUv represents the texture coordinates passed from the vertex shader.
  2. The uniform variables u_colorA, u_colorB, and u_resolution are declared. These uniforms allow for dynamic color manipulation and specify the resolution of the screen.
  3. The main() function serves as the entry point for the fragment shader.
  4. Within main(), gl_FragCoord.xy represents the current pixel's coordinates on the screen. The normalizedPixel variable is calculated by dividing gl_FragCoord.xy by u_resolution.x, resulting in a normalized value between 0 and 1.
  5. The mix() function is used to interpolate between u_colorA and u_colorB based on the normalizedPixel.x value. This creates a smooth color transition effect.
  6. The resulting color is assigned to the gl_FragColor, which represents the final color output of the fragment shader.
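Steps 4 and 5 are plain arithmetic; here is the same normalization and interpolation written in JavaScript, since GLSL's mix() is just a linear interpolation:

```javascript
// GLSL's mix(a, b, t) is a linear interpolation: a * (1 - t) + b * t
function mix(a, b, t) {
  return a * (1 - t) + b * t;
}

// A pixel at x = 400 on an 800-pixel-wide screen normalizes to 0.5...
const normalizedX = 400 / 800;

// ...so its color lands exactly halfway between the two endpoint values.
const blended = mix(0.2, 0.8, normalizedX); // 0.5
```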

The code concludes with the default export of the fragmentShader variable, containing the GLSL code.

In summary, the fragmentShader.js file imports the glslify library and defines a fragment shader that calculates a normalized pixel value, performs color interpolation between u_colorA and u_colorB, and outputs the final color through gl_FragColor.
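Based on that description, the fragment shader can be sketched as a plain GLSL string (again, the actual file wraps it in the glslify tag and default-exports it):

```javascript
// fragmentShader.js (sketch): a horizontal gradient between two uniform colors.
const fragmentShader = /* glsl */ `
  varying vec2 vUv;

  uniform vec3 u_colorA;
  uniform vec3 u_colorB;
  uniform vec2 u_resolution;

  void main() {
    // Normalize the pixel coordinates to roughly the 0..1 range
    vec2 normalizedPixel = gl_FragCoord.xy / u_resolution.x;
    // Blend between the two colors across the screen
    vec3 color = mix(u_colorA, u_colorB, normalizedPixel.x);
    gl_FragColor = vec4(color, 1.0);
  }
`;
```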

A Psychedelic Shader

This psychedelic fragment shader creates mesmerizing patterns and vibrant color variations based on the texture coordinates and the elapsed time. Feel free to experiment and modify the code to achieve your desired psychedelic effects!

🧠
GitHub repository with the full code implementation.

The main differences between the PsychedelicShader and the SimpleShader components are as follows:

Shader Effect:

  • The SimpleShader component uses a simple shader effect that manipulates colors and transitions between two user-defined colors.
  • The PsychedelicShader component uses a more complex shader effect that creates psychedelic patterns and vibrant color variations based on mathematical calculations.

Shader Uniforms:

  • In the SimpleShader, there are three uniform variables: u_colorA, u_colorB, and u_resolution. These control the colors and screen resolution in the shader.
  • In the PsychedelicShader, there is a single uniform variable: u_time. This uniform represents the elapsed time and allows for time-based animations in the shader.

Custom Component:

  • In the SimpleShader, there is no custom component defined. The shader is directly applied to the Plane component.
  • In the PsychedelicShader, a custom component named MyCustomPlane is created. This component encapsulates the Plane component and manages the shader material, uniform updates, and animation using the useFrame hook.

The fragmentShader

This fragment shader creates a psychedelic effect by distorting the coordinates and applying sine functions to create circular patterns. The resulting color is based on the sine of the calculated circles value.

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
	// Normalized pixel coordinates, ranging from 0 to 1
	vec2 uv = fragCoord.xy / u_resolution.xy;
	// Shift the origin to the center of the screen
	vec2 pos = (uv.xy - 0.5);
	// Squared distance from the center, distorted by time-driven sine waves
	vec2 cir = ((pos.xy*pos.xy+sin(uv.x*18.0+u_time)/25.0*sin(uv.y*7.0+u_time*1.5)/1.0)+uv.x*sin(u_time)/16.0+uv.y*sin(u_time*1.2)/16.0);
	// Magnitude of the circular pattern
	float circles = (sqrt(abs(cir.x+cir.y*0.5)*25.0)*5.0);
	// Map the pattern through out-of-phase sine functions to get vibrant RGB
	fragColor = vec4(sin(circles*1.25+2.0),abs(sin(circles*1.0-1.0)-sin(circles)),abs(sin(circles)*1.0),1.0);
}
  • uv represents the normalized pixel coordinates (fragCoord.xy) divided by the resolution (u_resolution.xy). This ensures that uv ranges from 0 to 1.
  • pos is obtained by subtracting 0.5 from uv.xy, effectively shifting the coordinate system to the center. This means that pos now ranges from -0.5 to 0.5.
  • cir combines pos.xy with various calculations to create circular patterns and distortions:
  • (pos.xy * pos.xy) creates a squared pattern where the distance from the center determines the magnitude.
  • sin(uv.x * 18.0 + u_time) / 25.0 introduces sinusoidal variations based on uv.x and u_time.
  • sin(uv.y * 7.0 + u_time * 1.5) / 1.0 introduces sinusoidal variations based on uv.y and u_time.
  • uv.x * sin(u_time) / 16.0 and uv.y * sin(u_time * 1.2) / 16.0 further introduce sinusoidal variations based on uv.x and uv.y in relation to u_time.
  • circles calculates the magnitude of the circular patterns by taking the square root of the absolute sum of cir.x and half of cir.y, multiplied by constants.
  • fragColor is assigned based on the circles value:
  • sin(circles * 1.25 + 2.0) determines the red component based on the sine of circles.
  • abs(sin(circles * 1.0 - 1.0) - sin(circles)) determines the green component as the absolute difference between two sine variations.
  • abs(sin(circles) * 1.0) determines the blue component as the absolute value of the sine of circles.
  • The alpha component is set to 1.0, indicating full opacity.

Together, these calculations create a psychedelic effect by manipulating the pixel coordinates and applying sinusoidal variations. The resulting fragColor exhibits vibrant and dynamic patterns. Feel free to modify the values and experiment to achieve different visual effects!
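To see the color mapping in isolation, here is that RGB calculation ported to plain JavaScript, a small sketch you can use to experiment with the channel phases:

```javascript
// The psychedelic RGB mapping from the shader above, ported to JavaScript.
function psychedelicColor(circles) {
  const r = Math.sin(circles * 1.25 + 2.0);                              // red
  const g = Math.abs(Math.sin(circles * 1.0 - 1.0) - Math.sin(circles)); // green
  const b = Math.abs(Math.sin(circles) * 1.0);                           // blue
  return [r, g, b, 1.0];                                                 // alpha = opaque
}

// Sweeping `circles` shows how the channels oscillate out of phase.
const rgba = psychedelicColor(3.0);
```

Note that the red channel can go negative here; on the GPU, the value written to gl_FragColor is clamped to the 0..1 range for standard framebuffers.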

A Water Shader

The Water Shader is a captivating visual effect that simulates the appearance of flowing water within a 3D scene. By utilizing advanced shaders, it creates the illusion of realistic waves and ripples, producing an immersive and dynamic water surface.

🧠
GitHub repository with the full code implementation.

The main differences between the WaterShader and the previous shaders are as follows:

Custom Geometry:

  • The WaterShader component uses a custom geometry in the form of a cylinder. It creates a cylinder using the <cylinderGeometry> component from Three.js.
  • In contrast, the previous shaders used a <Plane> component to create a flat rectangular surface.

Mesh and Ref:

  • In the MyCustomPlane component of the WaterShader, a mesh reference is created using the useRef hook. This reference is used to access the material of the mesh.
  • In the previous shaders, the material reference was used to access the shader material of the <Plane> component.

Updating Uniforms:

  • In the useFrame hook of the MyCustomPlane component in the WaterShader, the clock.getElapsedTime() value is assigned to the u_time uniform of the material using mesh.current.material.uniforms.u_time.value.
  • In the previous shaders, the useFrame hook updated the u_time uniform of the material directly using material.current.uniforms.u_time.value.
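The update pattern can be sketched outside React with a mocked mesh ref (the object shapes below are simplified stand-ins, not the real three.js classes):

```javascript
// Simplified stand-in for the mesh ref used in the WaterShader's useFrame hook.
const mesh = { current: { material: { uniforms: { u_time: { value: 0 } } } } };

// What the useFrame callback does each frame: copy the elapsed clock
// time into the shader's u_time uniform.
function onFrame(clock) {
  mesh.current.material.uniforms.u_time.value = clock.getElapsedTime();
}

// Simulate one frame at t = 1.5 seconds.
onFrame({ getElapsedTime: () => 1.5 });
```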

The fragmentShader

Explanation:

  • #define TAU 6.28318530718 defines the value of Tau, which is equal to 2π. It is used for angular calculations.
  • #define MAX_ITER 5 sets the maximum number of iterations for the loop, controlling the complexity and detail of the water effect.
  • float time = u_time * .5 + 23.0 scales the u_time uniform and adds an offset to control the speed of the animation.
  • vec2 uv = fragCoord.xy / u_resolution.xy normalizes the fragment coordinates (fragCoord) by dividing them by the resolution (u_resolution.xy).
  • vec2 p = mod(uv * TAU, TAU) - 250.0 wraps the scaled coordinates into a repeating pattern with the modulo operation, then subtracts a large constant offset (250.0) carried over from the original shader.
  • vec2 i = vec2(p) initializes the i vector with the current position.
  • float c = 1.0 initializes the c variable as 1.0, which accumulates the color intensity.
  • float inten = .005 represents the intensity factor of the water effect.

The subsequent for loop iterates MAX_ITER times, calculating the water effect:

  • float t = time * (1.0 - (3.5 / float(n + 1))) controls the time factor for each iteration.
  • i is updated using trigonometric functions applied to t and the previous i values.
  • c accumulates the intensity by adding the reciprocal of the length of a vector calculated from p divided by inten.
  • After the loop, c is divided by float(MAX_ITER) to obtain the average intensity.
  • c = 1.17 - pow(c, 1.4) applies a power function to manipulate the intensity and create a wave-like effect.
  • vec3 colour = vec3(pow(abs(c), 8.0)) calculates the color based on the modified intensity value.
  • colour = clamp(colour + vec3(0.0, 0.35, 0.5), 0.0, 1.0) applies additional color transformations and clamps the result between 0 and 1.
  • #ifdef SHOW_TILING is a preprocessor directive that checks if the SHOW_TILING flag is defined.

Within the conditional block:

  • vec2 pixel = 2.0 / u_resolution.xy calculates the size of a pixel in UV coordinates.
  • uv *= 2.0 scales the UV coordinates by 2.
  • float f = floor(mod(u_time * .5, 2.0)) calculates a flashing effect based on the u_time.
  • vec2 first = step(pixel, uv) * f creates a mask to rule out the first line of pixels based on pixel size and f.
  • uv = step(fract(uv), pixel) adds one line of pixels per tile.
  • colour = mix(colour, vec3(1.0, 1.0, 0.0), (uv.x + uv.y) * first.x * first.y) mixes the color with yellow along the tile borders.
  • Finally, fragColor = vec4(colour, 1.0) assigns the resulting color to the output fragment color.

This shader uses mathematical calculations, trigonometric functions, and iterations to generate a visually pleasing water effect. The values of the uniforms, such as u_time and u_resolution, can be modified to control the animation speed and resolution, respectively.
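For a feel of the math, here is a single-pixel JavaScript port of the loop. This is a sketch: the exact trigonometric update of i is an assumption based on the classic caustics shader this effect resembles, so treat it as illustrative rather than a line-for-line copy:

```javascript
// Single-pixel JavaScript port (sketch) of the water-effect loop.
// The exact update of `ix`/`iy` below is an assumption; treat it as illustrative.
const TAU = 6.28318530718;
const MAX_ITER = 5;

function waterIntensity(u, v, time) {
  // Wrap the scaled coordinates into a repeating pattern, then offset
  let px = ((u * TAU) % TAU) - 250.0;
  let py = ((v * TAU) % TAU) - 250.0;
  let ix = px;
  let iy = py;
  let c = 1.0;          // accumulated intensity
  const inten = 0.005;  // intensity factor

  for (let n = 0; n < MAX_ITER; n++) {
    const t = time * (1.0 - 3.5 / (n + 1)); // per-iteration time factor
    // Update i using trigonometric functions of t and the previous i
    const nix = px + Math.cos(t - ix) + Math.sin(t + iy);
    const niy = py + Math.sin(t - iy) + Math.cos(t + ix);
    ix = nix;
    iy = niy;
    // Add the reciprocal length of p divided by inten-scaled waves
    const lx = px / (Math.sin(ix + t) / inten);
    const ly = py / (Math.cos(iy + t) / inten);
    c += 1.0 / Math.sqrt(lx * lx + ly * ly);
  }

  c /= MAX_ITER;                      // average intensity
  c = 1.17 - Math.pow(c, 1.4);        // shape into a wave-like falloff
  return Math.pow(Math.abs(c), 8.0);  // base value for the final color
}

const intensity = waterIntensity(0.3, 0.7, 1.0);
```

Running this per pixel (and per frame, with a changing time value) is exactly what the GPU does in parallel for the whole screen.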

A Sky Shader

The Cosmic Sky Shader is a breathtaking visual effect that replicates the vast expanse of the cosmos in a 3D environment. By harnessing advanced shader techniques, it creates a realistic representation of a starry sky, capturing the awe-inspiring beauty of the universe. This shader is designed to transport viewers to distant galaxies, nebulae, and celestial wonders, providing an immersive and otherworldly experience. With its ability to simulate the twinkling stars, cosmic dust, and celestial phenomena, the Cosmic Sky Shader adds a mesmerizing backdrop that evokes a sense of wonder and exploration.

🧠
GitHub repository with the full code implementation.

The fragmentShader

Explanation:

  • The code begins with importing the glslify library to enable the use of external GLSL code within the shader.
  • precision highp float; sets the precision for floating-point calculations to highp.
  • uniform vec2 iResolution; and uniform float iTime; declare the input uniforms for resolution and time.
  • varying vec2 vUv; is the varying variable that stores the UV coordinates interpolated from the vertex shader.
  • vec4 texture(sampler2D sampler, vec2 coord, float x) is a helper function to sample a texture at the specified coordinates.
  • void mainImage(out vec4, vec2 fragCoord) is the function declaration for the main shader logic.
  • void main() is the entry point of the shader. It initializes outfrag and fragCoord variables, then calls the mainImage function.
  • The subsequent function declarations (float mod289(float x), vec4 mod289(vec4 x), vec4 perm(vec4 x)) are utility functions used for noise generation.
  • float noise(vec3 p) is a function that generates Perlin-like noise based on a 3D position p. It uses pseudo-random gradients and interpolation to create smooth and natural-looking noise.
  • float field(in vec3 p, float s) is a function that calculates the field value at a given position p using a strength parameter s. It iteratively calculates the field strength based on the distance to the previous position and adds weighted contributions.
  • float field2(in vec3 p, float s) is a similar function to field, but with fewer iterations, representing the second layer of the effect.
  • vec3 nrand3(vec2 co) is a function that generates three pseudo-random values based on a 2D coordinate co. It uses trigonometric functions to produce varied and random-looking values.
  • void mainImage(out vec4 fragColor, in vec2 fragCoord) is the main shader logic. It begins by calculating the normalized UV coordinates and the corresponding position p.
  • freqs[0], freqs[1], freqs[2], and freqs[3] store the noise values for different frequencies and time.
  • The field values t and v are calculated using the field function and the UV coordinates. v represents the falloff effect at the edges of the screen.
  • A second layer is introduced with p2 and t2 using the field2 function.
  • Stars are added using a random seed and nrand3 function.
  • Finally, the fragColor is assigned the final color value based on the calculated field values, second layer, and stars.

This shader combines noise generation, field calculations, and randomization to create a cosmic sky effect with varying frequencies, field strengths, and star patterns.
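The mod289 and perm utilities are small enough to check by hand; here are scalar JavaScript ports of them (a sketch of the GLSL helpers described above):

```javascript
// Scalar JavaScript ports (sketch) of the GLSL noise utilities.
// mod289 keeps values in the range [0, 289), which keeps the permutation
// arithmetic within safe floating-point precision on the GPU.
function mod289(x) {
  return x - Math.floor(x * (1.0 / 289.0)) * 289.0;
}

// perm scrambles its input into a pseudo-random permutation value.
function perm(x) {
  return mod289((x * 34.0 + 1.0) * x);
}
```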

Conclusion

Shaders are powerful tools that enable the creation of stunning visual effects and immersive experiences in 3D environments. From simple shaders that manipulate colors and shapes to more complex shaders that simulate natural phenomena or cosmic scenes, the possibilities are endless. By harnessing the capabilities of tools like React Three Fiber, Three.js, and glslify, developers and artists can unlock their creativity and bring their visions to life. Whether you're exploring the depths of the ocean with a realistic water shader, gazing at the mesmerizing patterns of a psychedelic shader, or venturing into the cosmic realms with a sky shader, the world of shaders offers a vast playground for innovation and artistic expression. So, dive into the realm of shaders, experiment, and let your imagination soar. Goodbye and happy shader coding!