🟢 questions tend to be easier; I will give extensive hints (e.g., pseudocode, pointing out exactly where your bug is). 🟦 questions tend to be harder; I will give somewhat less extensive hints in person. 🏴 questions tend to be hardest.
a. (10 pts) 🟢 Make a texture, and then update its data (very slowly, on the CPU) to match the shadertoy starter code. NOTE: I will code some of this live in class.
✨ HINT
`uv` is the texture coordinates for a given pixel.
Recall texture coordinates go from (0, 0) in the lower left to (1, 1) in the upper right. You have to do a little math to compute these yourself for each pixel.
(NOTE! Don't confuse this with NDC, which goes from (-1, -1) to (1, 1).)
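That math might look something like this (a sketch, not the codebase's API; `Vec2`, `pixel_to_uv`, and `pixel_to_NDC` are names I made up, and I'm assuming `i` is the column and `j` is the row counted up from the bottom -- flip `j` first if your rows run top-down):

```cpp
#include <cassert>

struct Vec2 { double x, y; };

// map pixel (i, j) in an S x S image to texture coordinates in [0, 1]^2,
// with (0, 0) at the lower left and (1, 1) at the upper right
Vec2 pixel_to_uv(int i, int j, int S) {
    return { double(i) / (S - 1), double(j) / (S - 1) };
}

// same idea, but landing in NDC, which spans [-1, 1]^2 instead
Vec2 pixel_to_NDC(int i, int j, int S) {
    Vec2 uv = pixel_to_uv(i, j, S);        // first get [0, 1]...
    return { uv.x * 2 - 1, uv.y * 2 - 1 }; // ...then stretch to [-1, 1]
}
```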
GLSL syntax is delightfully flexible, if also somewhat intimidating.
Let's break down line 7.
vec3 col = 0.5 + 0.5 * cos(iTime + uv.xyx + vec3(0.0, 2.0, 4.0));
- a GLSL `vec3` is three floats (very similar to snail's `vec3`)
- `0.5` is being added to a GLSL `vec3`, so here it translates to `V3(0.5, 0.5, 0.5)` in snail
- for some GLSL `vec3 a;`, `cos(a)` is the same as `V3(cos(a.x), cos(a.y), cos(a.z))`
- `iTime` is the current time (I've already set up a time variable `t` for you); note that `iTime` is being added to a GLSL `vec3`, so in this case it translates to `V3(t, t, t)` in snail
- `uv.xyx` is an example of swizzling and translates to `V3(uv.x, uv.y, uv.x)` in snail

…clip2 and clip1 examples to start...
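Putting the pieces of the breakdown together, line 7 could be written out in snail-style C++ roughly like this (a sketch; `V3` and its operators here are a tiny hand-rolled stand-in, not snail itself):

```cpp
#include <cmath>
#include <cassert>

// minimal stand-in for snail's V3 -- just enough for this one line
struct V3 { double x, y, z; };
V3 operator+(V3 a, V3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
V3 operator+(double s, V3 a) { return V3{ s, s, s } + a; }   // scalar broadcasts
V3 operator*(double s, V3 a) { return { s * a.x, s * a.y, s * a.z }; }
V3 cos(V3 a) { return { std::cos(a.x), std::cos(a.y), std::cos(a.z) }; } // component-wise

// vec3 col = 0.5 + 0.5 * cos(iTime + uv.xyx + vec3(0.0, 2.0, 4.0));
V3 shadertoy_color(double t, double uv_x, double uv_y) {
    V3 uv_xyx = { uv_x, uv_y, uv_x };                  // the swizzle, by hand
    return 0.5 + 0.5 * cos(t + uv_xyx + V3{ 0.0, 2.0, 4.0 });
}
```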
- (20 pts) 🟢 draw the triangles in a single solid color (e.g., blue)
NOTE: the cycle and tilt examples should show up on the film plane
- (5 pts) 🟢 NDC color interpolation (perspective-_incorrect_ interpolation; interpolate with NDC barycentric weights)
NOTE: you should see the cycle example triangles in the correct colors but with _incorrect overlap_
NOTE: you should see that the tilt example looks decent (but slightly wrong)
NOTE: you don't need the pdf yet
✨ HINT
NDC (perspective-_incorrect_) color interpolation works like this (recall that homework where you dragged the colorful dot around inside the boundary of a triangle):

perspective_incorrect_color = alpha_NDC * color_a + beta_NDC * color_b + gamma_NDC * color_c
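That formula is a component-wise blend; written out as a function it might look like this (a sketch with made-up names, assuming the weights already sum to 1):

```cpp
#include <cassert>

struct V3 { double x, y, z; };

// perspective-INcorrect interpolation: blend the three vertex colors
// using the barycentric weights computed in NDC
V3 interp_color_NDC(double alpha, double beta, double gamma,
                    V3 color_a, V3 color_b, V3 color_c) {
    return { alpha * color_a.x + beta * color_b.x + gamma * color_c.x,
             alpha * color_a.y + beta * color_b.y + gamma * color_c.y,
             alpha * color_a.z + beta * color_b.z + gamma * color_c.z };
}
```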
`V3(.5, .5, .5) + .5 * face_normal`
NOTE: the bunny should look fairly fabulous
NOTE: `face_normal` here refers to triangle (a, b, c)'s unit normal vector (I happened to compute it in world coordinates)
HINT: run this in release mode (--release) to make it a bit faster
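Computing that face normal and coloring could be sketched like this (my own helper names, not the codebase's; note the normal's direction depends on the triangle's winding order):

```cpp
#include <cmath>
#include <cassert>

struct V3 { double x, y, z; };
V3 sub(V3 a, V3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
V3 cross(V3 a, V3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
V3 normalized(V3 a) {
    double L = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    return { a.x / L, a.y / L, a.z / L };
}

// unit normal of triangle (a, b, c): cross the two edge vectors
V3 face_normal(V3 a, V3 b, V3 c) {
    return normalized(cross(sub(b, a), sub(c, a)));
}

// the coloring from above: squish each component from [-1, 1] into [0, 1]
V3 normal_color(V3 n) {
    return { .5 + .5 * n.x, .5 + .5 * n.y, .5 + .5 * n.z };
}
```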
- (10 pts) 🟦 implement a working z buffer
NOTE: you may need the pdf that i shipped with the codebase (but can probably get away without using it)
NOTE: you should see that the cycle example has correct overlap
💡 let's talk about that pdf real quick
note: the pdf uses a different convention (z-axis points forward instead of -z-axis); this doesn't change the result of the derivation
tl;dr: when the pdf author uses linear interpolation between two points, you can substitute barycentric interpolation
✨✨✨ EXPLANATION OF EQUATION 12 IN THE PDF
say we want to color pixel (i, j). this pixel has position p_NDC in NDC (from the whole "double(i) / (S - 1) * 2 - 1" thing)

how did we get here? well, we projected triangle (a, b, c) into NDC, and discovered that p_NDC is inside of triangle (a_NDC, b_NDC, c_NDC)

in particular, we found that its NDC barycentric weights alpha_NDC, beta_NDC, gamma_NDC are all greater than 0 (i.e., it passed the inside-triangle test we briefly went over earlier in the course)

now, in addition to being the pixel position in NDC, p_NDC also corresponds to a point p _on the triangle_ (not a vertex, just a point p on the triangle)

in principle, we could write this point in camera coordinates as p_camera (to cross-reference the pdf, p_camera is the point C = (X_t, Z_t))

note that we do not have p_camera :( we have a_camera, b_camera, c_camera, as well as a_NDC, b_NDC, c_NDC... but no p_camera 😥

wait, but why do we care about p_camera at all? well, what we really care about is p_camera.z (which i'll notate p_camera_z); this is the z-component of the point on the triangle we are currently drawing *in camera coordinates* (in the pdf, p_camera_z is Z_t)

p_camera_z would enable us to cleanly do depth buffering and clip planes, and also enable perspective-correct color interpolation

the first punchline of the pdf is Equation 12, which says that we can compute p_camera_z *using the NDC barycentric weights*

in words, we barycentrically interpolate the _reciprocal_ of z_camera, and then take the reciprocal of that

in pseudocode, we compute

p_camera_z = 1. / (alpha_NDC * (1 / a_camera.z) + beta_NDC * (1 / b_camera.z) + gamma_NDC * (1 / c_camera.z))

assuming Equation 12 is in fact true, we have recovered p_camera_z! 🎉

bonus: implementing clip planes is now a one liner 🎉

but we didn't come here to talk about clip planes; we came to this exceptionally long, drawn out hint to talk about depth buffering

let's assume p_camera_z is between our clip planes, i.e.
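That Equation-12 pseudocode, written out as an actual function (a sketch; the names are mine, not the codebase's):

```cpp
#include <cassert>

// Equation 12: recover the camera-space z of the point on the triangle
// by barycentrically interpolating the RECIPROCAL of z_camera (using the
// NDC weights), then taking the reciprocal of the result.
// az, bz, cz are a_camera.z, b_camera.z, c_camera.z (negative in our convention).
double recover_p_camera_z(double alpha_NDC, double beta_NDC, double gamma_NDC,
                          double az, double bz, double cz) {
    return 1.0 / (alpha_NDC * (1.0 / az)
                + beta_NDC  * (1.0 / bz)
                + gamma_NDC * (1.0 / cz));
}
```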
in the range [renderer_f, renderer_n]

note that renderer_f, renderer_n are NEGATIVE in our convention

we want to map p_camera_z to "depth," which for us should range from 0 to 255 (recall that depth_buffer.data is an array of unsigned char's)

the specific map we want is [f, n] -> [255, 0] (since we initialize the depth buffer to full red)

we can use the contents of depth_buffer.data to do depth testing

- (5 pts) 🟦 implement clip planes
NOTE: you will likely need the pdf that i shipped with the codebase
NOTE: you should check that scrubbing renderer_f and renderer_n clips properly
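The clip-plane test and the [f, n] -> [255, 0] depth map described above could be sketched like this (my names, not the codebase's; remember both planes are negative, with renderer_f < renderer_n < 0):

```cpp
#include <cassert>

// the promised one-liner: keep the fragment only if p_camera_z lies
// between the far and near planes
bool between_clip_planes(double p_camera_z, double renderer_f, double renderer_n) {
    return renderer_f <= p_camera_z && p_camera_z <= renderer_n;
}

// map [renderer_f, renderer_n] -> [255, 0]: the near plane maps to depth 0,
// the far plane to depth 255 (matching a buffer initialized to full red)
unsigned char z_to_depth(double p_camera_z, double renderer_f, double renderer_n) {
    double s = (p_camera_z - renderer_n) / (renderer_f - renderer_n); // 0 at n, 1 at f
    return (unsigned char)(255.0 * s + 0.5);                          // round to a byte
}
```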
- (5 pts) 🟦 perspective-correct color interpolation
NOTE: you will _definitely_ need the pdf that i shipped with the codebase
NOTE: the tilt example should look correct (the difference is a bit subtle)
HINT: see Equation 16 of the pdf
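Equation 16 itself lives in the pdf; the standard perspective-correct form it should boil down to (cross-check the exact expression against the pdf before trusting my version) divides each NDC weight by that vertex's camera-space z, renormalizes, and then blends:

```cpp
#include <cassert>

struct V3 { double x, y, z; };

// perspective-correct color interpolation: reweight the NDC barycentric
// weights by 1/z_camera per vertex, renormalize so they sum to 1, blend
V3 interp_color_perspective(double alpha, double beta, double gamma,
                            double az, double bz, double cz,
                            V3 ca, V3 cb, V3 cc) {
    double wa = alpha / az, wb = beta / bz, wc = gamma / cz;
    double W = wa + wb + wc;
    wa /= W; wb /= W; wc /= W;
    return { wa * ca.x + wb * cb.x + wc * cc.x,
             wa * ca.y + wb * cb.y + wc * cc.y,
             wa * ca.z + wb * cb.z + wc * cc.z };
}
```

When all three vertices are at the same depth, this collapses back to plain NDC interpolation, which is why the tilt example only looks _slightly_ wrong without it.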
NOTE: Now, assuming all that was done perfectly...
- (? pts) 🏴? (but with hints) clip triangles to the near plane
NOTE: you may have to rewrite and refactor large parts of the app
NOTE: the clip2 and clip1 examples should no longer be broken
HINT: the triangle in clip2 will become 1 (different) triangle after clipping
HINT: the triangle in clip1 will become 2 (different) triangles after clipping
HINT: simon yeung's writeup
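One way to get the 1-vs-2-triangle behavior above is Sutherland-Hodgman clipping against the single near plane (a sketch with my own names; interpolating colors and other attributes at the crossing points is omitted, and you'd still need to re-project the new vertices):

```cpp
#include <array>
#include <vector>
#include <cassert>

struct V3 { double x, y, z; };

// point where segment (a, b) crosses the plane z == z_near
static V3 cross_near(V3 a, V3 b, double z_near) {
    double t = (z_near - a.z) / (b.z - a.z);
    return { a.x + t * (b.x - a.x), a.y + t * (b.y - a.y), z_near };
}

// Clip one triangle (camera coordinates, -z forward) against the near
// plane, keeping the part with z <= z_near. Walking the edges yields a
// polygon with 0 vertices (fully clipped), 3 (0 or 2 vertices clipped),
// or 4 (exactly 1 vertex clipped), which we fan-triangulate.
std::vector<std::array<V3, 3>> clip_triangle_to_near(const V3 tri[3], double z_near) {
    std::vector<V3> poly;
    for (int i = 0; i < 3; ++i) {
        V3 curr = tri[i], next = tri[(i + 1) % 3];
        bool curr_in = (curr.z <= z_near);
        bool next_in = (next.z <= z_near);
        if (curr_in) poly.push_back(curr);                          // keep inside vertices
        if (curr_in != next_in)                                     // edge crosses the plane:
            poly.push_back(cross_near(curr, next, z_near));         // emit the crossing point
    }
    std::vector<std::array<V3, 3>> result;
    for (int i = 2; i < (int) poly.size(); ++i)                     // fan: 3-gon -> 1 tri, 4-gon -> 2
        result.push_back({ poly[0], poly[i - 1], poly[i] });
    return result;
}
```

The clip2 case (two vertices clipped) leaves a 3-gon, hence 1 new triangle; the clip1 case (one vertex clipped) leaves a 4-gon, hence 2.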