ECSE 446/546: Realistic/Advanced Image Synthesis
Assignment 2: Direct Illumination
1 Assignment Policies
All future assignments, including this one, will build on top of Assignment 1.
1.1 Assignment Submission
Gather all the python source files within the taichi_tracer/ folder (i.e., everything except the scene_data_dir/ folder) and compress them into a single zip file. Name your zip file according to your student ID, as:
YourStudentID.zip
For example, if your ID is 234567890, your submission filename should be 234567890.zip.
DO NOT ADD ANYTHING BEFORE OR AFTER THE MCGILL ID.
Every time you submit a new file on myCourses, your previous submission will be overwritten. We will only grade the final submitted file, so feel free to submit often as you progress through the assignment.
In accordance with article 15 of the Charter of Students' Rights, students may submit any written or programming components in either French or English.
1.2 Late policy
All assignments are to be completed individually. You are expected to respect the late day policy and collaboration/plagiarism policies, discussed below.
Late Day Allotment and Late Policy
Every student will be allowed a total of six (6) late days during the entire semester, without penalty. Specifically, failure to submit a (valid) assignment on time will result in a late day (rounded up to the nearest day) being deducted from the student's late day allotment. Once the late day allotment is exhausted, any further late submissions will obtain a score of 0%. Exceptional circumstances will be treated as per McGill's Policies on Student Rights and Responsibilities.
If you require an accommodation, please advise McGill Student Accessibility and Achievement (514-398-6009) as early in the semester as possible. In the event of circumstances beyond our control, the evaluation scheme as detailed on the course website and on assignment handouts may require modification.
1.3 Collaboration & Plagiarism
Plagiarism is an academic offense of misrepresenting authorship. This can result in penalties up to expulsion. It is also possible to plagiarise your own work, e.g., by submitting work from another course without proper attribution. When in doubt, attribute!
You are expected to submit your own work. Assignments are individual tasks. This does not need to preclude forming an environment where you can be comfortable discussing ideas with your classmates. When in doubt, some good rules to follow include:
fully understand every step of every solution you submit,
only submit solution code that was written (not copy/pasted/modified, not ChatGPT'ed, etc.) by you, and
never refer to another student's code — if at all possible, we recommend that you avoid looking at another classmate's code.
McGill values academic integrity and students should take the time to fully understand the meaning and consequences of cheating, plagiarism and other academic offenses (as defined in the Code of Student Conduct and Disciplinary Procedures).
Computational plagiarism detection tools are employed as part of the evaluation procedure in ECSE 446/546. Students may only be notified of potential infractions at the end of the semester.
Additional policies governing academic issues which affect students can be found in the Handbook on Student Rights and Responsibilities.
2 Pixel Anti-aliasing and Progressive Renderer
When generating your eye rays in Assignment 1, we exclusively sampled a viewing direction through the center of each square pixel. When the directly visible geometry varies spatially at a rate higher than our pixel grid resolution, the resulting image can suffer from so-called “jaggies” — aliasing artifacts that manifest themselves primarily at the silhouettes of visible objects:
Note the aliasing artifacts around object silhouettes.
One simple strategy to eliminate these artifacts is to anti-alias (AA) our image, i.e., by super-sampling eye ray directions over each pixel's area.
Concretely, for each pixel, instead of considering a single ray through its center, we will average the contribution (e.g., the shading) across many primary rays. These jittered eye ray directions are generated by picking a random location (uniformly over the area of the pixel) when generating your primary rays:
Pixel Anti-Aliasing.
When averaging the shading results across many such jittered rays per pixel (the sample count is often referred to as SPP, Samples Per Pixel), the noise is eventually averaged away, and so too is the aliasing:
Averaging jittered eye rays through each pixel.
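As a concrete reference, here is a minimal sketch of the jittering logic in Taichi; the function name and signature are hypothetical, since in practice this logic belongs inside your existing generate_eye_rays():

```python
import taichi as ti
import taichi.math as tm

ti.init(arch=ti.cpu)

@ti.func
def pixel_sample_coords(x, y, jitter):
    # A 0.5 offset selects the pixel centre; with jittering enabled we
    # instead pick a location uniformly over the pixel's unit area.
    offset_x = 0.5
    offset_y = 0.5
    if jitter:
        offset_x = ti.random()
        offset_y = ti.random()
    return tm.vec2(x + offset_x, y + offset_y)
```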
Deliverable 1 [10 points]
Modify your generate_eye_rays() function to generate a jittered ray when jitter is True
Modify your render() function in the A1Renderer to progressively average the renders in the canvas
Implement the A2Renderer's render() function as a progressive renderer, similarly to the A1Renderer
We will not grade your A1Renderer rendering results with jittered eye rays; however, this is a great way to ensure that your jittering is implemented correctly before moving on to the next deliverables.
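To illustrate the progressive-averaging idea, here is a minimal sketch; the field names (canvas, iter_counter) and the random stand-in for per-pixel shading are assumptions, not the starter code's actual API:

```python
import taichi as ti
import taichi.math as tm

ti.init(arch=ti.cpu)

canvas = ti.Vector.field(3, dtype=float, shape=(512, 512))
iter_counter = ti.field(dtype=float, shape=())

@ti.kernel
def progressive_render():
    iter_counter[None] += 1.0
    for x, y in canvas:
        # Stand-in for your per-pixel 1-sample estimate (e.g., shade_ray()).
        sample = tm.vec3(ti.random(), ti.random(), ti.random())
        # Incremental running mean: after k calls, canvas holds the
        # average of the k samples taken at each pixel.
        canvas[x, y] += (sample - canvas[x, y]) / iter_counter[None]
```

Calling progressive_render() repeatedly makes the canvas converge toward each pixel's expected value, which is exactly how the effective SPP grows over time.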
3 General Direct Illumination
This assignment treats the generic direct illumination equation:

$$L(x, \omega_o) = \int_{\mathcal{S}^2} L_e(x, \omega_i)\, f_r(x, \omega_i, \omega_o)\, V(x, \omega_i)\, \max(0, \cos\theta_i)\, \mathrm{d}\omega_i$$

where $L_e$ is the emitted (environment) radiance, $f_r$ is the BRDF, $V$ is the binary visibility between $x$ and the light along $\omega_i$, and $\theta_i$ is the angle between $\omega_i$ and the surface normal at $x$.
We will now consider a few parameters in the Materials class, which encapsulates BRDFs. When the value of the parameter Material.Ns is 1, the object's BRDF is diffuse and the diffuse reflectance $\rho_d$ is given by Material.Kd. If, however, Material.Ns is greater than 1, the object's BRDF is Phong, with the Material.Ns parameter corresponding to the Phong exponent $n_s$, and Material.Kd corresponding to the specular reflectance $\rho_s$.
In this specialized setting, the BRDF in our direct illumination equation can be defined as

$$f_r(x, \omega_i, \omega_o) = \begin{cases} \dfrac{\rho_d}{\pi}, & \text{if } n_s = 1 \text{ (diffuse)} \\[6pt] \rho_s\, \dfrac{n_s + 2}{2\pi}\, \max(0, \omega_r \cdot \omega_i)^{n_s}, & \text{if } n_s > 1 \text{ (Phong)} \end{cases}$$

where the reflected view direction $\omega_r = 2(\omega_o \cdot n)\,n - \omega_o$, and $\omega_o$ is oriented from the shading point to the viewer (i.e., the opposite direction of your eye rays).
Note here that, when shading with the Phong BRDF, there are two cosine terms in your direct illumination equation: the foreshortening term about the normal, and the Phong cosine (exponent) reflection lobe; in the diffuse setting, the latter — view-dependent — cosine is no longer present. Appropriately taking this into account will be necessary for all your estimators, with additional attention required when implementing the second assignment deliverable (BRDF Importance Sampling).
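To make the two cases concrete, here is a minimal sketch of a BRDF evaluation routine under the conventions above; the function and parameter names are assumptions (in the starter code, kd and ns would come from Material.Kd and Material.Ns):

```python
import taichi as ti
import taichi.math as tm

ti.init(arch=ti.cpu)

@ti.func
def brdf_eval(kd, ns, w_i, w_o, normal):
    # kd: Material.Kd (diffuse or specular reflectance), ns: Material.Ns
    result = tm.vec3(0.0)
    if ns == 1.0:
        result = kd / tm.pi  # diffuse: rho_d / pi
    else:
        # Reflect the (surface-to-viewer) direction w_o about the normal.
        w_r = 2.0 * tm.dot(normal, w_o) * normal - w_o
        cos_alpha = ti.max(0.0, tm.dot(w_r, w_i))
        # Normalized Phong lobe: rho_s * (n + 2) / (2 pi) * cos(alpha)^n
        result = kd * (ns + 2.0) / (2.0 * tm.pi) * cos_alpha**ns
    return result
```

Note that this evaluates only the BRDF itself; the foreshortening term $\max(0, \cos\theta_i)$ belongs in your estimator, not here.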
3.1 Environment Light Map
In this assignment, we will treat environment lights, which are implemented in the Environment class in the environment.py file. This class takes as input a 2D RGB image corresponding to the (spherical) environment light that surrounds your scene; we will assume the environment is infinitely distant from every shading point.
You will implement the query_ray() function, which takes a Ray as input and returns a tm.vec3 representing the RGB color of the texel in the environment map “intersected” by the ray. The environment maps we use are encoded in an equi-rectangular spherical parameterization; for a ray direction $\omega = (\omega_x, \omega_y, \omega_z)$, a mapping of this form is:

$$\phi = \operatorname{atan2}(\omega_x, \omega_z), \qquad \theta = \operatorname{asin}(\omega_y), \qquad u = \frac{\phi}{2\pi} + \frac{1}{2}, \qquad v = \frac{1}{2} - \frac{\theta}{\pi}$$
Note that both of the trigonometric functions above are available through the taichi math module as tm.atan2 and tm.asin.
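One possible implementation of the direction-to-UV lookup is sketched below; the axis and sign conventions (i.e., which direction maps to the centre of the image) are assumptions that you should check against the provided environment maps:

```python
import taichi as ti
import taichi.math as tm

ti.init(arch=ti.cpu)

@ti.func
def equirect_uv(direction):
    d = tm.normalize(direction)
    phi = tm.atan2(d.x, d.z)  # longitude in [-pi, pi]; convention is scene-dependent
    theta = tm.asin(d.y)      # latitude in [-pi/2, pi/2]
    u = phi / (2.0 * tm.pi) + 0.5
    v = 0.5 - theta / tm.pi
    return tm.vec2(u, v)      # scale by the image resolution to index a texel
```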
4 Uniform Spherical Importance Sampling
The first Monte Carlo estimator you will implement is the simplest one: it employs uniform spherical importance sampling.
Recall from the lecture that you can generate a uniformly-sampled ray direction on the sphere using two canonical random variables $\xi_1, \xi_2 \in [0, 1)$ — computed using ti.random() — and the following transformation:

$$\omega = \left(\sqrt{1 - z^2}\,\cos(2\pi\xi_2),\ \sqrt{1 - z^2}\,\sin(2\pi\xi_2),\ z\right), \qquad \text{with } z = 1 - 2\xi_1$$
Here, the sampling PDF is $p_{\text{uniform}}(\omega) = \frac{1}{4\pi}$.
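A sketch of this sampler, with the constant PDF alongside it (the names are assumptions):

```python
import taichi as ti
import taichi.math as tm

ti.init(arch=ti.cpu)

UNIFORM_SPHERE_PDF = 1.0 / (4.0 * tm.pi)  # p(omega) is constant over the sphere

@ti.func
def sample_uniform_sphere():
    xi_1 = ti.random()
    xi_2 = ti.random()
    z = 1.0 - 2.0 * xi_1                   # uniform in [-1, 1]
    r = ti.sqrt(ti.max(0.0, 1.0 - z * z))  # radius of the horizontal slice at height z
    phi = 2.0 * tm.pi * xi_2
    return tm.vec3(r * tm.cos(phi), r * tm.sin(phi), z)
```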
You can now proceed with evaluating your uniform spherical importance sampling MC estimator by tracing these sampled shadow rays — from the shading point $x$ towards $\omega_i$ — as:

$$\langle L(x, \omega_o) \rangle = \frac{L_e(\omega_i)\, f_r(x, \omega_i, \omega_o)\, V(x, \omega_i)\, \max(0, \cos\theta_i)}{p_{\text{uniform}}(\omega_i)}$$
Keep in mind that your integrator should only return a 1-sample estimate, as the progressive renderer (Deliverable 1) will accumulate and average these estimates with increasing SPP.
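Putting the pieces together, a single-sample shade routine might look as follows. This sketch reuses sample_uniform_sphere(), UNIFORM_SPHERE_PDF, and brdf_eval() from the sketches above, and the visible()/env_radiance() stubs stand in for your scene's shadow-ray intersection test and Environment.query_ray():

```python
@ti.func
def visible(x, w_i):
    # Stub: replace with a shadow-ray intersection test against the scene.
    return True

@ti.func
def env_radiance(w_i):
    # Stub: replace with Environment.query_ray() on a ray towards w_i.
    return tm.vec3(1.0)

@ti.func
def shade_uniform(x, normal, w_o, kd, ns):
    w_i = sample_uniform_sphere()
    L = tm.vec3(0.0)
    if visible(x, w_i):  # V(x, w_i)
        cos_theta = ti.max(0.0, tm.dot(normal, w_i))
        f_r = brdf_eval(kd, ns, w_i, w_o, normal)
        # Divide by the constant PDF p = 1 / (4 pi).
        L = env_radiance(w_i) * f_r * cos_theta / UNIFORM_SPHERE_PDF
    return L
```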
Uniform Spherical Importance Sampling @ 512×512, 100 SPP
5 BRDF Importance Sampling
The next MC estimator you will implement will employ BRDF importance sampling. Here, the sampling PDF you will implement depends on whether you are shading a diffuse or a glossy Phong point, as:

$$p(\omega_i) = \begin{cases} \dfrac{\max(0, \cos\theta_i)}{\pi}, & \text{diffuse} \\[6pt] \dfrac{n_s + 1}{2\pi}\,\max(0, \omega_r \cdot \omega_i)^{n_s}, & \text{glossy Phong} \end{cases}$$
You can draw samples in two stages: first drawing them in a canonical orientation aligned with the $z$-axis, before rotating them into an appropriate coordinate system at the shade point.
For diffuse surfaces, you will draw samples according to a (normalized) cosine lobe aligned about the shading normal and, for glossy Phong surfaces, a (normalized) cosine-power lobe aligned about the reflected outgoing viewing direction $\omega_r$.
Conveniently, the sampling routine for the canonical orientation is parameterized by an exponent $n$ and can generate both cosine ($n = 1$) and cosine-power ($n = n_s$) distributions, as:

$$\omega = \left(\sin\theta\cos\phi,\ \sin\theta\sin\phi,\ \cos\theta\right), \qquad \text{with } \cos\theta = \xi_1^{1/(n+1)} \text{ and } \phi = 2\pi\xi_2$$
Sampled directions (yellow arrows) and BRDF lobes (blue outline) for diffuse and glossy Phong surfaces.
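A sketch of this canonical sampler follows (the function name is an assumption); passing exponent = 1 yields the diffuse cosine lobe, and exponent = Material.Ns yields the glossy cosine-power lobe:

```python
import taichi as ti
import taichi.math as tm

ti.init(arch=ti.cpu)

@ti.func
def sample_cosine_power_canonical(exponent):
    # Samples a direction about the +z axis with density
    # (exponent + 1) / (2 pi) * cos(theta)^exponent.
    xi_1 = ti.random()
    xi_2 = ti.random()
    cos_theta = xi_1 ** (1.0 / (exponent + 1.0))
    sin_theta = ti.sqrt(ti.max(0.0, 1.0 - cos_theta * cos_theta))
    phi = 2.0 * tm.pi * xi_2
    return tm.vec3(sin_theta * tm.cos(phi), sin_theta * tm.sin(phi), cos_theta)
```

You would then rotate this canonical sample so that $+z$ aligns with the shading normal (diffuse) or with $\omega_r$ (glossy Phong), e.g., by constructing an orthonormal basis about that axis.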
You can now similarly proceed with evaluating your BRDF importance sampling MC estimator by tracing these sampled shadow rays — from your hit point towards $\omega_i$ — as:

$$\langle L(x, \omega_o) \rangle = \frac{L_e(\omega_i)\, f_r(x, \omega_i, \omega_o)\, V(x, \omega_i)\, \max(0, \cos\theta_i)}{p(\omega_i)}$$
keeping in mind that, depending on the type of surface, the PDF will “cancel out” different terms above: for diffuse surfaces, $\frac{f_r\, \max(0, \cos\theta_i)}{p(\omega_i)}$ will simplify to $\rho_d$ and, for glossy Phong surfaces, it will simplify to $\rho_s\, \frac{n_s + 2}{n_s + 1}\, \max(0, \cos\theta_i)$. While you can implement these optimizing simplifications to reduce superfluous computation, simply substituting into the MC estimator expression above will yield the same result.
As before, your integrator should only output a single 1-sample estimate, as your progressive rendering loop will appropriately accumulate and average these individual estimates.
BRDF Importance Sampling @ 512×512, 100 SPP
Deliverable 2 [15 points]
Implement the shading logic of both uniform spherical importance sampling and BRDF importance sampling in the A2Renderer's shade_ray() function
The shading function will employ either uniform spherical or BRDF importance sampling, based on the SamplingMode flag.