
Scriptable Render Pipeline-Lights

https://catlikecoding.com/unity/tutorials/scriptable-render-pipeline/lights/

this is the third installment of a tutorial series covering unity’s scriptable render pipeline. this time we will add support for diffuse lighting, by shading up to eight lights per object with a single draw call.

1 shading with a light

to support lights, we have to add a lit shader to our pipeline. lighting complexity can go from very simple – only including diffuse light – to very complex – full-blown physically-based shading. it can also be unrealistic, like cel shading. we will start with the minimum of a lit shader that calculates diffuse directional lighting, without shadows.

1.1 shader

duplicate Unlit.hlsl and rename it to Lit.hlsl. replace all instances of unlit in the new file with lit, specifically the include define and the vertex and fragment function names.

#ifndef MYRP_LIT_INCLUDED
#define MYRP_LIT_INCLUDED

…

VertexOutput LitPassVertex (VertexInput input) {
	…
}

float4 LitPassFragment (VertexOutput input) : SV_TARGET {
	…
}

#endif // MYRP_LIT_INCLUDED
           

also duplicate Unlit.shader and rename it to Lit.shader, again replacing unlit with lit in the new file.

Shader "My Pipeline/Lit" {
	
	Properties {
		_Color ("Color", Color) = (1, 1, 1, 1)
	}
	
	SubShader {
		
		Pass {
			HLSLPROGRAM
			
			#pragma target 3.5
			
			#pragma multi_compile_instancing
			#pragma instancing_options assumeuniformscaling
			
			#pragma vertex LitPassVertex
			#pragma fragment LitPassFragment
			
			#include "../ShaderLibrary/Lit.hlsl"
			
			ENDHLSL
		}
	}
}
           

now we can create a lit opaque material with the new shader, although it still does exactly the same as the unlit variant.

1.2 normal vectors

in order to calculate the contribution of a directional light, we need to know the surface normal. so we have to add a normal vector to both the vertex input and output structures. for a detailed description of how the lighting is calculated, see rendering 4, the first light.

struct VertexInput {
	float4 pos : POSITION;
	float3 normal : NORMAL;
	UNITY_VERTEX_INPUT_INSTANCE_ID
};

struct VertexOutput {
	float4 clipPos : SV_POSITION;
	float3 normal : TEXCOORD0;
	UNITY_VERTEX_INPUT_INSTANCE_ID
};
           

convert the normal from object space to world space in LitPassVertex. as we assume that we are only using uniform scales, we can simply use the 3x3 part of the model matrix, followed by normalization per fragment in LitPassFragment. support for nonuniform scales would require us to use a transposed world-to-object matrix instead.

VertexOutput LitPassVertex (VertexInput input) {
	…
	output.normal = mul((float3x3)UNITY_MATRIX_M, input.normal);
	return output;
}

float4 LitPassFragment (VertexOutput input) : SV_TARGET {
	UNITY_SETUP_INSTANCE_ID(input);
	input.normal = normalize(input.normal);
	…
}
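as a plain-math sketch of why this works, here is the same transform outside the shader (hypothetical helper names, not part of the pipeline): under a uniform scale the 3×3 model matrix just scales the normal, so normalizing per fragment recovers a unit vector.

```python
import math

def mul3x3(m, v):
    # multiply a row-major 3x3 matrix by a 3-component vector
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

# uniform scale of 2 combined with a 90-degree rotation about X
s = 2.0
model3x3 = [
    [s,   0.0, 0.0],
    [0.0, 0.0, -s ],
    [0.0, s,   0.0],
]

normal = [0.0, 1.0, 0.0]  # object-space up
world_normal = normalize(mul3x3(model3x3, normal))
# the uniform scale drops out after normalization: rotated up becomes +Z
```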
           

to verify that we end up with correct normal vectors, use them for the final color. but still keep track of the material’s color, as we will use that for its albedo later.

float4 LitPassFragment (VertexOutput input) : SV_TARGET {
	UNITY_SETUP_INSTANCE_ID(input);
	input.normal = normalize(input.normal);
	float3 albedo = UNITY_ACCESS_INSTANCED_PROP(PerInstance, _Color).rgb;
	
	float3 color = input.normal;
	return float4(color, 1);
}
           

1.3 diffuse light

the diffuse light contribution depends on the angle at which light hits the surface, which is found by computing the dot product of the surface normal and the direction where the light is coming from, discarding negative results. in the case of a directional light, the light vector is constant. let us use a hard-coded direction for now, pointing straight up. multiply the diffuse light with the albedo to get the final color.

float4 LitPassFragment (VertexOutput input) : SV_TARGET {
	UNITY_SETUP_INSTANCE_ID(input);
	input.normal = normalize(input.normal);
	float3 albedo = UNITY_ACCESS_INSTANCED_PROP(PerInstance, _Color).rgb;
	
	float3 diffuseLight = saturate(dot(input.normal, float3(0, 1, 0)));
	float3 color = diffuseLight * albedo;
	return float4(color, 1);
}
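the saturate(dot(n, l)) term is easy to check with plain numbers (an illustrative sketch, not the shader itself):

```python
def saturate(x):
    # clamp to the 0-1 range, like HLSL's saturate
    return min(max(x, 0.0), 1.0)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

up = [0.0, 1.0, 0.0]  # hard-coded light direction, straight up

facing = saturate(dot([0.0, 1.0, 0.0], up))        # surface facing the light
grazing = saturate(dot([1.0, 0.0, 0.0], up))       # light grazes the surface
tilted = saturate(dot([0.0, 0.6, 0.8], up))        # partial contribution
facing_away = saturate(dot([0.0, -1.0, 0.0], up))  # negative dot is discarded
```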
           

2 visible lights

to be able to use lights defined in the scene, our pipeline has to send the light data to the gpu. it is possible to have multiple lights in a scene, so we should support multiple lights too. there are multiple ways to do that. unity’s default pipeline renders each light in a separate pass, per object. the lightweight pipeline renders all lights in a single pass, per object. and the HD pipeline uses deferred rendering, which renders the surface data of all objects, followed by one pass per light.

we are going to use the same approach as the lightweight pipeline, so each object is rendered once, taking all lights into consideration. we do that by sending the data of all lights that are currently visible to the gpu. lights that are in the scene but do not affect anything that is going to be rendered will be ignored.

2.1 light buffer

rendering all lights in one pass means that lighting data must be available at the same time. limiting ourselves to directional lights only for now, that means we need to know both the color and the direction of each light. to support an arbitrary amount of lights, we will use arrays to store this data, which we will put in a separate buffer that we will name _LightBuffer. arrays are defined in shaders like in C#, except the brackets come after the variable name instead of the type.

CBUFFER_START(UnityPerDraw)
	float4x4 unity_ObjectToWorld;
CBUFFER_END

CBUFFER_START(_LightBuffer)
	float4 _VisibleLightColors[];
	float4 _VisibleLightDirections[];
CBUFFER_END
           

however, we cannot define arrays of arbitrary size. the array definition must immediately declare its size. let us use an array length of 4. that means that we can support up to four visible lights at once. define this limit with a macro for easy reference.

#define MAX_VISIBLE_LIGHTS 4

CBUFFER_START(_LightBuffer)
	float4 _VisibleLightColors[MAX_VISIBLE_LIGHTS];
	float4 _VisibleLightDirections[MAX_VISIBLE_LIGHTS];
CBUFFER_END
           

below the light buffer, add a DiffuseLight function that uses the light data to take care of the lighting calculation. it needs a light index and normal vector as parameters, extracts the relevant data from the arrays, then performs the diffuse lighting calculation and returns it, modulated by the light’s color.

float3 DiffuseLight (int index, float3 normal) {
	float3 lightColor = _VisibleLightColors[index].rgb;
	float3 lightDirection = _VisibleLightDirections[index].xyz;
	float diffuse = saturate(dot(normal, lightDirection));
	return diffuse * lightColor;
}
           

in LitPassFragment, use a for loop to invoke the new function once per light, accumulating the total diffuse light affecting the fragment.

float4 LitPassFragment (VertexOutput input) : SV_TARGET {
	…
	
	float3 diffuseLight = 0;
	for (int i = 0; i < MAX_VISIBLE_LIGHTS; i++) {
		diffuseLight += DiffuseLight(i, input.normal);
	}
	float3 color = diffuseLight * albedo;
	return float4(color, 1);
}
           

note that even though we use a loop, the shader compiler will likely unroll it. as our shader becomes more complex, at some point the compiler will switch to using an actual loop.

2.2 filling the buffer

at the moment we end up with fully black shapes, because we are not passing any light data to the gpu yet. we have to add the same arrays to MyPipeline, with the same size. also, use the static Shader.PropertyToID method to find the identifiers of the relevant shader properties. the shader IDs are constant per session, so they can be stored in static variables.

const int maxVisibleLights = 4;
	
	static int visibleLightColorsId =
		Shader.PropertyToID("_VisibleLightColors");
	static int visibleLightDirectionsId =
		Shader.PropertyToID("_VisibleLightDirections");
	
	Vector4[] visibleLightColors = new Vector4[maxVisibleLights];
	Vector4[] visibleLightDirections = new Vector4[maxVisibleLights];
           

the arrays can be copied to the gpu by invoking the SetGlobalVectorArray method on a command buffer, and then executing it. as we already have cameraBuffer, let us use that buffer, at the same moment that we begin the Render Camera sample.

cameraBuffer.BeginSample("Render Camera");
		cameraBuffer.SetGlobalVectorArray(
			visibleLightColorsId, visibleLightColors
		);
		cameraBuffer.SetGlobalVectorArray(
			visibleLightDirectionsId, visibleLightDirections
		);
		context.ExecuteCommandBuffer(cameraBuffer);
		cameraBuffer.Clear();
           

2.3 configuring the lights

we are now sending data to the gpu each frame, but it is still the default data, so the objects remain black. we have to configure the lights before copying the vectors. let us delegate that responsibility to a new ConfigureLights method.

cameraBuffer.ClearRenderTarget(
			(clearFlags & CameraClearFlags.Depth) != 0,
			(clearFlags & CameraClearFlags.Color) != 0,
			camera.backgroundColor
		);

		ConfigureLights();

		cameraBuffer.BeginSample("Render Camera");
           

during culling, unity also figures out which lights are visible. this information is made available via a visibleLights list that is part of the cull results. the list’s elements are VisibleLight structs that contain all the data that we need. create the required ConfigureLights method and have it loop through the list.

void ConfigureLights () {
		for (int i = 0; i < cull.visibleLights.Count; i++) {
			VisibleLight light = cull.visibleLights[i];
		}
	}
           

the VisibleLight.finalColor field holds the light’s color. it is the light’s color multiplied by its intensity, and also converted to the correct color space. so we can directly copy it to visibleLightColors, at the same index.

VisibleLight light = cull.visibleLights[i];
visibleLightColors[i] = light.finalColor;
           

however, by default unity considers the light’s intensity to be defined in gamma space, even though we are working with linear space. this is a holdover of unity’s default render pipeline; the new pipelines consider it a linear value. this behavior is controlled via the boolean GraphicsSettings.lightsUseLinearIntensity property. it is a project setting, but can only be adjusted via code. we only need to set it once, so let us do that in the constructor method of MyPipeline.

public MyPipeline (bool dynamicBatching, bool instancing) {
		GraphicsSettings.lightsUseLinearIntensity = true;
		…
	}
           

changing this setting only affects the editor when it re-applies its graphics settings, which does not automatically happen. entering and exiting play mode will apply it.

besides that, the direction of a directional light is determined by its rotation. the light shines along its local Z axis. we can find this vector in world space via the VisibleLight.localToWorld matrix field. the third column of that matrix defines the transformed local Z direction vector, which we can get via the Matrix4x4.GetColumn method, with index 2 as an argument.

that gives us the direction in which the light is shining, but in the shader we use the direction from the surface toward the light source. so we have to negate the vector before we assign it to visibleLightDirections. as the fourth component of a direction vector is always zero, we only have to negate X, Y and Z.

VisibleLight light = cull.visibleLights[i];
			visibleLightColors[i] = light.finalColor;
			Vector4 v = light.localToWorld.GetColumn(2);
			v.x = -v.x;
			v.y = -v.y;
			v.z = -v.z;
			visibleLightDirections[i] = v;
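the column trick can be sketched in plain python (a hypothetical rotation matrix, not Unity’s API): for a light pitched 90 degrees about X, the third column of its local-to-world rotation is its local Z in world space, and negating it yields the vector pointing back toward the light.

```python
import math

# local-to-world rotation for a light rotated 90 degrees about X,
# stored row-major: column 2 is the transformed local Z axis
angle = math.radians(90.0)
c, s = math.cos(angle), math.sin(angle)
local_to_world = [
    [1.0, 0.0, 0.0],
    [0.0, c,  -s ],
    [0.0, s,   c ],
]

shine_direction = [row[2] for row in local_to_world]  # the light's forward axis
to_light = [-x for x in shine_direction]              # from surface toward the light
# the light now shines straight down, so to_light points straight up
```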
           

our objects are now shaded with the color and direction of the main directional light, assuming you have no other lights in the scene. if you do not have a light source in the scene, just add a single directional light.

but our shader always calculates the lighting contribution of four lights, even if we only have a single light in the scene. so you could add three more directional lights and it would not slow down the gpu.

you can inspect the light data that is sent to the gpu via the frame debugger. select one of the draw calls that uses our shader, then expand the vector arrays to see their contents.

2.4 varying the number of lights

everything works as expected when using exactly four directional lights. we can even have more, as long as only four are visible at the same time. but when there are more than four visible lights our pipeline fails with an index-out-of-bounds exception. we can only support up to four visible lights, but unity does not take that into consideration while culling. so visibleLights can end up with more elements than our arrays. we have to abort the loop when we exceed the maximum. that means that we simply ignore some of the visible lights.

for (int i = 0; i < cull.visibleLights.Count; i++) {
			if (i == maxVisibleLights) {
				break;
			}
			VisibleLight light = cull.visibleLights[i];
			…
		}
           

which lights get omitted?

we simply skip the last lights in the visibleLights list. the lights are ordered based on various criteria, including light type, intensity, and whether they have shadows enabled. you can assume that the lights are ordered from most to least important. for example, the directional light with the highest intensity and shadows enabled will be the first element.

another weird thing happens when the number of visible lights decreases. the removed lights keep affecting objects, because we do not reset their data. we can solve that by continuing to loop through our arrays after finishing the visible lights, clearing the color of all lights that are not used.

int i = 0;
		for (; i < cull.visibleLights.Count; i++) {
			…
		}
		for (; i < maxVisibleLights; i++) {
			visibleLightColors[i] = Color.clear;
		}
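the fill-then-clear pattern can be sketched outside unity (hypothetical list types standing in for the cull results and vector arrays):

```python
MAX_VISIBLE_LIGHTS = 4
CLEAR = (0.0, 0.0, 0.0, 0.0)  # stand-in for Color.clear

def configure_lights(visible_lights, light_colors):
    # copy at most MAX_VISIBLE_LIGHTS entries; extra visible lights are ignored
    count = min(len(visible_lights), MAX_VISIBLE_LIGHTS)
    for i in range(count):
        light_colors[i] = visible_lights[i]
    # clear the slots no longer backed by a visible light
    for i in range(count, MAX_VISIBLE_LIGHTS):
        light_colors[i] = CLEAR

light_colors = [(1.0, 1.0, 1.0, 1.0)] * MAX_VISIBLE_LIGHTS
configure_lights([(1.0, 0.0, 0.0, 1.0)], light_colors)  # only one light remains
```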
           

in other words, we clear the colors of lights that are no longer visible.

3 point lights

we currently only support directional lights, but typically a scene has only a single directional light plus additional point lights. while we can add point lights to the scene, they are currently interpreted as directional lights. we are going to fix that now.

rendering 5, multiple lights describes point lights and spotlights, but uses the old approach of unity’s default pipeline. we are going to use the same approach as the lightweight pipeline.

3.1 light position

unlike a directional light, the position of a point light matters. rather than adding a separate array for positions, we will store both direction and position data in the same array, each element containing either a direction or a position. rename the variables in MyPipeline accordingly.

static int visibleLightColorsId =
		Shader.PropertyToID("_VisibleLightColors");
	static int visibleLightDirectionsOrPositionsId =
		Shader.PropertyToID("_VisibleLightDirectionsOrPositions");

	Vector4[] visibleLightColors = new Vector4[maxVisibleLights];
	Vector4[] visibleLightDirectionsOrPositions = new Vector4[maxVisibleLights];
           

ConfigureLights can use VisibleLight.lightType to check the type of each light. in case of a directional light, storing the direction is correct. otherwise, store the light’s world position instead, which can be extracted from the fourth column of its local-to-world matrix.

if (light.lightType == LightType.Directional) {
				Vector4 v = light.localToWorld.GetColumn(2);
				v.x = -v.x;
				v.y = -v.y;
				v.z = -v.z;
				visibleLightDirectionsOrPositions[i] = v;
			}
			else {
				visibleLightDirectionsOrPositions[i] =
					light.localToWorld.GetColumn(3);
			}
           

rename the array in the shader too. in DiffuseLight, begin by assuming that we are still always dealing with a directional light.

CBUFFER_START(_LightBuffer)
	float4 _VisibleLightColors[MAX_VISIBLE_LIGHTS];
	float4 _VisibleLightDirectionsOrPositions[MAX_VISIBLE_LIGHTS];
CBUFFER_END

float3 DiffuseLight (int index, float3 normal) {
	float3 lightColor = _VisibleLightColors[index].rgb;
	float4 lightPositionOrDirection = _VisibleLightDirectionsOrPositions[index];
	float3 lightDirection = lightPositionOrDirection.xyz;
	float diffuse = saturate(dot(normal, lightDirection));
	return diffuse * lightColor;
}
           

but if we are dealing with a point light, we have to calculate the light direction ourselves. first, we subtract the surface position from the light position, which requires us to add an additional parameter to the function. that gives us the light vector, in world space, which we turn into a direction by normalizing it.

float3 DiffuseLight (int index, float3 normal, float3 worldPos) {
	float3 lightColor = _VisibleLightColors[index].rgb;
	float4 lightPositionOrDirection = _VisibleLightDirectionsOrPositions[index];
	float3 lightVector =
		lightPositionOrDirection.xyz - worldPos;
	float3 lightDirection = normalize(lightVector);
	float diffuse = saturate(dot(normal, lightDirection));
	return diffuse * lightColor;
}
           

that works for point lights, but is nonsensical for directional lights. we can support both with the same calculation, by multiplying the world position with the W component of the light’s direction or position vector. if it is a position vector, then W is 1 and the calculation is unchanged. but if it is a direction vector, then W is 0 and the subtraction is eliminated. so we end up normalizing the original direction vector, which makes no difference. it does introduce an unneeded normalization for directional lights, but branching to avoid that is not worth it.
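the shared calculation can be verified with a quick numerical sketch (hypothetical helper names, not shader code): with W = 1 the subtraction happens, with W = 0 it is eliminated.

```python
import math

def light_vector(pos_or_dir, world_pos):
    # pos_or_dir is (x, y, z, w): w == 1 for a position, w == 0 for a direction
    x, y, z, w = pos_or_dir
    return (x - world_pos[0] * w,
            y - world_pos[1] * w,
            z - world_pos[2] * w)

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

surface = (3.0, 0.0, 4.0)

# point light at the origin: w = 1, so the subtraction happens
point = light_vector((0.0, 0.0, 0.0, 1.0), surface)
# directional light pointing up: w = 0, so the world position drops out
directional = light_vector((0.0, 1.0, 0.0, 0.0), surface)
```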

to make this work we need to know the fragment’s world-space position in LitPassFragment. we already have it in LitPassVertex, so add it as an additional output and pass it along.

struct VertexOutput {
	float4 clipPos : SV_POSITION;
	float3 normal : TEXCOORD0;
	float3 worldPos : TEXCOORD1;
	UNITY_VERTEX_INPUT_INSTANCE_ID
};

VertexOutput LitPassVertex (VertexInput input) {
	…
	output.worldPos = worldPos.xyz;
	return output;
}

float4 LitPassFragment (VertexOutput input) : SV_TARGET {
	…
	
	float3 diffuseLight = 0;
	for (int i = 0; i < MAX_VISIBLE_LIGHTS; i++) {
		diffuseLight += DiffuseLight(i, input.normal, input.worldPos);
	}
	float3 color = diffuseLight * albedo;
	return float4(color, 1);
}
           

3.2 distance attenuation

except for directional lights – which are assumed to be infinitely far away – the intensity of a light decreases with distance. the relation is i/d^2, where i is the light’s stated intensity and d is the distance between the light source and the surface. this is known as the inverse-square law. so we have to divide the final diffuse contribution by the squared length of the light vector. to avoid a division by zero, we enforce a tiny minimum for the square distance used.

float diffuse = saturate(dot(normal, lightDirection));
	
	float distanceSqr = max(dot(lightVector, lightVector), 0.00001);
	diffuse /= distanceSqr;
	
	return diffuse * lightColor;
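the inverse-square law is easy to check numerically (an illustrative helper, not the shader code): doubling the distance quarters the intensity.

```python
def attenuated_diffuse(diffuse, light_vector):
    # divide by the squared distance, with a tiny floor to avoid division by zero
    distance_sqr = max(sum(c * c for c in light_vector), 0.00001)
    return diffuse / distance_sqr

near = attenuated_diffuse(1.0, (0.0, 0.0, 2.0))  # d = 2
far = attenuated_diffuse(1.0, (0.0, 0.0, 4.0))   # d = 4, twice as far
```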
           

does that not increase the intensity very close to point lights?

indeed, when d is less than 1 a light’s intensity goes up. when d approaches its minimum the intensity becomes enormous.

unity’s default pipeline uses i/(1+d^2) to avoid increasing the brightness, but it is less realistic and produces results that are too dark close to the lights. the lightweight pipeline initially used the same falloff, but starting with version 3.3.0 it uses the correct square falloff.


as the light vector is the same as the direction vector for directional lights, the square distance ends up as 1. that means that directional lights are not affected by distance attenuation, which is correct.

3.3 light range

point lights also have a configured range, which limits their area of influence. nothing outside this range is affected by the light, even though in reality its illumination would extend further. this is not realistic, but allows better control of lighting and limits how many objects are affected by the light. without this range limit, every light would always be considered visible.

the range limit is not a sudden cutoff. instead, the light’s intensity is smoothly faded out, based on the square distance. the lightweight pipeline and lightmapper use

(1 - (d^2 / r^2)^2)^2

where r is the light’s range. we will use the same fade curve.

the light ranges are part of the scene data, so we have to send them to the gpu, per light. we will use another array for this attenuation data. while a float array would suffice, we will once again use a vector array, as we will need to include more data later.

static int visibleLightColorsId =
		Shader.PropertyToID("_VisibleLightColors");
	static int visibleLightDirectionsOrPositionsId =
		Shader.PropertyToID("_VisibleLightDirectionsOrPositions");
	static int visibleLightAttenuationsId =
		Shader.PropertyToID("_VisibleLightAttenuations");

	Vector4[] visibleLightColors = new Vector4[maxVisibleLights];
	Vector4[] visibleLightDirectionsOrPositions = new Vector4[maxVisibleLights];
	Vector4[] visibleLightAttenuations = new Vector4[maxVisibleLights];
           

also copy the new array to the gpu in render.

cameraBuffer.SetGlobalVectorArray(
			visibleLightDirectionsOrPositionsId, visibleLightDirectionsOrPositions
		);
		cameraBuffer.SetGlobalVectorArray(
			visibleLightAttenuationsId, visibleLightAttenuations
		);
           

and fill it in ConfigureLights. directional lights have no range limit, so they can use the zero vector. in the case of point lights, we put their range in the X component of the vector. but rather than store the range directly, we reduce the work that the shader has to do by storing 1/r^2, while also guarding against a division by zero.

Vector4 attenuation = Vector4.zero;

			if (light.lightType == LightType.Directional) {
				…
			}
			else {
				visibleLightDirectionsOrPositions[i] =
					light.localToWorld.GetColumn(3);
				attenuation.x = 1f /
					Mathf.Max(light.range * light.range, 0.00001f);
			}
			
			visibleLightAttenuations[i] = attenuation;
           

add the new array to the shader, calculate the fading caused by range, and factor that into the final diffuse contribution.

CBUFFER_START(_LightBuffer)
	float4 _VisibleLightColors[MAX_VISIBLE_LIGHTS];
	float4 _VisibleLightDirectionsOrPositions[MAX_VISIBLE_LIGHTS];
	float4 _VisibleLightAttenuations[MAX_VISIBLE_LIGHTS];
CBUFFER_END

float3 DiffuseLight (int index, float3 normal, float3 worldPos) {
	float3 lightColor = _VisibleLightColors[index].rgb;
	float4 lightPositionOrDirection = _VisibleLightDirectionsOrPositions[index];
	float4 lightAttenuation = _VisibleLightAttenuations[index];
	
	float3 lightVector =
		lightPositionOrDirection.xyz - worldPos * lightPositionOrDirection.w;
	float3 lightDirection = normalize(lightVector);
	float diffuse = saturate(dot(normal, lightDirection));
	
	float rangeFade = dot(lightVector, lightVector) * lightAttenuation.x;
	rangeFade = saturate(1.0 - rangeFade * rangeFade);
	rangeFade *= rangeFade;
	
	float distanceSqr = max(dot(lightVector, lightVector), 0.00001);
	diffuse *= rangeFade / distanceSqr;
	
	return diffuse * lightColor;
}
           

once again directional lights are not affected, because in their case lightAttenuation.x is always 0, thus rangeFade is always 1.
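the fade curve behaves as intended at the boundaries, which a quick sketch confirms (hypothetical function names; lightAttenuation.x corresponds to 1/r²):

```python
def saturate(x):
    return min(max(x, 0.0), 1.0)

def range_fade(distance_sqr, one_over_range_sqr):
    # (1 - (d^2 / r^2)^2)^2, the curve used by the lightweight pipeline
    fade = distance_sqr * one_over_range_sqr
    fade = saturate(1.0 - fade * fade)
    return fade * fade

r = 10.0
attenuation_x = 1.0 / (r * r)

at_center = range_fade(0.0, attenuation_x)    # no fading at the light itself
at_range = range_fade(r * r, attenuation_x)   # fully faded at the range limit
directional = range_fade(1.0, 0.0)            # directional lights use x = 0
```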

4 spotlights

the lightweight pipeline also supports spotlights, so we will add them too. spotlights work like point lights, but are restricted to a cone instead of shining in all directions.

4.1 spot direction

like a directional light, a spotlight shines along its local Z axis, but in a cone. and it also has a position, which means that we have to provide both for spotlights. so add an additional array for the spot direction to MyPipeline.

static int visibleLightAttenuationsId =
		Shader.PropertyToID("_VisibleLightAttenuations");
	static int visibleLightSpotDirectionsId =
		Shader.PropertyToID("_VisibleLightSpotDirections");

	Vector4[] visibleLightColors = new Vector4[maxVisibleLights];
	Vector4[] visibleLightDirectionsOrPositions = new Vector4[maxVisibleLights];
	Vector4[] visibleLightAttenuations = new Vector4[maxVisibleLights];
	Vector4[] visibleLightSpotDirections = new Vector4[maxVisibleLights];
	
	…
	
	void Render (ScriptableRenderContext context, Camera camera) {
		…
			cameraBuffer.SetGlobalVectorArray(
			visibleLightAttenuationsId, visibleLightAttenuations
		);
		cameraBuffer.SetGlobalVectorArray(
			visibleLightSpotDirectionsId, visibleLightSpotDirections
		);
		…
	}
           

in ConfigureLights, when not dealing with a directional light, also check whether the light is a spotlight. if so, set up the direction vector, just like for a directional light, but assign it to visibleLightSpotDirections instead.

if (light.lightType == LightType.Directional) {
				Vector4 v = light.localToWorld.GetColumn(2);
				v.x = -v.x;
				v.y = -v.y;
				v.z = -v.z;
				visibleLightDirectionsOrPositions[i] = v;
			}
			else {
				visibleLightDirectionsOrPositions[i] =
					light.localToWorld.GetColumn(3);
				attenuation.x = 1f /
					Mathf.Max(light.range * light.range, 0.00001f);

				if (light.lightType == LightType.Spot) {
					Vector4 v = light.localToWorld.GetColumn(2);
					v.x = -v.x;
					v.y = -v.y;
					v.z = -v.z;
					visibleLightSpotDirections[i] = v;
				}
			}
           

add the new data to the shader too.

CBUFFER_START(_LightBuffer)
	float4 _VisibleLightColors[MAX_VISIBLE_LIGHTS];
	float4 _VisibleLightDirectionsOrPositions[MAX_VISIBLE_LIGHTS];
	float4 _VisibleLightAttenuations[MAX_VISIBLE_LIGHTS];
	float4 _VisibleLightSpotDirections[MAX_VISIBLE_LIGHTS];
CBUFFER_END

float3 DiffuseLight (int index, float3 normal, float3 worldPos) {
	float3 lightColor = _VisibleLightColors[index].rgb;
	float4 lightPositionOrDirection = _VisibleLightDirectionsOrPositions[index];
	float4 lightAttenuation = _VisibleLightAttenuations[index];
	float3 spotDirection = _VisibleLightSpotDirections[index].xyz;
	
	…
}
           

4.2 angle falloff

the cone of a spotlight is specified with a positive angle that is less than 180 degrees. we can determine whether a surface point lies within the cone by taking the dot product of the spot’s direction and the light direction. if the result is at least the cosine of half the configured spot angle, then the fragment is affected by the light.

there is not an instant cutoff at the edge of the cone. instead, there is a transition range in which the light fades out. this range can be defined by an inner spot angle where the fading begins and an outer spot angle where the light intensity reaches zero. however, unity’s spotlight only allows us to set the outer angle. unity’s default pipeline uses a light cookie to determine the falloff, while the lightweight pipeline computes the falloff with a smooth function that assumes a fixed relationship between the inner and outer angles.

to determine the falloff, begin by converting half the spot’s angle from degrees to radians, then compute its cosine. the configured angle is made available via VisibleLight.spotAngle.

if (light.lightType == LightType.Spot) {
					…
					
					float outerRad = Mathf.Deg2Rad * 0.5f * light.spotAngle;
					float outerCos = Mathf.Cos(outerRad);
				}
           

the lightweight pipeline and the lightmapper define the inner angle with the relationship

tan(ri) = 46/64 tan(ro)

where ri and ro are half the inner and outer spot angles in radians. we need to use the cosine of the inner angle, so the complete relationship is:

cos(ri) = cos(arctan(46/64 tan(ro)))
float outerRad = Mathf.Deg2Rad * 0.5f * light.spotAngle;
float outerCos = Mathf.Cos(outerRad);
float outerTan = Mathf.Tan(outerRad);
float innerCos =
	Mathf.Cos(Mathf.Atan((46f / 64f) * outerTan));
           

the angle-based falloff is defined as

(Ds · Dl - cos(ro)) / (cos(ri) - cos(ro))

clamped to 0-1 and then squared, with Ds · Dl being the dot product of the spot direction and light direction.

the expression can be simplified to

(Ds · Dl) a + b

with

a = 1 / (cos(ri) - cos(ro))

and

b = -cos(ro) a

that allows us to compute a and b in ConfigureLights and store them in the last two components of the attenuation data vector.

float outerRad = Mathf.Deg2Rad * 0.5f * light.spotAngle;
float outerCos = Mathf.Cos(outerRad);
float outerTan = Mathf.Tan(outerRad);
float innerCos =
Mathf.Cos(Mathf.Atan(((64f - 18f) / 64f) * outerTan));
float angleRange = Mathf.Max(innerCos - outerCos, 0.001f);
attenuation.z = 1f / angleRange;
attenuation.w = -outerCos * attenuation.z;
           

in the shader, the spot fade factor can then be computed with a dot product, multiplication, addition, saturation, and finally squaring. then use the result to modulate the diffuse light.

float rangeFade = dot(lightVector, lightVector) * lightAttenuation.x;
rangeFade = saturate(1.0 - rangeFade * rangeFade);
rangeFade *= rangeFade;

float spotFade = dot(spotDirection, lightDirection);
spotFade = saturate(spotFade * lightAttenuation.z + lightAttenuation.w);
spotFade *= spotFade;

float distanceSqr = max(dot(lightVector, lightVector), 0.00001);
diffuse *= spotFade * rangeFade / distanceSqr;
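the a and b terms and the resulting spot fade can be checked with a small sketch (illustrative names; the 46/64 relation matches the code above):

```python
import math

def saturate(x):
    return min(max(x, 0.0), 1.0)

def spot_attenuation(spot_angle_degrees):
    # a = 1 / (cos(ri) - cos(ro)), b = -cos(ro) * a, mirroring ConfigureLights
    outer_rad = math.radians(0.5 * spot_angle_degrees)
    outer_cos = math.cos(outer_rad)
    outer_tan = math.tan(outer_rad)
    inner_cos = math.cos(math.atan((64.0 - 18.0) / 64.0 * outer_tan))
    angle_range = max(inner_cos - outer_cos, 0.001)
    a = 1.0 / angle_range
    b = -outer_cos * a
    return a, b

def spot_fade(dot_spot_light, a, b):
    # dot product, multiply-add, saturate, square - as in the shader
    fade = saturate(dot_spot_light * a + b)
    return fade * fade

a, b = spot_attenuation(60.0)  # a 60-degree spot cone

on_axis = spot_fade(1.0, a, b)                           # inside the inner cone
at_edge = spot_fade(math.cos(math.radians(30.0)), a, b)  # at the outer cone edge
```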
           

to keep the spot fade calculation from affecting the other light types, set the W component of their attenuation vector to 1.

Vector4 attenuation = Vector4.zero;
attenuation.w = 1f;
           

what about area lights?

unity’s lightweight and default pipelines do not support realtime area lights, and neither will we. area lights are only used for lightmapping, which we will support later.