It has been difficult to use conditional statements as part of a neural network graph (e.g. if the input is $> x$, pass the input to network $N$), because branching conditions cannot be backpropagated through. The Linear Array of Conditions, TOpologies with Separated Error-backpropagation (LACTOSE) Algorithm addresses this issue and allows available machine learning layers to be used conditionally in supervised learning models. In this paper, the LACTOSE algorithm is applied to a simple DDSP use case; however, the main contribution is the development of the "if" conditional for use with DDSP. The LACTOSE algorithm stores trained parameters for each user-specified numerical range and loads the appropriate parameters dynamically during prediction.
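To make the parameter-storage idea concrete, the sketch below illustrates the general pattern of keeping one parameter set per user-specified numerical range and swapping in the matching set at prediction time, so the branch itself never needs to be backpropagated through. This is a minimal, framework-free illustration, not the authors' implementation; all names (`RangeBank`, `predict`, the linear-layer parameters) and the choice of NumPy are assumptions for exposition.

```python
# Minimal sketch of the range-conditioned parameter storage/loading idea
# (illustrative only; names and structure are assumptions, not the paper's code).
import numpy as np


class RangeBank:
    """Keeps one parameter set (here, a single linear layer) per numerical range."""

    def __init__(self, boundaries, in_dim, out_dim, rng=None):
        # boundaries like [0.0, 0.5, 1.0] define the ranges [0.0, 0.5) and [0.5, 1.0].
        rng = rng or np.random.default_rng(0)
        self.boundaries = boundaries
        self.params = [
            {"W": rng.normal(size=(in_dim, out_dim)) * 0.1,
             "b": np.zeros(out_dim)}
            for _ in range(len(boundaries) - 1)
        ]

    def _range_index(self, condition_value):
        # Select the stored parameter set whose range contains the condition value.
        idx = np.searchsorted(self.boundaries, condition_value, side="right") - 1
        return int(np.clip(idx, 0, len(self.params) - 1))

    def predict(self, x, condition_value):
        # Dynamically load the parameters for this range, then apply them.
        p = self.params[self._range_index(condition_value)]
        return x @ p["W"] + p["b"]


if __name__ == "__main__":
    bank = RangeBank(boundaries=[0.0, 0.5, 1.0], in_dim=4, out_dim=2)
    x = np.ones(4)
    # The same input is routed to different stored parameter sets
    # depending on where the conditioning value falls.
    print(bank.predict(x, condition_value=0.2))
    print(bank.predict(x, condition_value=0.8))
```

In this sketch each range's parameters are trained and stored separately, which is what allows the "if" branch to act purely as a lookup at prediction time rather than as a differentiable operation.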