Hi Paul,
The easy answer is no, learning rate decay won't affect the weight decay. They're two different things, even though the "decay" terminology gets applied to both.
Weight decay = the summed square of all the weights, multiplied by a small decay coefficient (usually 0.01 or 0.001), which is then added to the loss. It's designed to penalize large weights in favor of smaller ones by increasing the loss when the weights are large.
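Just to make that concrete, here's a rough sketch of the idea in PyTorch-style Python (names like `model` and `task_loss` are placeholders, and in practice frameworks usually fold this into the optimizer rather than the loss you write yourself):

```python
import torch

decay = 0.01  # typical values are around 0.01 or 0.001

def loss_with_weight_decay(task_loss, model):
    # L2-style penalty: sum of squared weights across all parameters
    l2_penalty = sum((p ** 2).sum() for p in model.parameters())
    # larger weights -> larger penalty -> larger loss
    return task_loss + decay * l2_penalty
```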
Learning rate decay = the learning rate being gradually lowered over the course of training (usually on a schedule), designed to help the network settle into a minimum.
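For example, a simple step-style schedule might look like this (the numbers here are just illustrative, not a recommendation):

```python
base_lr = 0.1    # starting learning rate
gamma = 0.1      # multiply the lr by this factor at each step
step_size = 30   # how many epochs between drops

def lr_at_epoch(epoch):
    # lr drops by a factor of gamma every step_size epochs
    return base_lr * (gamma ** (epoch // step_size))

for epoch in (0, 29, 30, 60, 90):
    print(epoch, lr_at_epoch(epoch))
```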
The learning rate itself adjusts the weights: each pass, gradient * lr is subtracted from the weights.
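In other words, a plain gradient descent step is roughly this (a bare-bones sketch, ignoring momentum and everything else real optimizers add):

```python
def sgd_step(weights, grads, lr):
    # move each weight against its gradient, scaled by the learning rate
    return [w - lr * g for w, g in zip(weights, grads)]
```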
Anyway, hope this info is helpful!