Mean absolute error is differentiable almost everywhere. Objectives that are not differentiable everywhere, but are differentiable almost everywhere, are very common - in a deep net, if you have rectified linear activations (very common) or L1 regularisation (not unheard of), you have an objective that is not differentiable everywhere ... but gradient-based methods still work, because the non-differentiable points form a measure-zero set that you essentially never land on exactly, and at those points you can simply pick any subgradient.
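
As a minimal sketch of that point (using NumPy, with the data and parameter names chosen purely for illustration): fit a linear model under mean absolute error by (sub)gradient descent. At the kink of |r| at r = 0 we use np.sign, which returns 0 there - a valid subgradient - and the optimisation still converges.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3*x + 1 plus heavy-tailed noise (illustrative only).
x = rng.uniform(-1.0, 1.0, size=200)
y = 3.0 * x + 1.0 + 0.1 * rng.standard_cauchy(size=200)

w, b = 0.0, 0.0   # parameters of the linear model
lr = 0.05         # step size

for step in range(2000):
    r = w * x + b - y          # residuals
    # MAE = mean(|r|); its (sub)gradient w.r.t. r is sign(r),
    # with sign(0) = 0 chosen at the non-differentiable point.
    g = np.sign(r)
    w -= lr * np.mean(g * x)   # chain rule through r = w*x + b - y
    b -= lr * np.mean(g)

print(f"w ~ {w:.2f}, b ~ {b:.2f}")  # close to (3, 1) despite the kink in |.|
```

The same reasoning is why autodiff frameworks happily return a gradient for ReLU or an L1 penalty: they just adopt a convention (usually 0) at the kink and carry on.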