A modeler of biochemical systems points out a paradox: as more factors are included in a climate model to make it more accurate, its uncertainty rises, rendering it less predictive.
... the need to balance accuracy with reliability. This paradox is not as strange as it seems. Typically when you build a model you include a lot of approximations that are supposed to make the modeling process easier; ideally you want a model to be as simple as possible and to contain as few parameters as possible. But this strategy does not always work, since sometimes it turns out that in your drive for simplicity you have left a crucial factor out. So now you include this crucial factor, only to find that the uncertainties in your model go through the roof. What’s happening in such unfortunate cases is that along with the signal from the previously excluded factor, you have inevitably included a large amount of noise. This noise typically results from incomplete knowledge of the factor, whether from calculation or from measurement. Modelers of every stripe thus have to tread a fine line between including as much of reality as possible and keeping the model accurate enough for quantitative explanation and prediction.
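The signal-versus-noise tradeoff described above can be made concrete with a toy Monte Carlo sketch. All the numbers and variable names here are made up for illustration, not drawn from any actual climate or biochemical model: a "simple" model omits a feedback term entirely, while a "full" model includes it as a parameter that is right on average but poorly constrained. Including the factor shifts the mean response (the signal) but also widens the spread of outputs (the noise).

```python
import random
import statistics

random.seed(0)

def simulate(include_feedback, n=10_000):
    """Propagate parameter uncertainty through a toy model by sampling.

    All parameter values are invented for illustration only.
    """
    outputs = []
    for _ in range(n):
        forcing = random.gauss(1.0, 0.05)       # well-measured factor: small uncertainty
        response = 2.0 * forcing                # simple model: response scales with forcing
        if include_feedback:
            # Previously excluded factor: correct on average (0.5),
            # but known only crudely (large standard deviation 0.4).
            feedback = random.gauss(0.5, 0.4)
            response += feedback
        outputs.append(response)
    return statistics.mean(outputs), statistics.stdev(outputs)

mean_simple, sd_simple = simulate(include_feedback=False)
mean_full, sd_full = simulate(include_feedback=True)

print(f"simple model: mean={mean_simple:.2f}, spread={sd_simple:.2f}")
print(f"full model:   mean={mean_full:.2f}, spread={sd_full:.2f}")
```

Running this, the full model's mean response shifts by roughly the feedback's true average, so it is less biased, but its output spread grows several-fold because the feedback's uncertainty dominates. That is the "uncertainties go through the roof" effect in miniature.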
While this conflict is real, I think that a system whose built-in feedbacks are almost invariably positive needs to include such processes even if they are not pinned down with fine quantitative precision. A model of the speed at which a molecule crosses a membrane is one thing; a model of planetary climate destabilization under anthropogenic forcing is a whole other animal.