"Parts of your bridge would be both there and not there until you tried to cross it, what decides what stabilizes it to either value is not really understood."
Actually, existence and non-existence are not the kind of quantum states that can be superposed. A state is essentially one realization of the list of all the properties that can be measured. Every atom of a real bridge contributes microscopic properties to the whole, and you can imagine how irrelevant most of these properties are. When a real bridge is in a superposed state, it could mean, for instance, that there is no way to predict whether the spin of one single electron in the whole bridge is clockwise or counter-clockwise. So the bridge spends most of its superposition time between very similar states.
The event which stabilises a superposed quantity is a measurement-like interaction, for instance a photon hitting the bridge. A large object undergoes these measurement-like interactions all the time, billions of times in a split second. After each measurement the superposition slowly starts to rebuild: the probability of the measured state decreases from one, and the probabilities of the other states gradually deviate from zero. But then another photon hits. So our bridges are stable because they just keep switching back and forth between these very similar quantum states, and they never get anywhere close to the states in which the structure fails.
You could think of a coin-tossing experiment: take a thousand pennies and see how much the number of tails deviates from 500 in each round. Most of the results will be within 3% of the average value. As the number of pennies grows, the relative deviation gets smaller and smaller, shrinking roughly as one over the square root of the number of coins. So the average doesn't go away, but the deviations cancel out. In quantum mechanics the averages are by design such that they follow classical mechanics. There is no indication that there is anything fundamental to macroscopic systems which does not follow from quantum theory.
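You can check the shrinking deviations with a quick simulation. This is a minimal sketch; the round count and seed are arbitrary choices, not part of the argument:

```python
import random

def relative_spread(n_coins, n_rounds=200, seed=0):
    """Flip n_coins pennies per round, n_rounds times, and return the
    average absolute deviation of the tails fraction from 1/2."""
    rng = random.Random(seed)
    devs = []
    for _ in range(n_rounds):
        tails = sum(rng.random() < 0.5 for _ in range(n_coins))
        devs.append(abs(tails / n_coins - 0.5))
    return sum(devs) / n_rounds

# The relative spread shrinks roughly as 1/sqrt(n_coins):
for n in (100, 1000, 10000):
    print(n, relative_spread(n))
```

With a thousand coins the typical relative deviation already sits well inside the 3% figure quoted above, and each tenfold increase in coins cuts it by roughly a factor of three.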
It is difficult to do justice to the way the averages come out of quantum mechanics without a more technical discussion: how a complex amplitude is associated with each possibility, how the amplitudes vary over time, how they cancel out, and how it follows that the most slowly varying amplitudes make the largest contribution to the probability distribution, corresponding to classical behaviour. I'm afraid that to get a good grasp of the technicalities one really has to consult a textbook, such as Volume III of the Feynman Lectures on Physics.
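The cancellation itself can be illustrated with a toy calculation (this is only a numerical sketch of the stationary-phase idea, not the actual derivation; the phase function and constants are made up for illustration). Each "possibility" x contributes a complex amplitude exp(i·φ(x)) with φ(x) = λx²; where the phase varies slowly the contributions add up, and where it varies rapidly they nearly cancel:

```python
import cmath

def partial_sum(xs, lam):
    """Sum the complex amplitudes exp(i * lam * x**2) over the points xs."""
    return sum(cmath.exp(1j * lam * x * x) for x in xs)

lam = 50.0
step = 0.001
# Region around the stationary point x = 0: the phase varies slowly.
near = [i * step for i in range(-500, 501)]
# Same number of points far from it: the phase spins rapidly.
far = [1.5 + i * step for i in range(1001)]

# The slowly varying amplitudes dominate; the rapidly varying ones
# almost completely cancel despite being equally many.
print(abs(partial_sum(near, lam)), abs(partial_sum(far, lam)))
```

The near-stationary region contributes a magnitude tens of times larger than the rapidly oscillating one, which is the mechanism by which the classical (least-action) behaviour dominates.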
"it doesn't sound like the quantum principles lead to classical mechanics by themselves. That is they have to be set to yield or agree with the right ones."
No, nothing has to be artificially set to yield or agree with results from another source. Classical mechanics really does follow from quantum theory, and if the latter had been discovered first, we would have been able to write down the former as an approximation. My point was that because we can deduce this from the general properties of quantum mechanics, it will also hold in any specific case we might ever encounter.