The classical algorithm for multiple-precision division normalizes digits during each step and sometimes makes correction steps when the initial guess for the quotient digit turns out to be wrong. A method is presented that runs faster by skipping most of the intermediate normalization and by recovering from wrong guesses without separate correction steps.
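The classical scheme the abstract contrasts against is schoolbook base-B division: at each step a trial quotient digit is guessed from the leading digits of the partial remainder and the divisor, then corrected downward when the guess overshoots. The following is a minimal illustrative sketch of that classical trial-and-correction loop, not the paper's improved method; the base, digit representation, and helper names are my own assumptions.

```python
B = 10  # tiny base for readability; real implementations use a word-sized base

def from_digits(d):
    # digit list (most significant first) -> Python int, for checking results
    n = 0
    for x in d:
        n = n * B + x
    return n

def mul_small(v, q):
    # digit list times a single digit, returned as a digit list
    out, carry = [], 0
    for x in reversed(v):
        t = x * q + carry
        out.append(t % B)
        carry = t // B
    if carry:
        out.append(carry)
    return out[::-1]

def cmp_digits(a, b):
    # compare two digit lists, ignoring leading zeros: -1, 0, or 1
    a = a[next((i for i, x in enumerate(a) if x), len(a) - 1):]
    b = b[next((i for i, x in enumerate(b) if x), len(b) - 1):]
    if len(a) != len(b):
        return -1 if len(a) < len(b) else 1
    return (a > b) - (a < b)

def sub_digits(a, b):
    # a - b for digit lists with a >= b; result padded to len(a)
    b = [0] * (len(a) - len(b)) + b
    out, borrow = [], 0
    for x, y in zip(reversed(a), reversed(b)):
        t = x - y - borrow
        borrow = t < 0
        out.append(t + B if borrow else t)
    return out[::-1]

def classical_divide(u, v):
    # schoolbook division of digit list u by digit list v (v[0] != 0);
    # returns (quotient, remainder) as digit lists
    n = len(v)
    q, r = [], [0]
    for d in u:
        r = r + [d]                     # "bring down" the next dividend digit
        r = [0] * (n + 1 - len(r)) + r  # pad the partial remainder to n+1 digits
        # trial digit from the top two digits of r and the top digit of v;
        # this estimate never undershoots the true quotient digit
        qhat = min(B - 1, (r[0] * B + r[1]) // v[0])
        # correction step: back the guess off while it is too large
        while cmp_digits(mul_small(v, qhat), r) > 0:
            qhat -= 1
        q.append(qhat)
        r = sub_digits(r, mul_small(v, qhat))[-n:]  # remainder < v fits in n digits
    while len(q) > 1 and q[0] == 0:     # strip leading zeros
        q.pop(0)
    while len(r) > 1 and r[0] == 0:
        r.pop(0)
    return q, r
```

With an unnormalized divisor the trial digit can overshoot by more than the one or two units Knuth's analysis guarantees for a normalized one, which is why the classical algorithm normalizes at each step; skipping that work while still absorbing wrong guesses cheaply is the point of the paper's method.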
First published in 1996 in Mathematics of Computation by the American Mathematical Society.
Smith, D. M. A Multiple-Precision Division Algorithm. Mathematics of Computation, vol. 65 (1996), pp. 157–163.