Hi, I wonder if someone could help. I am looking for a mathematical procedure to adjust the calibration of the OSCAR-11 magnetometer so that the mean square of the error between the measured field and the theoretical field is a minimum. I would really like to be able to do this on a spreadsheet.

I have a table of about 3000 values of the theoretical field BT, and corresponding measurements. The measurements consist of two values, N and BH. The relationship between the measured field BM and the measurements is:

    BM = SQRT(BZ*BZ + I*BH*BH)
    BZ = K*N + J

I have initial values of I, J, and K which give a fairly good fit. I believe that there may be an iterative procedure which enables optimum values of I, J, and K to be calculated. I think the approach may be to obtain some linear equations by partial differentiation, and to solve these by substituting statistical values obtained from the table of data.

Any help would be greatly appreciated.

--
73 Clive G3CWV
Hitchin, North Hertfordshire, UK.
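For what it's worth, the procedure described above (linearise the model by partial differentiation, solve the resulting linear "normal" equations for a correction, and repeat) is essentially the Gauss-Newton method for nonlinear least squares. Below is a minimal sketch in Python rather than a spreadsheet, using synthetic data in place of the real table of 3000 values; the function name, starting values, and data ranges are my own assumptions, not part of the original post.

```python
import numpy as np

def fit_magnetometer(N, BH, BT, I0, J0, K0, iters=100):
    """Gauss-Newton fit of the calibration constants I, J, K.

    Model: BZ = K*N + J,  BM = sqrt(BZ**2 + I*BH**2)
    Minimises sum((BM - BT)**2) starting from the guess (I0, J0, K0).
    """
    p = np.array([I0, J0, K0], dtype=float)
    for _ in range(iters):
        I, J, K = p
        BZ = K * N + J
        BM = np.sqrt(BZ**2 + I * BH**2)
        r = BM - BT                      # residuals vs. theoretical field
        # Partial derivatives of BM with respect to I, J, K
        dI = BH**2 / (2.0 * BM)
        dJ = BZ / BM
        dK = N * BZ / BM
        Jac = np.column_stack([dI, dJ, dK])
        # Linear "normal equations": (Jac^T Jac) delta = -Jac^T r
        delta = np.linalg.solve(Jac.T @ Jac, -Jac.T @ r)
        p += delta
        if np.max(np.abs(delta)) < 1e-12:  # converged
            break
    return p

# Synthetic check: recover known constants from noiseless data
rng = np.random.default_rng(0)
N = rng.uniform(100.0, 1000.0, 3000)
BH = rng.uniform(5.0, 50.0, 3000)
I_true, J_true, K_true = 1.2, -3.0, 0.05
BT = np.sqrt((K_true * N + J_true)**2 + I_true * BH**2)

# Start from a deliberately imperfect initial guess
I, J, K = fit_magnetometer(N, BH, BT, 1.0, 0.0, 0.04)
```

In a spreadsheet, the same iteration amounts to four helper columns (BZ, BM, the residual, and the three partial-derivative columns), a 3x3 system built from their sums of products, and repeated substitution of the corrected I, J, K back into the columns until the corrections become negligible.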