How long should I measure a Faraday baseline?

Voltage measurements on Faraday collectors, like all potentials, are estimated relative to a reference value. Faraday measurements therefore have two parts: the reference (called a baseline) and the 'on-peak' measurement made while the ion beams are in the collectors.  Most TIMS baselines are measured between mass spectrum peaks while the sample is ionizing, meaning that the baseline and on-peak portions of the analysis must both fit into the finite time that the sample runs.  It is reasonable to ask: how much of the sample run time should optimally be devoted to baselines and how much to the on-peak portion of the analysis?
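In code, the correction itself is just a subtraction. Here is a minimal sketch of the quantity whose uncertainty we care about; the function and variable names are illustrative, not from any particular data-reduction package:

```python
def baseline_corrected(onpeak_volts, baseline_volts):
    """Baseline-corrected intensity for one Faraday collector:
    the mean on-peak voltage minus the mean baseline voltage."""
    return onpeak_volts - baseline_volts
```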

Imagine that you only have enough sample to run for a hundred seconds.  If you measure the baseline for only one second and the ion beams on-peak for the other ninety-nine, then you will have a precise on-peak measurement but an imprecisely measured baseline.  The uncertainty in the baseline-subtracted intensity is the quadrature sum of the baseline and on-peak uncertainties (the square root of the sum of their squares), so the short, noisy baseline would dominate and leave you with an imprecise baseline-corrected measurement.  On the other hand, measuring the baseline for ninety-nine seconds and on-peak for just one second would also produce an imprecise baseline-corrected measurement, this time because the on-peak term is poorly determined.  The best balance between baseline and on-peak measurement times must lie somewhere in between.
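To see the shape of this trade-off numerically, here is a small sketch. It assumes each measurement's uncertainty averages down as one over the square root of its integration time and, for simplicity, that the baseline and on-peak noise levels are equal; both assumptions, and all the names and numbers, are illustrative rather than taken from the analysis that follows.

```python
import numpy as np

def corrected_uncertainty(t_baseline, t_onpeak, sigma_1s=1.0):
    """Uncertainty of a baseline-corrected intensity, assuming each part's
    uncertainty falls as 1/sqrt(integration time) from a hypothetical
    one-second uncertainty sigma_1s (here the same for baseline and on-peak)."""
    sigma_baseline = sigma_1s / np.sqrt(t_baseline)
    sigma_onpeak = sigma_1s / np.sqrt(t_onpeak)
    # Baseline and on-peak uncertainties add in quadrature.
    return np.sqrt(sigma_baseline**2 + sigma_onpeak**2)

total = 100.0  # seconds of sample, as in the example above
for t_base in (1.0, 10.0, 25.0, 50.0, 75.0, 90.0, 99.0):
    sigma = corrected_uncertainty(t_base, total - t_base)
    print(f"baseline {t_base:5.1f} s, on-peak {total - t_base:5.1f} s "
          f"-> combined uncertainty {sigma:.3f}")
```

Under these simplified assumptions the minimum falls at an even split, but only because the two noise levels were set equal; the real optimum depends on how the baseline and on-peak uncertainties actually behave, which is what the model described next is for.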

To figure out where, I modeled the uncertainty in measuring a baseline-corrected