Signal Noise

The Signal Noise parameter calculates the signal noise within a given time range of the chromatogram. The parameter is used for two main purposes: determining the sensitivity for peak detection (internal use) and calculating signal-to-noise-based quantification and detection limits (user-defined time range).

Signal noise calculation in Chromeleon is based on the ASTM method: Chromeleon fits a regression line (by the method of least squares) to all data points in the time range and then determines the maximum distance of the data points above and below this line. (When the data points are not equidistant, each point is weighted by its corresponding step width in the regression.) The signal noise equals the sum of the maximum distance above and the maximum distance below the line.
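The calculation above can be sketched as follows. This is a minimal illustration, not Chromeleon's implementation: it uses an unweighted least-squares fit and omits the step-width weighting applied to non-equidistant data.

```python
import numpy as np

def signal_noise(t, y):
    """ASTM-style noise for the points (t, y) in a chosen time range:
    fit a least-squares regression line, then add the largest distances
    of the data points above and below that line."""
    # Unweighted least-squares fit; Chromeleon additionally weights each
    # point by its step width when the sampling is not equidistant.
    slope, intercept = np.polyfit(t, y, 1)
    residuals = y - (slope * t + intercept)
    # Largest excursion above the line plus largest excursion below it.
    return residuals.max() - residuals.min()

# Four points oscillating around a slight trend:
print(signal_noise(np.array([0.0, 1.0, 2.0, 3.0]),
                   np.array([0.0, 1.0, 0.0, 1.0])))  # ~1.2
```

Note that because the residuals of a least-squares line always have both positive and negative values (unless the fit is perfect), the result is the peak-to-peak spread of the data around the fitted baseline.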

Determining the Signal Noise for Internal Use (Default)

When no parameters are provided, Chromeleon calculates the signal noise in the first and in the last 10 seconds of the chromatogram, compares these values with an internal value, and returns the lowest of them as the signal noise. If the sensitivity is not explicitly set on the Detection tab of the QNT Editor, this value is then used as the sensitivity for integration (peak detection).
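A sketch of this default selection, under the assumptions stated in the comments (the comparison with Chromeleon's internal value is omitted, and the window handling is simplified):

```python
import numpy as np

def astm_noise(t, y):
    # Least-squares line; noise = max distance above + max distance below.
    slope, intercept = np.polyfit(t, y, 1)
    r = y - (slope * t + intercept)
    return r.max() - r.min()

def default_signal_noise(t, y, window=10.0):
    """Sketch of the default behaviour: evaluate the noise in the first
    and in the last `window` seconds of the chromatogram and return the
    smaller value. (The additional comparison with Chromeleon's internal
    value is omitted here.)"""
    first = t <= t[0] + window    # first 10 s
    last = t >= t[-1] - window    # last 10 s
    return min(astm_noise(t[first], y[first]),
               astm_noise(t[last], y[last]))
```

Taking the smaller of the two windows makes the estimate robust against a peak or disturbance that happens to fall at one end of the chromatogram.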

 Caution:

To calculate quantification and detection limits via the signal-to-noise ratio, always use the signal noise with a user-defined time range (see below). If you use the default calculation, the quantification and detection limits might not be calculated correctly.
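As an illustration of why the noise value feeds into these limits, the sketch below uses the widely used S/N = 3 (detection) and S/N = 10 (quantification) conventions. These conventions, the linear-response assumption, and the function name are assumptions for this example, not Chromeleon's documented formulas.

```python
def limits_from_noise(peak_height, amount, noise,
                      sn_detection=3.0, sn_quantification=10.0):
    """Illustration only (not Chromeleon's documented formula): estimate
    detection and quantification limits from one reference peak, assuming
    peak height scales linearly with the injected amount and applying the
    common S/N = 3 and S/N = 10 conventions."""
    response = peak_height / amount             # height per unit amount
    lod = sn_detection * noise / response       # amount giving S/N = 3
    loq = sn_quantification * noise / response  # amount giving S/N = 10
    return lod, loq

# A 100-unit-high peak from a 10 ng injection, with noise of 1 unit:
# limits_from_noise(100.0, 10.0, 1.0)  # -> (0.3, 1.0) ng
```

Because both limits are proportional to the noise, an inflated default noise estimate (taken from an unrepresentative part of the chromatogram) directly inflates the reported limits, which is why a user-defined time range is required here.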

Signal Noise in User-defined Time Range (Parameter Input)

You can also determine the signal noise for a specific time range of the chromatogram. In the Integration view, for example, specify this time range in the Report Column Properties dialog box (opened by double-clicking the column you want to edit) or in the Add Report Column dialog box (opened by selecting Add Column on the Table menu). In the Categories list, select Chromatogram; in the Variables list, select Signal Noise. Click Parameter to open the Parameter Input for 'Signal Noise' dialog box, select the Specific Range check box, and enter the time range.