The Lubricant Test Monitoring System (LTMS) is the calibration methodology and protocol for North American engine oil and gear oil tests. This system, administered by the American Society for Testing and Materials (ASTM) Test Monitoring Center (TMC) since 1992, has grown in scope from five gasoline engine tests to over two dozen gasoline, heavy-duty diesel, and gear oil tests, ranging in cost from several thousand dollars to almost one hundred thousand dollars per test. LTMS uses Shewhart and Exponentially Weighted Moving Average (EWMA) control charts of reference oil data to support decisions on the calibration status of test stands and test laboratories. Equipment calibration is the essential step underpinning the unbiased evaluation of candidate oils against oil quality specifications.
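To illustrate how an EWMA control chart of reference oil data can flag a calibration shift, here is a minimal sketch. The smoothing constant `lam=0.2`, the 3-sigma limit multiplier `L`, and the function name are illustrative assumptions for a textbook EWMA chart, not the parameters actually specified by LTMS.

```python
import math

def ewma_chart(data, target, sigma, lam=0.2, L=3.0):
    """EWMA control chart: returns (z, lcl, ucl, in_control) per point.

    z_i = lam * x_i + (1 - lam) * z_{i-1}, with z_0 set to the target.
    Control limits use the exact time-varying EWMA variance.
    """
    results = []
    z = target  # chart starts at the reference-oil target
    for i, x in enumerate(data, start=1):
        z = lam * x + (1.0 - lam) * z
        half_width = L * sigma * math.sqrt(
            lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * i))
        )
        lcl, ucl = target - half_width, target + half_width
        results.append((z, lcl, ucl, lcl <= z <= ucl))
    return results

# A stand running on target stays in control...
on_target = ewma_chart([0.0] * 5, target=0.0, sigma=1.0)
# ...while a sustained 2-sigma shift trips the chart within a few tests.
shifted = ewma_chart([2.0] * 10, target=0.0, sigma=1.0)
```

Because the EWMA statistic accumulates evidence across tests, a sustained small-to-moderate shift is caught sooner than on a Shewhart chart, which looks at each result in isolation.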
Given that calibration of test equipment is both expensive and vital to the evaluation of candidate oil capability, it is worth reviewing current issues and concerns and considering enhancements to the LTMS. Recently, enhancements have been suggested and accepted for the ASTM Sequence IIIG. These include an emphasis on test laboratory data rather than test stand data, removal of unnecessary testing triggered by particular control chart alarms, and the monitoring of precision using a moving block of four tests against the chi-squared distribution. Enhancements such as those implemented in the Sequence IIIG need to be continuously proposed and evaluated to keep the LTMS true to its goal: to monitor engine and gear oil tests in a cost-effective manner, detecting and correcting calibration issues so as to enhance the discrimination capability of each test. With escalating engine oil test costs and rapid category turnover threatening both the cost structure and the technical strength of LTMS, balancing these demands against the LTMS goal is more important than ever as we move into the future.
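The moving-block precision check mentioned above can be sketched as follows. This is one plausible form, assuming the statistic is the sum of four squared standardized reference-oil deviations compared against a chi-squared critical value with four degrees of freedom; the block size of four comes from the text, but the alpha level, the statistic's exact form, and all names here are illustrative assumptions rather than the Sequence IIIG procedure itself.

```python
def precision_stats(deviations, block=4):
    """Sum of squared standardized deviations over each moving block.

    If precision is in control, each standardized deviation is roughly
    N(0, 1), so the block sum follows a chi-squared distribution with
    `block` degrees of freedom; an unusually large sum signals that
    test-to-test variability has degraded.
    """
    return [
        sum(d * d for d in deviations[i:i + block])
        for i in range(len(deviations) - block + 1)
    ]

# chi-squared 0.99 quantile, 4 degrees of freedom (illustrative alpha)
CHI2_CRIT_99_DF4 = 13.277

quiet = precision_stats([0.5, -0.3, 0.4, -0.5, 0.2])  # stable precision
noisy = precision_stats([0.1, -0.2, 4.0, 0.3])        # one wild result
```

A moving block of four keeps the check responsive to recent results while still pooling enough tests for the chi-squared comparison to be meaningful.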