A CVT belt is composed of multiple elements and layered rings. Each of these components generates loss, including loss from the relative slippage caused by the geometrical relationship between the elements and the innermost ring layer. Reducing this slippage is an effective way of increasing CVT efficiency. However, because the relative slippage also governs whether the rings transmit torque consistently, reducing it affects the torque transmission performance of the rings. Therefore, before CVT efficiency can be improved by reducing the relative slippage, the resulting changes in torque transmission must first be analyzed. This slippage, however, occurs inside the belt, making it extremely difficult to identify the internal thrust force when an actual load is applied. This paper describes experiments carried out to analyze how the torque transmission ratio of each component changes when the relative slippage between the elements and the innermost ring layer changes. It also compares the experimental results with the results of analysis performed using the finite element method, enabling the same analysis to be carried out by simulation.