A study is carried out here to examine the probable causes of the seemingly contradictory results found in the present-day literature on low-heat-rejection (LHR) engines, to provide plausible explanations, and to indicate possible directions for future research. Almost all numerical studies predict improved thermal efficiency, increased availability in the exhaust, and reduced in-cylinder heat rejection for LHR engines. The degree of improvement varies considerably, from a few percentage points to several, depending on the extent of insulation and on whether turbocompounding and a Rankine bottoming cycle are included. In these simulations, care is taken not to allow the volumetric efficiency to decrease as a result of the higher cylinder temperatures in LHR engines. In addition, the air-fuel ratio and, in many instances, the peak conditions are held constant between the LHR and the conventionally cooled engine.

One experimental investigation that observed the above conditions has shown considerable improvement in fuel consumption over conventionally cooled engines. Other experimental studies that did not observe these conditions indicate that LHR engines are poor performers. In fact, some of the engines used in those tests cannot even be classified as LHR engines, since they rejected more heat than the conventional ones. In experiments where the volumetric efficiency is allowed to fall, combustion to degrade, peak conditions to shift, and in-cylinder heat rejection to increase, the LHR engine's performance naturally suffers.

In conclusion, it appears that significant improvements in thermal efficiency can be obtained with LHR engines relative to baseline engines. This conclusion is borne out by numerous numerical simulations and by the one experiment carried out under the proper constraints.
To validate this further, additional experiments under the aforementioned conditions are recommended, performed on LHR engines designed for the purpose rather than on existing engines improvised by component substitution.
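The first-law reasoning behind these observations can be illustrated with a minimal energy-balance sketch. All fractions below are hypothetical round numbers chosen purely for illustration, not data from the studies discussed: the point is only that, at fixed fuel energy input, any heat not rejected through the cylinder walls must reappear as shaft work or as exhaust energy, which is why the simulations predict both higher thermal efficiency and increased exhaust availability (recoverable by turbocompounding or a bottoming cycle).

```python
# Minimal first-law energy-balance sketch for one engine cycle.
# All fractions are hypothetical and chosen only for illustration.

def energy_split(q_fuel, frac_work, frac_heat_rejected):
    """Split the fuel energy input into shaft work, in-cylinder heat
    rejection, and exhaust energy (the remainder)."""
    work = frac_work * q_fuel
    heat = frac_heat_rejected * q_fuel
    exhaust = q_fuel - work - heat
    return work, heat, exhaust

q_fuel = 100.0  # kJ per cycle, hypothetical

# Conventionally cooled baseline: assume 40% work, 30% heat rejection.
w0, h0, e0 = energy_split(q_fuel, 0.40, 0.30)

# Insulated (LHR) case: assume part of the avoided heat rejection becomes
# extra work and the rest appears as additional exhaust energy.
w1, h1, e1 = energy_split(q_fuel, 0.43, 0.15)

print(f"baseline: work={w0:.1f}  heat={h0:.1f}  exhaust={e0:.1f}")
print(f"LHR:      work={w1:.1f}  heat={h1:.1f}  exhaust={e1:.1f}")
```

The sketch also shows why the experimental caveats matter: if insulation is accompanied by a loss of volumetric efficiency or degraded combustion, the effective fuel energy conversion falls and the nominal LHR advantage disappears.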