Methodology and Analysis of Determining Plug-In Hybrid Engine Thermal State and Resulting Efficiency
2009-01-1308
Testing plug-in hybrid vehicles over standardized and real-world drive cycles has shown relatively large efficiency differences between ambient cold starts and hot starts (CS/HS) (1,2). The magnitude of this variation is significant and depends on the drive cycle and powertrain architecture. Quantifying the inefficiency is non-trivial: in charge-depleting modes the engine consumes only small amounts of fuel, and vehicle calibration sensitivity introduces test-to-test variation, so thermal effects cannot be decoupled from slight calibration changes that mask thermal influences.
In this paper, a methodology for modeling and analyzing the fuel efficiency of a plug-in hybrid vehicle powertrain as a function of engine operating temperature is presented. Response surface methodology (RSM) techniques have been applied to generate brake specific fuel consumption (BSFC) maps as a function of the engine thermal state, indicated by crankcase oil temperature. This model is coupled with a second response surface model of the engine thermal state that uses integrated fuel consumed (energy generated) as an input factor, enabling analysis of fuel consumption as a function of thermal state. Results show a 25% integrated efficiency loss for an ambient cold start relative to a hot start at the optimal initial engine oil temperature over the standard Urban Dynamometer Driving Schedule (UDDS). Initially, fueling increases by approximately 3% for every 5°C decrease in initial engine oil temperature; this loss diminishes asymptotically as the optimal temperature is approached. Engine cold start enrichment for the cycle investigated accounts for only approximately 3% of the total fuel; the greater losses are therefore associated with powertrain heat transfer, friction, and calibration changes during the engine warm-up period. The optimal initial engine oil temperature for this powertrain is estimated to be approximately 85°C.
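The RSM fitting step described above can be sketched in miniature: fit a low-order polynomial response surface of BSFC to engine oil temperature by ordinary least squares, then locate the temperature that minimizes the fitted surface. This is an illustrative toy only; the BSFC values below are synthetic placeholders (not data from the paper), and the paper's actual response surfaces take additional input factors such as speed and load.

```python
import numpy as np

# Illustrative sketch, not the paper's model: fit a second-order response
# surface of BSFC vs. engine oil temperature and find its minimum.
# All BSFC numbers below are synthetic, chosen only for demonstration.

def design_matrix(t):
    """Quadratic polynomial basis in oil temperature t (deg C)."""
    t = np.asarray(t, dtype=float)
    return np.column_stack([np.ones_like(t), t, t ** 2])

# Synthetic observations (g/kWh): high when cold, flattening near 85 C
oil_temp = np.array([20.0, 35.0, 50.0, 65.0, 80.0, 85.0, 100.0])
bsfc = np.array([319.0, 295.0, 277.0, 266.0, 260.0, 260.0, 263.0])

# Ordinary least-squares fit of the response surface coefficients
coeffs, *_ = np.linalg.lstsq(design_matrix(oil_temp), bsfc, rcond=None)

def bsfc_model(t):
    """Predict BSFC (g/kWh) at oil temperature t from the fitted surface."""
    return design_matrix(t) @ coeffs

# The fitted quadratic's vertex estimates the "optimal" oil temperature
t_opt = -coeffs[1] / (2.0 * coeffs[2])
print(f"estimated optimal initial oil temperature: {t_opt:.1f} C")
```

In the paper's methodology a second response surface maps integrated fuel consumed to engine thermal state; chaining the two models is what allows fuel consumption to be expressed as a function of thermal state over a cycle.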