The Common Rail fuel injection System (CRS) has transformed diesel combustion and broadened the applicability of diesel engines, with the goal of a cleaner sky and a greener earth. Cutting-edge developments in the CRS and EGT system enable OEMs to meet increasingly stringent emission norms and comply with environmental protection requirements. Today's CRS generations are the most advanced fuel injection systems, providing higher injection pressures and wide multiple-injection capability with shorter dwell periods, enabling smoother Digital Rate Shaping (DRS) and smart thermal management of the exhaust system while meeting stringent emission limits and future CO2 reduction goals. Initial DRS I developments were limited to dwell periods of 500-800 µs owing to injector durability constraints.
The increasing demand for zero-emission vehicles to counter global warming has raised the demand for reliable, energy-efficient, long-range, fast-charging vehicles. Electric vehicle (EV) systems such as the drivetrain and energy storage must be kept within an optimal safe temperature window to ensure system reliability, so an efficient thermal management system plays a crucial role. Because the thermal management system draws its energy from the battery pack, its optimization is essential to maximize vehicle range. This paper deals with optimization techniques for the thermal management system through mathematical approaches and MATLAB/Simulink models using Fuzzy Logic Control (FLC). A MATLAB/Simulink model of the thermal management system for the drivetrain and energy storage system is developed, considering different vehicle modes and drive cycles.
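The FLC idea above can be sketched outside Simulink as a minimal zero-order Sugeno controller. The membership breakpoints, rule base and pump-duty output below are illustrative assumptions, not the paper's calibrated model.

```python
# Minimal fuzzy-logic thermal controller sketch (illustrative only).
# Inputs: battery temperature error (degC above target) and its rate
# of change (degC/min); output: coolant pump duty in [0, 1].

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def pump_duty(temp_err, temp_rate):
    # Fuzzify the two inputs (breakpoints are assumed values)
    err = {"cold": tri(temp_err, -20, -10, 0),
           "ok":   tri(temp_err, -5, 0, 5),
           "hot":  tri(temp_err, 0, 10, 20)}
    rate = {"falling": tri(temp_rate, -2, -1, 0),
            "steady":  tri(temp_rate, -0.5, 0, 0.5),
            "rising":  tri(temp_rate, 0, 1, 2)}
    # Rule base: (error, rate) -> crisp duty (zero-order Sugeno)
    rules = [("hot", "rising", 1.0), ("hot", "steady", 0.8),
             ("hot", "falling", 0.6), ("ok", "rising", 0.5),
             ("ok", "steady", 0.2),  ("ok", "falling", 0.1),
             ("cold", "rising", 0.1), ("cold", "steady", 0.0),
             ("cold", "falling", 0.0)]
    num = den = 0.0
    for e, r, duty in rules:
        w = min(err[e], rate[r])   # rule firing strength (AND = min)
        num += w * duty
        den += w
    return num / den if den > 0 else 0.0
```

A hot, rising-temperature battery drives the pump to full duty, while a cold, cooling battery leaves it off; intermediate states blend the rules smoothly, which is the main appeal of FLC over bang-bang thermostat control.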
With technology improving daily, innovative structures and developments in LEDs have led to dramatic improvements in the performance of LED technology. The replacement of incandescent lamps and CFLs clearly demonstrates the strengths of the LED: high lumen output and efficiency have been its greatest advantages. Every industry, including automotive, is gradually shifting from halogen lamps to LED lamps. The localized heat produced by LEDs, and the need to hold the junction temperature within the range for maximum lumen output to avoid failure, have been the greatest hurdles for engineers, and research aimed at minimizing these disadvantages is the main focus of the lighting industry. Traditionally, thermal management of LEDs is done by passive cooling using heat sinks, including coolant-based heat-sink models using fluids.
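The junction-temperature constraint that drives heat-sink sizing follows the standard series thermal-resistance model; the resistance values below are assumed datasheet-order figures, not from this work.

```python
# Steady-state LED junction temperature via the series resistance
# network: Tj = Ta + (R_junction-case + R_case-sink + R_sink-ambient) * P.
# All resistance values (K/W) are assumed for illustration.

def junction_temp(t_ambient, power_w, r_jc=6.0, r_cs=1.0, r_sa=8.0):
    """Junction temperature in degC for dissipated power in watts."""
    return t_ambient + (r_jc + r_cs + r_sa) * power_w

# Assumed: 40 degC ambient, 5 W dissipated in the LED package
tj = junction_temp(40.0, 5.0)   # -> 115.0 degC
```

The sink-to-ambient term dominates here, which is why heat-sink design (or active cooling) is the main lever for keeping the junction within its rated range.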
The automobile sector is moving towards electrification as a replacement for the conventional IC engine as the power source of vehicles. In electric vehicles, the Li-ion battery is the most widely used energy source for traction and is a major differentiator among the subcomponents that affect vehicle performance, safety and efficiency. The life of the battery pack is affected by stress factors such as SoC (state of charge), DoD (depth of discharge), C-rates (charge and discharge currents) and battery temperature. Of these, battery life is most influenced by the temperature excursions seen by the Li-ion cells in the pack. Various thermal management strategies are available to keep the temperature under control, such as air cooling, chilled liquid cooling and hybrid cooling systems.
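Why temperature dominates among the listed stress factors can be illustrated with a standard Arrhenius-type ageing-rate model; the activation energy below is an assumed literature-order value, not a measurement from this work.

```python
import math

# Arrhenius-type sketch of Li-ion ageing sensitivity to temperature:
# fade rate k(T) = A * exp(-Ea / (R*T)).
# Ea ~ 50 kJ/mol is an assumed, literature-order activation energy.

R = 8.314   # gas constant, J/(mol*K)
EA = 50e3   # assumed activation energy, J/mol
A = 1.0     # pre-exponential factor (arbitrary units; cancels in ratios)

def fade_rate(temp_c):
    """Relative capacity-fade rate at a cell temperature in degC."""
    return A * math.exp(-EA / (R * (temp_c + 273.15)))

# How much faster the cell ages at 45 degC than at 25 degC:
accel = fade_rate(45.0) / fade_rate(25.0)
```

With these assumptions a 20 degC excursion roughly triples the ageing rate, which is why thermal management is treated as the primary lever on pack life.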
During cold-start conditions the engine must overcome higher friction losses, at the cost of a fuel penalty, until the coolant and lubrication circuits reach their optimum temperatures. Because the lubrication oil has a lower thermal capacity than the coolant, and viscosity falls as temperature rises, a faster oil warm-up offers an engine thermal efficiency benefit. Engine oil takes the full NEDC test cycle to reach 90°C, leading to higher friction losses throughout the cycle and a significant increase in fuel consumption. Increasing oil temperature reduces viscosity and thereby engine friction, so thermal management should focus on speeding up the temperature rise during a cold engine start. This work presents the study and experimental evaluation of an exhaust heat recovery mechanism to improve NEDC fuel economy.
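The viscosity-temperature relation underlying the friction argument can be sketched with the commonly used Vogel equation; the coefficients below are assumed values of roughly multigrade-oil order, not measured data from this work.

```python
import math

# Vogel equation sketch: viscosity = a * exp(b / (T + c)).
# Coefficients a, b, c are assumed for illustration (approximate
# multigrade engine-oil order of magnitude), T in degC.

def vogel_viscosity(temp_c, a=0.05, b=1000.0, c=95.0):
    """Oil viscosity (relative units, mm^2/s scale) at temperature temp_c."""
    return a * math.exp(b / (temp_c + c))

cold = vogel_viscosity(20.0)   # cold-start sump temperature
warm = vogel_viscosity(90.0)   # fully warmed-up oil
```

Even with rough coefficients, the exponential form shows cold oil being more than an order of magnitude thicker than warm oil, which is why accelerating oil warm-up (here via exhaust heat recovery) pays off directly in friction and fuel economy.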
The current automotive market demands quick product development, which drives the requirement for virtual validation. A key challenge of virtual validation is the lack of product information at the early stage of a project. The engine thermal management system is one such critical development area, lacking simulation input data for cooling system air path resistance (APR) and warm-up temperatures. The standard approach for these inputs is to use vehicle platform-specific data from software simulation libraries, or optimizations based on charge air cooler (CAC) or radiator temperature measurements. This process lacks accuracy because it does not consider radiator and CAC packaging, or the variation of air velocity and warm-up temperature across different sections. In this paper, a procedure for building a correlation between vehicle test data and the simulation parameters, viz. APR and warm-up temperature, is presented using the one-dimensional (1D) thermal management software Magna KULI.
Future automotive emission norms place stress on proper engine thermal management, which paves the path for new technologies to provide solutions. The temperature of the engine intake charge air plays a key role in determining the performance and emissions of turbocharged SI engines. Conventional charge air coolers, or intercoolers, use air as the medium to cool the compressed charge air from the turbocharger. In high-temperature conditions the performance of conventional charge air coolers drops, which reduces engine performance and creates the need for a more effective charge air cooler. A water-cooled charge air cooler is a heat exchanger in which the compressed intake charge air is cooled by coolant (water + ethylene glycol) in a separate low-temperature cooling circuit. This circuit comprises an independent low-temperature radiator (LTR), the water-cooled charge air cooler (WCAC), an electric water pump and a degas bottle.
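The benefit of a WCAC can be estimated with the standard heat-exchanger effectiveness relation; the effectiveness value and inlet temperatures below are assumptions for illustration, not data from this work.

```python
# Effectiveness sketch of a charge air cooler:
# T_air_out = T_air_in - eps * (T_air_in - T_coolant_in),
# assuming the charge air is the minimum-capacity stream.
# Effectiveness and temperatures are assumed values.

def charge_air_out(t_air_in, t_cool_in, effectiveness=0.85):
    """Charge-air outlet temperature in degC."""
    return t_air_in - effectiveness * (t_air_in - t_cool_in)

# Assumed: 150 degC off the compressor, 45 degC low-temperature coolant
t_out = charge_air_out(150.0, 45.0)   # -> 60.75 degC
```

The key advantage shows up in the coolant inlet term: a low-temperature liquid circuit offers a much colder (and more stable) sink than underhood air, so the same effectiveness yields a far cooler, denser intake charge.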
The major challenge faced by most genset engine manufacturers today is to meet stringent emission norms with the least modification to the engine design, while optimizing the available emission control techniques. Indian emission norms for stationary gensets are planned to be upgraded from CPCB II to CPCB IV+. The CPCB IV+ norms call for a 90% reduction in PM emissions for the 19 kW to 56 kW power range compared with the existing CPCB II norms. The mechanical fuel injection system is very popular in these applications for its simplicity, serviceability and cost advantage. Hence it is desired to upgrade the existing CPCB II mechanical genset engine to CPCB IV+ while retaining a similar fuel injection system and adding a DOC+DPF solution.
Increasing pollution all over the world has led to stringent emission norms and the development of more environment-friendly technologies. A rapid transition towards greener and cleaner technologies is anticipated by automotive organisations in the near future. In India, the FAME (Faster Adoption and Manufacturing of Electric Vehicles) scheme by NITI Aayog has made the intent regarding hybrid and electric vehicles (EVs) clearer. As range is a major concern in EVs, the component of greatest interest is the battery, owing to its energy storage ability. Battery selection depends on energy density, weight and cost, which makes Li-ion technology the leading option thanks to its favourable power density versus energy density trade-off. For optimum performance and safety, thermal management is one of the major concerns in the development of lithium-ion (Li-ion) battery packs for EVs.
The shift of the automobile sector from ICE to electric drives is imminent, driven by global pollution concerns and ever-rising pressure on natural resources due to the lower efficiency of ICE drives. This has led to the rise of lithium-ion batteries, which carry the added burden of living up to expectations of clean energy and higher efficiency. Moreover, given their limited availability, lithium-ion batteries carry a hefty price tag, which hinders research. A lack of research leads to battery failures that can cause life-threatening situations in vehicle operation. To gain insight into the working of cylindrical lithium-ion batteries under different driving and environmental conditions, a methodology is developed for the coupled electrochemical and thermal phenomena. This allows the behaviour of the battery to be anticipated under the different conditions that influence its performance.
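The coupled electro-thermal idea can be sketched with a lumped single-node model: Joule heat from an equivalent-circuit resistance warms one thermal node, and the node temperature feeds back into the resistance. All parameters below are assumptions for illustration, not the paper's methodology or data.

```python
# Lumped coupled electro-thermal sketch for a cylindrical cell.
# Electrical side: zero-order equivalent circuit, heat = I^2 * R(T).
# Thermal side: single node, dT/dt = (Q_gen - h*A*(T - T_amb)) / (m*cp).
# All numbers are assumed, 18650-scale order-of-magnitude values.

def simulate(current_a, t_amb=25.0, dt=1.0, steps=600):
    m_cp = 40.0   # thermal mass m*cp, J/K (assumed)
    h_a = 0.03    # convective loss h*A, W/K (assumed)
    temp = t_amb
    for _ in range(steps):
        # internal resistance falls with temperature (linear assumption)
        r_int = 0.030 * (1.0 - 0.005 * (temp - 25.0))   # ohm
        q_gen = current_a ** 2 * r_int                   # Joule heat, W
        temp += dt * (q_gen - h_a * (temp - t_amb)) / m_cp
    return temp

temp_2c = simulate(5.0)    # ~2C discharge on a 2.5 Ah cell, 10 min
temp_4c = simulate(10.0)   # ~4C discharge: heat scales with I^2
```

The quadratic dependence of heat generation on current is what makes high-C-rate operation the critical thermal case, and the R(T) feedback is the coupling the full electrochemical model resolves in detail.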
Thermal management is considered one of the key enablers for the adoption of New Energy Vehicles. An efficient design of an electrified vehicle's cooling system, be it a HEV, BEV or FCEV, is of major importance to guarantee vehicle lifetime, optimize energy efficiency, enable adequate driving range and allow high charging speed; it is also critical for safety. Compared to internal combustion engine (ICE) vehicles, cooling systems for electrified vehicles have become more complex, with increasing integration of a variety of parts. The cooling medium's main function is no longer limited to cooling the ICE; it is also used to conserve and transport heat to essential powertrain parts such as the battery pack, all without jeopardizing electrical safety. Many recently launched electrified vehicles successfully employ the same water-glycol cooling liquids found in ICE vehicles.
Until recently, supercharging was the most accepted boosting solution for gasoline engines. Recent advances in turbochargers have brought turbocharging technology into gasoline engines, enabling powertrains with higher power density and lower overall weight. Along with the performance advantages, new challenges arise, both in thermal management and in the overall acoustic behaviour of the powertrain. This study focuses mainly on the NVH aspects of turbocharging gasoline engines. Compressor surge is a common phenomenon in turbochargers: as the operating point on the compressor map moves closer to the surge line, the compressor starts to generate noise, and the amplitude and frequency of that noise depend on the proximity of the operating point to the surge line. The severity of the noise can be reduced by selecting a turbocharger with sufficient compressor surge margin.
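Surge margin definitions vary between OEMs; one common constant-speed, mass-flow-based form is sketched below with made-up map points (the 25% result is illustrative, not from this study).

```python
# Surge margin at a given pressure ratio, mass-flow form:
# SM = (m_operating - m_surge) / m_operating.
# Map points below are assumed values for illustration.

def surge_margin(m_op, m_surge):
    """Fractional surge margin; larger is acoustically safer."""
    return (m_op - m_surge) / m_op

# Assumed: operating point 0.10 kg/s, surge line at 0.075 kg/s
sm = surge_margin(0.10, 0.075)
```

A larger margin keeps the operating point away from the surge line across the drive envelope, which is the selection criterion the study applies to mitigate surge noise.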
To study the functioning of a fuel cell and optimize its operating parameters for best efficiency, it is important to have a robust fuel cell model that can simulate the behaviour of the stack under various operating conditions. The operating voltage of the fuel cell at different current densities depends on thermodynamic parameters, such as the temperature and pressure of the reactants, as well as factors like the state of humidification of the electrolyte membrane. A 1D model is developed to capture the variation in voltage at different current densities due to internal losses and changes in operating conditions such as temperature and pressure. Additionally, since the stack temperature and the moisture content within the stack directly influence stack operation, models for the thermal management of the stack and the humidification of the membrane are also developed.
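The losses such a model resolves can be summarized in a 0D polarization-curve sketch: Nernst potential minus activation, ohmic and concentration losses. All coefficients below are textbook-order assumptions, not parameters of the paper's 1D model.

```python
import math

# 0D PEM fuel cell polarization sketch:
# V = E0 - a*ln(i/i0) - r*i - m*exp(n*i)
# (Tafel activation, ohmic, empirical concentration terms).
# All coefficients are assumed, textbook-order values.

def cell_voltage(i, e0=1.20, i0=1e-4, a=0.06, r=0.15, i_lim_m=3e-5, n=8.0):
    """Cell voltage (V) at current density i (A/cm^2)."""
    i = max(i, 1e-6)                    # avoid log(0) at open circuit
    v_act = a * math.log(i / i0)        # activation loss (Tafel)
    v_ohm = r * i                       # ohmic loss, r in ohm*cm^2
    v_conc = i_lim_m * math.exp(n * i)  # concentration loss (empirical)
    return e0 - v_act - v_ohm - v_conc

v_low = cell_voltage(0.1)    # low load: activation loss dominates
v_high = cell_voltage(1.0)   # high load: ohmic + concentration grow
```

Each loss term responds to a different operating parameter (catalyst activity and temperature for activation, membrane hydration for ohmic, reactant supply for concentration), which is exactly why the thermal and humidification sub-models matter.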
Valve seat inserts (VSIs) are installed in cylinder heads to provide a seating surface for the poppet valves. The insert material is more heat- and wear-resistant than the base cylinder head material, making it better suited for valve seating and improving engine durability. Using inserts also permits easier repair or rebuild of cylinder heads, since only the wear surfaces need to be replaced. Desirable performance characteristics are proper sealing, good heat transfer and minimal valve-to-VSI wear; undesired outcomes include valve seat dropping and cracking. The downsizing trend in diesel engines leads to increasing power density and therefore higher cylinder pressures and temperatures. The engine components are thus exposed to more severe loading, and hence to failure modes not previously seen in warranty data.
The automotive industry has seen the implementation of advanced emission regulations such as BS-VI in India, along with growing market demand for increased product performance and reduced total cost of ownership. This has made engine architectures more intricate, leading to complex interactions among engine- and vehicle-level parameters. It poses a challenge to understand the mechanisms and underlying physics behind product attributes such as increased power density, higher fuel economy and reduced oil consumption. The current paper focuses on reducing engine oil consumption across diverse duty cycles using simulation tools, vehicle data analytics and design of experiments. The contributions of the oil consumption mechanisms, viz. oil evaporation, oil throw and oil transport, are identified across different loads and duty-cycle patterns. Piston ring dynamics simulation is used to identify critical piston and ring parameters impacting oil consumption through directional trends.
With rising emission levels in Indian cities, the focus on pure battery electric vehicles is increasing in India as well. The Government of India is also focusing on preparing the right policies, such as FAME-2, to promote the acceptability of EVs by providing subsidies and creating the required infrastructure. The environmental conditions in India differ greatly from those of other markets such as Europe and China: maximum temperatures can reach 55°C at the height of summer, while winter temperatures in the northern Himalayan region can drop below −25°C. Under these conditions, the cooling systems of electric-powertrain components such as the e-machine and battery pack have to be protected against worst-case scenarios specific to India. The major cost driver of EVs lies in the HV battery pack, and it is very important for the acceptability of EVs that battery life is maximized.
In this paper, a VTB (Virtual Test Bed) based approach is used to virtually calibrate a medium-duty diesel engine for component protection under different combinations of ambient pressure and temperature, and to calibrate the thermal requirements that ensure sufficient heating of the ATS (aftertreatment system) to reduce tailpipe emissions. With the implementation of the BS6 emission norms, emissions must be confirmed not only under standard conditions but also under non-standard conditions, including different combinations of ambient temperature and pressure, especially for NOx in the WNTE cycle. Achieving the emission targets under different environmental conditions normally requires physical calibration in those conditions, i.e. in climatic test chambers that reproduce different altitude conditions. Since these test beds are quite expensive, minimizing the time spent on them is preferred to save calibration cost and time.
The hybrid/electric vehicle (H/EV) market is heavily dependent on battery models. Battery models inform cell and battery pack design, are critical in online battery management systems (BMSs), and can be used as predictive tools to maximize the lifetime of a battery pack. Battery models require parameterization through experimentation. Temperature affects every aspect of a battery's operation and must therefore be closely controlled throughout all battery experiments. Today, the private sector prefers climate chambers for experimental thermal control; however, evidence suggests that climate chambers are unable to adequately control the surface temperature of a battery under test. In this study, laboratory apparatus is introduced that controls the temperature of any exposed surface of a battery through conduction.
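The conduction-control idea can be sketched as a feedback loop driving heat flux into a plate clamped against the cell face; the lumped plant parameters and PI gains below are assumptions for illustration, not the authors' hardware.

```python
# Sketch of conduction-based surface-temperature control: a PI
# controller commands heat flux (e.g. from a thermoelectric plate)
# into a lumped plate+cell-face thermal node.
# Plant parameters and gains are assumed values.

def run(setpoint=25.0, t0=35.0, dt=0.1, steps=3000):
    m_cp = 200.0         # plate + cell face thermal mass, J/K (assumed)
    h_a = 0.5            # heat leak to ambient, W/K (assumed)
    t_amb = 22.0
    kp, ki = 50.0, 2.0   # PI gains (assumed)
    temp, integ = t0, 0.0
    for _ in range(steps):
        err = setpoint - temp
        u = kp * err + ki * integ
        q = max(min(u, 100.0), -100.0)   # actuator saturation, W
        if q == u:                        # simple anti-windup
            integ += err * dt
        temp += dt * (q - h_a * (temp - t_amb)) / m_cp
    return temp

final = run()   # surface settles at the 25 degC setpoint within 300 s
```

Because heat moves by conduction rather than through chamber air, the loop acts directly on the surface of interest; the integral term compensates the steady leak to ambient, which air-based chambers struggle to reject at the cell surface.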
Nowadays, demanding carbon dioxide (CO2) targets push for the electrification of the powertrain, since the internal combustion engine alone cannot reach those targets. In this context, Mild-Hybrid Vehicles (MHVs) with a low-voltage 48 V electric network have proved to be a cost-effective solution for the reduction of CO2 emissions. However, the impact of electrification on powertrain thermal management, on aftertreatment light-off, and thus on tailpipe (TP) emissions must be properly assessed. For this reason, a virtual testing approach for thorough powertrain and vehicle virtual validation is required to reduce testing and calibration efforts. The aim of this research work is to bridge the gap between the high-fidelity models commonly used for component development and a system-level approach for the evaluation of full-vehicle technologies and architectures.