Many Advanced Driver Assistance Systems (ADAS) provide vision-based driver assistance. These vision-based systems are not all alike, since they differ in the algorithms they use; it is therefore necessary to compare the robustness of these algorithms and provide the best possible solution to the end user. Such characterization is often difficult because of the changing environmental conditions under which the algorithms must perform. In these systems, performance is a critical parameter, since even a slight lapse could translate into serious danger on the road. The reliability of any algorithm depends on its accuracy and consistency under varying environmental conditions, and evaluating these performance parameters requires defining the algorithm's boundary conditions. System performance under varying conditions is often determined only through expensive road testing. In this paper, we propose a performance characterization platform that quantifies the performance of vision-based algorithms for driver safety systems by simulating real-life conditions. With this platform, one can simulate conditions such as low light, fog, rain, and dust by varying the contrast, brightness, and blur of the input image and by synthetically adding noise to it. The platform supports several noise distributions, including Gaussian, salt-and-pepper, uniform, and speckle. The accuracy of the algorithm under test is determined for each of these conditions, and a plot of accuracy against these parameters is obtained. This plot defines the best operating region, in terms of input parameters, for the algorithm under consideration. Moreover, when multiple algorithms are evaluated, the resulting plots characterize their performance under identical conditions; comparing them enables one to determine which solution gives the desired accuracy under varying conditions. The newly developed platform saves both the time and the money required for extensive road testing.
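The image perturbations described above (contrast/brightness adjustment plus Gaussian, salt-and-pepper, uniform, and speckle noise) can be sketched as follows. This is a minimal illustration using NumPy, not the paper's actual implementation; the function names, parameter defaults, and the linear contrast/brightness model are assumptions chosen for clarity.

```python
import numpy as np

def adjust_contrast_brightness(img, alpha=1.0, beta=0.0):
    # Linear model assumed here: scale contrast by alpha, shift brightness
    # by beta, for an 8-bit image with values in [0, 255].
    return np.clip(alpha * img.astype(np.float64) + beta, 0, 255).astype(np.uint8)

def add_noise(img, kind="gaussian", seed=None, **params):
    # Corrupt an 8-bit image with one of several noise distributions.
    rng = np.random.default_rng(seed)
    f = img.astype(np.float64)
    if kind == "gaussian":
        # Additive zero-mean Gaussian noise.
        out = f + rng.normal(0.0, params.get("sigma", 10.0), f.shape)
    elif kind == "uniform":
        # Additive noise drawn uniformly from [low, high].
        out = f + rng.uniform(params.get("low", -20.0),
                              params.get("high", 20.0), f.shape)
    elif kind == "speckle":
        # Multiplicative noise: each pixel scaled by (1 + n).
        out = f * (1.0 + rng.normal(0.0, params.get("sigma", 0.1), f.shape))
    elif kind == "salt_pepper":
        # Fraction p of pixels forced to black (pepper) or white (salt).
        out = f.copy()
        p = params.get("p", 0.02)
        mask = rng.random(f.shape)
        out[mask < p / 2.0] = 0.0
        out[mask > 1.0 - p / 2.0] = 255.0
    else:
        raise ValueError(f"unknown noise kind: {kind}")
    return np.clip(out, 0, 255).astype(np.uint8)
```

A characterization sweep would then loop over a grid of these parameters, run the algorithm under test on each degraded image, record its accuracy, and plot accuracy against each parameter to locate the operating region described in the text.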