In embedded-system software architectural design, estimating Real-Time (RT) behavior needs special care and poses many technical challenges. Most current approaches rely on either engineering judgment or actual measurements performed during the integration-testing phase. Neither approach is error-proof: both can lead to RT constraint violations that are discovered only during simulation of the RT architectural design or during product validation, and the impact on the project can be as severe as a Central Processing Unit (CPU) change.

In this work, the Operating System (OS) process Execution Time (ET) is considered the basic element of RT architectural design. Each process ET is predicted from previous software releases using Machine Learning (ML) algorithms. Different types of feature vectors extracted from a software release (e.g. number of static-architecture requirements, hardware factor, and seniority index) are proposed as inputs to multiple ML regression models that predict OS process ETs. The ML algorithms used (e.g. Support Vector Machine Regression, Extreme Learning Machine, and Neural Networks) are compared in terms of prediction performance on practical case studies containing an AUTOSAR Application (APP) and a Basic Software (BSW) module, with proven results.

The predicted ETs are given as inputs to an OS modeling, simulation, and verification tool. This tool is used for RT architectural design (e.g. design of the OS Task/ISR configurations), OS behavior modeling and simulation, and RT verification (missed deadlines, excess CPU load, etc.). Using this method, architects can predict OS process ETs and design, verify, and refine the RT architecture either in the early phase of the project or before the coding phase. A summary, conclusions with proven results, and future work are presented.
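As an illustrative sketch only (not taken from the paper), the described approach of feeding release-level feature vectors into a regression model could look like the following, here using scikit-learn's SVR; the feature names, values, and measured ETs are hypothetical placeholders, not data from the paper's case studies:

```python
# Hypothetical sketch: predict an OS process Execution Time (ET) from
# release-level features (static-architecture requirement count,
# hardware factor, seniority index). All numbers are made up for
# illustration and do not come from the paper's AUTOSAR case studies.
import numpy as np
from sklearn.svm import SVR

# One row per previous software release:
# [requirement count, hardware factor, seniority index]
X_train = np.array([
    [120, 1.0, 1],
    [135, 1.0, 2],
    [150, 1.2, 3],
    [160, 1.2, 4],
], dtype=float)

# Measured process ETs (in microseconds) for those releases
y_train = np.array([410.0, 455.0, 540.0, 575.0])

# Support Vector Machine Regression, one of the algorithms the
# paper compares (hyperparameters here are arbitrary)
model = SVR(kernel="rbf", C=100.0, epsilon=5.0)
model.fit(X_train, y_train)

# Predict the ET for an upcoming release's feature vector; this value
# would then be passed to the OS modeling/simulation tool
X_next = np.array([[170, 1.2, 5]], dtype=float)
et_pred = float(model.predict(X_next)[0])
print(f"Predicted ET: {et_pred:.1f} us")
```

In the workflow the abstract describes, a prediction like `et_pred` would be produced per OS process and supplied to the OS simulation tool to check deadlines and CPU load before the coding phase.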