This paper presents a methodology for queue time prediction in a semiconductor fabrication facility (FAB) consisting of hundreds of complex and expensive pieces of equipment. Queue time, which occurs between consecutive single-process or multi-process steps, is a highly important factor for wafer quality and can have a significant impact on cost. For this reason, most semiconductor FABs already use queue time limits as one of their key dispatching factors. However, despite such queue time policies, some wafers are still scrapped or reworked during processing. If queue time can be predicted, unnecessary scraps can be reduced by blocking or re-dispatching the affected wafers. Two approaches to queue time prediction are proposed and compared in terms of accuracy and prediction time. The first uses a machine learning model built from experimental results. The second uses a multi-resolution simulation model whose resolution can be divided into several levels according to fidelity. A simulation model with the SMAT2022 data set was used to validate the two methodologies.