The ventricular action potential duration (APD) is a fundamental determinant of cardiac electrical stability and can be estimated by measuring the activation recovery interval (ARI) from the unipolar electrogram (UEG), which reflects the electrical activity of the heart at the tissue level. Under experimental conditions, automatic estimation of ARIs is challenging because of unrelated interference and low signal-to-noise ratios (SNRs). In this simulation study, we investigated how noise and artefacts in the UEG affect the reliability of ARI estimates. A 257-node whole-heart model was used to synthesize 20 realistic UEGs exhibiting constant and dynamic ARI patterns, and controlled amounts of noise and contamination (ectopic beats) were added to obtain a range of signal qualities. The generated recordings were analyzed automatically with a standard ARI estimation method, whose performance was compared with two refinements of that method: a narrow search window and a correlation filter. The results show that these refinements dramatically improved the robustness of automatic ARI analysis. For typical recordings with an SNR of 10 dB, filtered with a commonly used 30 Hz cutoff frequency for measuring repolarization, the average mean absolute error of the estimates was reduced from 16.2 ms (range: 12.2–29.0 ms) for the standard method to 11.6 ms (range: 6.0–13.4 ms) for the improved method, and the standard deviation was reduced from 38.2 ms (range: 26.8–58.5 ms) to 14.6 ms (range: 7.6–16.9 ms). Detection of cyclical ARI variation also improved: for 0.2 Hz ARI oscillations with an amplitude of 5 ms, the highest average detection rate increased from 41% with the standard method to 100% with the improved method for recordings with an SNR of 10 dB.
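The abstract does not spell out the estimator itself, but a common standard approach (the Wyatt method) takes the activation time as the instant of steepest negative slope in the QRS complex and the repolarization time as the instant of steepest positive slope in the T wave, with ARI as their difference. The sketch below illustrates this, together with a crude template-correlation check in the spirit of the correlation filter mentioned above; the actual filter design used in the study is not specified here, and all function names, window bounds, and the 0.9 correlation threshold are illustrative assumptions.

```python
import numpy as np

def estimate_ari(ueg, fs, qrs_win, t_win):
    """Estimate the ARI (seconds) from one beat of a unipolar electrogram.

    Wyatt method: activation time (AT) at min dV/dt within the QRS search
    window, repolarization time (RT) at max dV/dt within the T-wave search
    window. Windows are (start, end) in seconds relative to the beat start.
    """
    dv = np.gradient(ueg) * fs                 # first derivative, V/s
    t = np.arange(len(ueg)) / fs
    qrs = (t >= qrs_win[0]) & (t < qrs_win[1])
    tw = (t >= t_win[0]) & (t < t_win[1])
    at = t[qrs][np.argmin(dv[qrs])]            # steepest downstroke
    rt = t[tw][np.argmax(dv[tw])]              # steepest T-wave upslope
    return rt - at

def beat_is_clean(beat, template, min_corr=0.9):
    """Crude ectopic/artefact rejection: keep a beat only if its
    correlation with a template beat exceeds a threshold."""
    r = np.corrcoef(beat, template)[0, 1]
    return r >= min_corr
```

In practice the narrow search window corresponds to keeping `qrs_win` and `t_win` tight around the expected deflections, which prevents noise peaks elsewhere in the beat from being mistaken for the activation or repolarization slope.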