To model interferometric data, one has to compute model visibilities taking into account the measurement process and all its limitations.
The integration time, scans.int_time, is defined as the length of time over which the data were averaged to yield a given data point. For large sources, e.g. wide binaries, long integration times can significantly lower the averaged fringe contrasts. The system variable !int_time specifies an upper limit, in seconds, on integration times that do not require integrating the model visibilities. The software will compare the actual integration times to this limit and determine the maximum number of computations required to cover the largest integration interval. The step size for each observation will then be adjusted according to that observation's actual integration time, given this number of computations.
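
As an illustration only, the following Python sketch shows one way such time smearing of model visibilities could be implemented. All names (smeared_visibility, model_vis, and the arguments) are hypothetical placeholders chosen for this example and are not part of the actual software; the integration times correspond to scans.int_time and the limit to !int_time.

    import numpy as np

    def smeared_visibility(model_vis, t_obs, int_times, int_time_limit):
        # model_vis(t)   -- hypothetical callable returning the complex model
        #                   visibility at time t (e.g. for a binary whose
        #                   projected baseline changes with hour angle)
        # t_obs          -- array of observation midpoints [s]
        # int_times      -- integration time per observation [s] (scans.int_time)
        # int_time_limit -- limit in seconds below which no averaging is
        #                   needed (!int_time)
        t_obs = np.asarray(t_obs, dtype=float)
        int_times = np.asarray(int_times, dtype=float)

        # Maximum number of computations needed to cover the largest
        # integration interval with steps no longer than the limit.
        n_steps = max(1, int(np.ceil(int_times.max() / int_time_limit)))

        vis = np.empty(t_obs.shape, dtype=complex)
        for i, (t, dt_int) in enumerate(zip(t_obs, int_times)):
            if dt_int <= int_time_limit:
                # Short integration: a single evaluation suffices.
                vis[i] = model_vis(t)
            else:
                # Step size adjusted to this observation's integration time.
                step = dt_int / n_steps
                t_sub = t - dt_int / 2 + (np.arange(n_steps) + 0.5) * step
                vis[i] = np.mean([model_vis(ts) for ts in t_sub])
        return vis

In this sketch the number of sub-steps is fixed by the largest integration interval, while the step size shrinks for observations with shorter integration times, mirroring the behaviour described above.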