Abstract: The development of wireless communication technology imposes strict requirements on antennas, including high-frequency operation, miniaturization, and strong directivity, to support the higher data transmission rates required in the future. Inherent amplitude and phase differences among the channels of millimeter-wave chips and multi-antenna arrays, together with additional differences introduced during operation by factors such as temperature drift and aging, can significantly degrade beamforming accuracy. To mitigate the impact of inter-channel amplitude and phase differences on beam control, this paper proposes a near-field calibration algorithm based on plane wave spectrum expansion. The algorithm constructs a signal model between the antenna under test (AUT) and the probe using plane wave spectrum theory. Based on this model, a virtual signal is generated for the calibration measurement and compared with the measured signal. A genetic algorithm minimizes the difference between the virtual and measured signals, and the initial excitations of the millimeter-wave array antenna are determined at the point where this difference reaches its minimum. To validate the effectiveness of the proposed calibration algorithm, a four-element millimeter-wave linear array was calibrated with a probe scanning range of approximately one-third of the array aperture. The resulting amplitude calibration error was within ±0.43 dB and the phase calibration error within ±4.6°, demonstrating high-precision calibration.
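To illustrate the optimization loop summarized above, the following Python sketch shows how a simple real-coded genetic algorithm could recover per-channel amplitude and phase excitations by minimizing the mismatch between a virtual probe signal and a measured one. All names, parameter values, and the simplified spherical-wave coupling model (used here in place of the paper's plane wave spectrum expansion) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# ---- Hypothetical setup (assumed values, not from the paper): 4-element linear array ----
C = 3e8
FREQ = 28e9                      # assumed millimeter-wave frequency
LAMBDA = C / FREQ
K = 2 * np.pi / LAMBDA

N_ELEMENTS = 4
element_x = (np.arange(N_ELEMENTS) - (N_ELEMENTS - 1) / 2) * LAMBDA / 2  # half-wavelength spacing

# Probe scan line in the near field, roughly one-third of the array aperture
aperture = element_x[-1] - element_x[0]
probe_x = np.linspace(-aperture / 6, aperture / 6, 21)
probe_z = 3 * LAMBDA             # assumed scan-plane distance

def probe_signal(excitations):
    """Virtual probe signal: superposition of each element's contribution at the
    probe positions, weighted by its complex excitation. A spherical-wave model
    stands in here for the paper's plane-wave-spectrum coupling."""
    dx = probe_x[:, None] - element_x[None, :]
    r = np.sqrt(dx**2 + probe_z**2)
    coupling = np.exp(-1j * K * r) / r
    return coupling @ excitations

# "Measured" signal produced by unknown channel errors that calibration should recover
true_amp = np.array([1.0, 0.8, 1.1, 0.9])
true_phase = np.deg2rad([0.0, 35.0, -20.0, 60.0])
measured = probe_signal(true_amp * np.exp(1j * true_phase))

def mismatch(genes):
    """Difference between virtual and measured probe signals (lower is better)."""
    amp, phase = genes[:N_ELEMENTS], genes[N_ELEMENTS:]
    virtual = probe_signal(amp * np.exp(1j * phase))
    return np.sum(np.abs(virtual - measured) ** 2)

# ---- Minimal real-coded genetic algorithm over [amplitudes, phases] ----
rng = np.random.default_rng(0)
POP, GENS, MUT = 200, 300, 0.1
lo = np.r_[np.full(N_ELEMENTS, 0.5), np.full(N_ELEMENTS, -np.pi)]
hi = np.r_[np.full(N_ELEMENTS, 1.5), np.full(N_ELEMENTS, np.pi)]
pop = rng.uniform(lo, hi, size=(POP, 2 * N_ELEMENTS))

for _ in range(GENS):
    scores = np.array([mismatch(ind) for ind in pop])
    order = np.argsort(scores)
    elite = pop[order[: POP // 4]]                                  # selection: keep best quarter
    parents = elite[rng.integers(0, len(elite), (POP, 2))]
    alpha = rng.uniform(size=(POP, 1))
    children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
    children += MUT * rng.standard_normal(children.shape)          # Gaussian mutation
    pop = np.clip(children, lo, hi)
    pop[0] = elite[0]                                               # elitism

best = pop[np.argmin([mismatch(ind) for ind in pop])]
amp_est, phase_est = best[:N_ELEMENTS], best[N_ELEMENTS:]
print("amplitude error (dB):", 20 * np.log10(amp_est / true_amp))
print("phase error (deg):  ", np.rad2deg(np.angle(np.exp(1j * (phase_est - true_phase)))))
```

The estimated excitations at the minimum of the mismatch play the role of the "initial excitations" mentioned in the abstract; in practice the measured signal would come from the near-field probe scan rather than a simulated model.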