This study investigates the impact of temporal aggregation on the performance of the Lagrange Multiplier (LM) test for exchange rate nonlinearity. Using Monte Carlo experiments that vary the level of temporal aggregation, the autoregressive parameter, and the sample size, we analyze how these factors affect the test's ability to detect nonlinearity. Our findings indicate that the power of the LM test decreases as temporal aggregation increases, particularly for exchange rate series with highly persistent autoregressive behavior. To validate these results empirically, we apply the LM test to high-frequency (one-minute) EUR/USD, GBP/USD, and USD/JPY exchange rate returns from December 4, 2024, to March 14, 2025, a sample of 10,000 observations. The empirical evidence supports the simulation findings: nonlinearity is stronger in the high-frequency data but weakens as the returns are aggregated over longer horizons. The implications for monetary policy, risk management, and financial market supervision are substantial; researchers and policymakers are therefore encouraged to test for nonlinearity cautiously, taking the data's sampling frequency into account.
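
The abstract does not specify which LM variant is used; as a rough illustration of the mechanics described above, the sketch below implements a Teräsvirta-style LM test against a smooth-transition alternative in Python, together with simple block-sum aggregation of returns. The function names (`aggregate_returns`, `lm_nonlinearity_test`), the threshold-AR data-generating process, and the choice of the first lag as the transition variable are illustrative assumptions, not the paper's actual design.

```python
import numpy as np
from scipy import stats


def aggregate_returns(r, m):
    """Aggregate high-frequency log returns by summing non-overlapping blocks of
    length m (e.g., m=5 turns one-minute returns into five-minute returns)."""
    n = len(r) // m
    return r[: n * m].reshape(n, m).sum(axis=1)


def lm_nonlinearity_test(y, p=1):
    """Teräsvirta-style LM test of an AR(p) null against a smooth-transition
    alternative. Returns the T*R^2 statistic and its asymptotic chi-squared p-value."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Lag matrix: column i holds the i-th lag for the usable sample t = p, ..., T-1.
    X = np.column_stack([y[p - i: T - i] for i in range(1, p + 1)])
    yt = y[p:]
    Z = np.column_stack([np.ones(len(yt)), X])

    # Step 1: residuals from the linear AR(p) regression (the null model).
    beta, *_ = np.linalg.lstsq(Z, yt, rcond=None)
    u = yt - Z @ beta
    ssr0 = u @ u

    # Step 2: auxiliary regression of the residuals on the AR regressors and their
    # interactions with powers of the transition variable (here, the first lag).
    s = X[:, 0]
    aux = np.column_stack([Z] + [X * (s[:, None] ** k) for k in (1, 2, 3)])
    gamma, *_ = np.linalg.lstsq(aux, u, rcond=None)
    v = u - aux @ gamma
    ssr1 = v @ v

    lm = len(yt) * (ssr0 - ssr1) / ssr0   # equals (T - p) * R^2 of the auxiliary regression
    df = 3 * p                            # number of added nonlinear regressors
    return lm, 1.0 - stats.chi2.cdf(lm, df)


# Illustration: the detected nonlinearity tends to weaken as one-minute returns
# are aggregated to coarser horizons (hypothetical simulated data, not the paper's).
rng = np.random.default_rng(0)
e = rng.standard_normal(10_000)
r = np.empty_like(e)
r[0] = e[0]
for t in range(1, len(e)):               # a simple nonlinear (threshold) AR(1) process
    r[t] = (0.6 if r[t - 1] > 0 else -0.3) * r[t - 1] + e[t]
for m in (1, 5, 15, 60):                 # one-, five-, fifteen-, sixty-minute horizons
    stat, pval = lm_nonlinearity_test(aggregate_returns(r, m), p=1)
    print(f"aggregation m={m:3d}: LM={stat:7.2f}, p-value={pval:.4f}")
```

The block-sum aggregation mirrors the paper's notion of observing the same return process at lower frequencies, so rerunning the test at each aggregation level loosely reproduces the power comparison the Monte Carlo experiments formalize.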