Remote sensing enables real-time insights into crop progress, aiding farmer decision-making on crop management factors such as fertilization and irrigation. However, artifacts in satellite image time series (SITS), including clouds, variations in sun and sensor geometry, intersensor biases, and atmospheric effects, cause fluctuations in reflectance and vegetation indices, complicating interpretation. To improve the temporal consistency of SITS, this study developed empirical models that correct these artifacts. The focus scenario was rice monitoring, covering 7266 rice fields across four years in New South Wales, Australia; the method also generalized to corn and cotton crops with promising results. Masking cloud-affected data with Cloud Score+ provided a better trade-off between image frequency and time-series smoothness than other cloud masking methods. Self-supervised LightGBM models outperformed linear models in correcting these deviations. SHAP analyses indicated the importance of key features, including solar and view angles, reflectances in bands sensitive to atmospheric effects, and the satellite sensor (differences between Sentinel-2A and -2B, especially in a red edge band). Surprisingly, after correcting the deviations, top-of-atmosphere time series had similar or better consistency than harmonized Landsat and Sentinel-2 surface reflectance time series. Our method reduced the root mean squared deviation of a red edge chlorophyll index by 39.7% and of the normalized difference vegetation index by 21.4%. These corrections enhance SITS interpretability for near-real-time crop monitoring, improving the data behind farmer decision-support tools.
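
To make the cloud-masking step concrete, below is a minimal sketch of applying Cloud Score+ to Sentinel-2 imagery via the Google Earth Engine Python API, the platform where the Cloud Score+ dataset is distributed. The clear-sky threshold (0.60), date range, and point of interest are illustrative assumptions, not values taken from the study.

```python
import ee

ee.Initialize()

# Sentinel-2 top-of-atmosphere imagery and the Cloud Score+ quality collection.
s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED')
cs_plus = ee.ImageCollection('GOOGLE/CLOUD_SCORE_PLUS/V1/S2_HARMONIZED')

QA_BAND = 'cs_cdf'       # per-pixel clear-sky score in [0, 1]
CLEAR_THRESHOLD = 0.60   # illustrative cutoff, not the paper's value

# Hypothetical point in the NSW rice-growing region, for illustration only.
roi = ee.Geometry.Point([145.9, -34.7])

def mask_clouds(img):
    """Keep only pixels whose Cloud Score+ value meets the threshold."""
    return img.updateMask(img.select(QA_BAND).gte(CLEAR_THRESHOLD))

masked = (s2.filterBounds(roi)
            .filterDate('2021-10-01', '2022-05-01')
            .linkCollection(cs_plus, [QA_BAND])
            .map(mask_clouds))
```

Raising the threshold yields smoother time series at the cost of fewer usable observations; the trade-off the abstract describes is governed by this single cutoff.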
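The abstract does not spell out how the self-supervised LightGBM correction is trained, so the following is a sketch under stated assumptions: the target is the deviation of each observation from a smoothed per-field temporal reference, and the features (solar/view zenith angles, two atmospherically sensitive band reflectances, a Sentinel-2A/-2B flag) mirror those highlighted by the SHAP results. All data here are synthetic; feature names and the target definition are assumptions, not the paper's specification.

```python
import numpy as np
import lightgbm as lgb
import shap

rng = np.random.default_rng(0)
n = 5000

# Illustrative (assumed) feature matrix.
X = np.column_stack([
    rng.uniform(20, 60, n),     # solar zenith angle (deg)
    rng.uniform(0, 12, n),      # view zenith angle (deg)
    rng.uniform(0.05, 0.4, n),  # reflectance in an aerosol-sensitive band
    rng.uniform(0.0, 0.2, n),   # reflectance in a water-vapour-sensitive band
    rng.integers(0, 2, n),      # sensor flag: Sentinel-2A (0) vs -2B (1)
])
feature_names = ['sza', 'vza', 'b_aerosol', 'b_wv', 'sensor_2b']

# Self-supervised target: deviation of the observed index from a smoothed
# temporal reference (synthetic here; this framing of "self-supervised"
# is our assumption).
y = 0.02 * X[:, 4] + 0.001 * (X[:, 0] - 40) + rng.normal(0, 0.01, n)

model = lgb.LGBMRegressor(n_estimators=300, learning_rate=0.05)
model.fit(X, y, feature_name=feature_names)

# Correct an observation by subtracting the predicted artifact deviation.
raw_index = 0.75  # example raw vegetation-index value
corrected = raw_index - model.predict(X[:1])[0]

# SHAP attributes each feature's contribution to the predicted deviation.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
print(dict(zip(feature_names, np.abs(shap_values).mean(axis=0))))
```

The mean absolute SHAP values printed at the end are the usual global importance summary; an analysis of this kind would surface the angle, band, and sensor effects the abstract reports.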
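For the reported 39.7% and 21.4% reductions, the indices and the consistency metric can be sketched as follows. The NDVI formula is standard; the red edge chlorophyll index is shown in its common NIR/red-edge ratio form, and the root mean squared deviation is computed against a moving-average reference, both of which are assumptions about the paper's exact definitions.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def ci_red_edge(nir, red_edge):
    """Red edge chlorophyll index, common form: NIR / red-edge - 1 (assumed)."""
    return nir / red_edge - 1.0

def rmsd_vs_smooth(series, window=5):
    """RMSD of a time series against a moving-average reference.

    The paper's exact reference curve and window are assumptions here.
    """
    kernel = np.ones(window) / window
    smooth = np.convolve(series, kernel, mode='same')
    return float(np.sqrt(np.mean((series - smooth) ** 2)))

# Toy index series: a seasonal curve plus artifact-like noise.
t = np.linspace(0, np.pi, 40)
raw = 0.6 * np.sin(t) + 0.2 + np.random.default_rng(1).normal(0, 0.05, 40)
print(rmsd_vs_smooth(raw))
```

Under this metric, a lower RMSD means the series tracks its seasonal trajectory more closely, which is the sense in which the corrections improve temporal consistency.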