Abstract
Background: Neoadjuvant chemotherapy (NAC) is the standard of care for locally advanced breast cancer. However, the disconnect between efficacy in randomized trials and effectiveness in real-world practice, attributable to real-world treatment delays and adherence barriers, remains underexplored for early-stage (cT1-cT3) operable disease.

Methods: We applied the Target Trial Emulation (TTE) framework to a propensity-score matched cohort from the SEER database. To mitigate immortal time bias and staging migration, we reconstructed clinical baselines. Individualized Treatment Effects (ITE) were estimated using a Double-Robust Causal Forest algorithm. To cross-validate these estimates against model misspecification, we employed a DeepCox neural network as a non-linear sensitivity analysis, exposing complex risk structures (e.g., U-shaped hazards) that traditional linear assumptions might overlook.

Results: In the matched cohort (N=26,946), standard NAC was associated with an operational survival deficit (absolute risk difference, 3.6%) compared with upfront surgery, corresponding to a hazard ratio of 1.32 (95% CI, 1.24-1.40; P<0.001). Causal Forest analysis revealed a critical "Response-Survival Discordance": although young TNBC patients exhibited high nodal pathologic complete response (npCR) rates, they paradoxically faced the worst survival outcomes (standard Cox HR 1.87). This survival detriment persisted in a 6-month landmark analysis performed to account for immortal time bias (landmark HR 1.39; 95% CI, 1.06-1.81; P=0.016). Crucially, node-positive (cN+) patients, traditionally considered ideal candidates for systemic downstaging, experienced a significant survival detriment with NAC (HR 1.39). This disadvantage was most pronounced in the Luminal A subtype and invasive lobular carcinoma (ILC), where NAC failed to provide effective source control. In contrast, HER2-positive status exhibited a trend toward survival benefit. Anatomically, cT2 tumors defined a "window of minimal operational deficit," whereas operational risk paradoxically resurged in cT3 tumors.

Conclusion: Our causal analysis reveals a critical disconnect between biological risk and therapeutic efficacy. Although SHAP modeling identified node-positive (cN+) status as a high-priority indicator for systemic therapy, the low real-world response rate (npCR 15.0%) rendered historical standard NAC regimens insufficient to counterbalance the risks of surgical delay (HR 1.39). Without therapeutic escalation (e.g., immunotherapy) to ensure high pathologic response rates, the operational risks of deferring surgery may outweigh the benefits of downstaging in this subgroup. These findings highlight a critical "Implementation Gap" in which standard NAC regimens yield suboptimal real-world outcomes for high-risk subgroups.
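The propensity-score matched cohort described in the Methods can be illustrated with a minimal sketch. The paper does not specify its matching algorithm; the greedy 1:1 nearest-neighbour strategy, the 0.02 caliper, and all patient identifiers below are illustrative assumptions, not the authors' SEER pipeline.

```python
def greedy_caliper_match(treated, control, caliper=0.02):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.

    treated / control: dicts mapping patient id -> estimated propensity
    score (probability of receiving NAC). Returns (treated_id, control_id)
    pairs whose score difference falls within the caliper; each control
    is used at most once. Caliper and strategy are illustrative choices.
    """
    pairs = []
    available = dict(control)  # controls not yet matched
    # process treated patients in propensity-score order for determinism
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        # nearest remaining control by absolute score distance
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # matching without replacement
    return pairs


# toy example with hypothetical patients
treated = {"T1": 0.30, "T2": 0.55}
control = {"C1": 0.31, "C2": 0.54, "C3": 0.90}
print(greedy_caliper_match(treated, control))
# -> [('T1', 'C1'), ('T2', 'C2')]; C3 is outside every caliper
```

Matching without replacement keeps the matched arms the same size, which is consistent with the even cohort (N=26,946) reported in the Results.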
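The 6-month landmark analysis used to probe immortal time bias can likewise be sketched. The idea is to drop patients whose follow-up ends before the landmark and restart the clock there, so survival time accrued before the landmark cannot mechanically favor the arm that must survive long enough to complete therapy. The record layout below is a simplifying assumption, not the study's data model.

```python
def landmark_filter(records, landmark=6.0):
    """Apply a landmark analysis to right-censored survival records.

    records: iterable of (follow_up_months, event_indicator, arm) tuples,
    where event_indicator is 1 for death and 0 for censoring.
    Patients whose follow-up ends at or before the landmark are excluded;
    for the rest, follow-up is re-zeroed at the landmark time.
    """
    return [(t - landmark, event, arm)
            for t, event, arm in records
            if t > landmark]


# toy cohort (hypothetical values)
cohort = [
    (3.0, 1, "NAC"),      # died before the 6-month landmark: excluded
    (9.0, 0, "surgery"),  # censored at 9 mo: kept, clock reset to 3.0
    (24.0, 1, "NAC"),     # event at 24 mo: kept, clock reset to 18.0
]
print(landmark_filter(cohort))
# -> [(3.0, 0, 'surgery'), (18.0, 1, 'NAC')]
```

A Cox model fitted to the filtered records then compares arms only among patients alive at 6 months, which is why the landmark HR of 1.39 is the more conservative of the abstract's estimates.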
Cite This Paper
Guan, S., Jian, Y., Dong, W., & Dong, L. (2025). Operational Survival Deficit of Neoadjuvant Chemotherapy in Early-Stage Breast Cancer: A Target Trial Emulation and Causal Machine Learning Study. arXiv preprint arXiv:10.64898/2025.12.22.25342768.