Breast cancer remains a leading cause of mortality among women worldwide, making early and accurate detection vital for effective treatment. This study proposes a deep learning model, ThermoFusionNet, which integrates visual and infrared thermal imaging to detect breast abnormalities in a cost-effective and non-invasive manner. Adaptive Bilateral Kernel Filtering (ABKF) reduces noise in the input images while preserving edges. Segmentation uses Distance Regularised Level Set Evolution (DRLSE) for precise delineation of breast tissue irregularities, and Anisotropic Gaussian Smoothing Gradient-Based Optimisation (AGSGO) improves feature sensitivity and segmentation convergence. Classification is performed on the fused visual and thermal image data. Experimental results show a clear separation between classes along a shared feature axis: benign cases remain at consistently low values, whereas malignant cases rise sharply, peaking before declining, and exhibit distinct thermal and visual patterns that support reliable detection. The model achieves higher accuracy and sensitivity than traditional methods. These findings support ThermoFusionNet as an effective diagnostic tool for early breast cancer detection. Future work targets real-time diagnostics and mobile health integration to broaden accessibility in low-resource settings.
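The abstract names adaptive bilateral filtering for denoising and DRLSE for segmentation but gives no implementation detail. The sketch below illustrates the front end of such a pipeline under stated assumptions: OpenCV's standard bilateral filter stands in for the proposed ABKF step, and the Gaussian-smoothed edge-indicator function g = 1 / (1 + |∇(G_σ * I)|²) used by conventional DRLSE is computed as the segmentation prior. The function names, parameter values, file name, and the use of cv2/scipy are illustrative assumptions, not the authors' implementation of ABKF or AGSGO.

```python
# Illustrative sketch only: standard bilateral filtering and the DRLSE
# edge-indicator function, standing in for the paper's ABKF / AGSGO steps.
import cv2
import numpy as np
from scipy.ndimage import gaussian_filter


def denoise_bilateral(gray: np.ndarray, d: int = 9,
                      sigma_color: float = 75.0,
                      sigma_space: float = 75.0) -> np.ndarray:
    """Edge-preserving denoising (plain bilateral filter, not the proposed ABKF)."""
    return cv2.bilateralFilter(gray, d, sigma_color, sigma_space)


def edge_indicator(gray: np.ndarray, sigma: float = 1.5) -> np.ndarray:
    """DRLSE edge indicator g = 1 / (1 + |grad(G_sigma * I)|^2).

    Values near 0 mark strong edges, which slows level-set evolution there.
    """
    smoothed = gaussian_filter(gray.astype(np.float64), sigma)
    gy, gx = np.gradient(smoothed)
    return 1.0 / (1.0 + gx ** 2 + gy ** 2)


if __name__ == "__main__":
    # Hypothetical file name; any single-channel thermal or visual image works.
    img = cv2.imread("thermal_breast.png", cv2.IMREAD_GRAYSCALE)
    if img is not None:
        clean = denoise_bilateral(img)
        g = edge_indicator(clean)
        print("edge indicator range:", g.min(), g.max())
```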