Urban fire response is a core concern in contemporary smart-city development because emergency management depends not only on detection accuracy but also on rapid, reliable routing through complex urban environments. This manuscript presents an integrated fire-surveillance framework that combines visual analytics, environmental sensing, and state-space navigation to support intelligent emergency response. The proposed system uses a convolutional neural network (CNN) to analyse fire imagery and a multilayer perceptron (MLP) to process heat- and smoke-sensor signals; once an alert is raised, an intelligent agent navigates the urban search space using the A* algorithm. The framework is positioned as a practical smart-city safety architecture: it links distributed sensing, machine learning, and graph-based route optimisation in a single operational pipeline.
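The sensor branch of the pipeline can be sketched in miniature. The layer sizes, weights, and feature values below are illustrative assumptions rather than the paper's trained model; the sketch only shows the shape of the computation (normalised sensor readings in, a thresholdable fire score out).

```python
import math
import random

def mlp_score(features, w_hidden, b_hidden, w_out, b_out):
    """Tiny one-hidden-layer MLP (ReLU hidden units, sigmoid output)
    standing in for the sensor branch: maps [heat, smoke] readings
    to a fire score in (0, 1)."""
    hidden = [max(0.0, sum(f * w for f, w in zip(features, ws)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    z = sum(h * w for h, w in zip(hidden, w_out)) + b_out
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative random weights (a real system would load trained parameters).
random.seed(0)
w_hidden = [[random.gauss(0, 1) for _ in range(2)] for _ in range(4)]
b_hidden = [0.0] * 4
w_out = [random.gauss(0, 1) for _ in range(4)]
b_out = 0.0

# Hypothetical normalised heat and smoke readings.
score = mlp_score([0.9, 0.7], w_hidden, b_hidden, w_out, b_out)
# A score above some threshold (e.g. 0.5) would raise the alarm and
# hand control to the A* routing stage.
```

In the full system this score would be fused with the CNN's image-based prediction before triggering navigation; the fusion rule itself is not detailed here.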
The empirical design follows the source implementation: the image pipeline is trained on a balanced 1,900-image fire/no-fire collection, supported by a 31-video fire-surveillance set and smoke-sensor data, while the navigation stage operates on a custom graph of 297 sensor nodes and 2,345 links. Results reported in the source paper show stable learning behaviour for both the CNN and MLP branches, strong classification performance on the held-out image test set, and a consistent operational advantage of A* over heuristic-only best-first routing in weighted state-space navigation. For an urban-development and smart-cities audience, the study demonstrates how intelligent surveillance can strengthen public safety, improve response coordination, and support resilient urban infrastructure.
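The routing advantage can be illustrated on a toy weighted graph (the node names, edge weights, and heuristic values below are invented for illustration; the source graph has 297 nodes and 2,345 links). Greedy best-first orders its frontier by the heuristic alone and so commits to a node that merely looks close to the goal, while A* adds the accumulated path cost and recovers the cheaper route.

```python
import heapq

# Toy road network: adjacency list of (neighbour, edge cost).
GRAPH = {
    'S': [('A', 4), ('B', 1)],
    'A': [('G', 10)],
    'B': [('G', 3)],
    'G': [],
}

# Admissible heuristic: never overestimates the true remaining cost to G.
H = {'S': 4, 'A': 1, 'B': 3, 'G': 0}

def a_star(graph, h, start, goal):
    """A* search: frontier ordered by f = g (cost so far) + h (estimate)."""
    frontier = [(h[start], 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        for nxt, w in graph[node]:
            ng = g + w
            if ng < best_g.get(nxt, float('inf')):
                best_g[nxt] = ng
                heapq.heappush(frontier, (ng + h[nxt], ng, nxt, path + [nxt]))
    return float('inf'), []

def greedy_best_first(graph, h, start, goal):
    """Heuristic-only best-first: frontier ordered by h alone."""
    frontier = [(h[start], 0, start, [start])]
    visited = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, w in graph[node]:
            if nxt not in visited:
                heapq.heappush(frontier, (h[nxt], g + w, nxt, path + [nxt]))
    return float('inf'), []

cost_astar, path_astar = a_star(GRAPH, H, 'S', 'G')            # → 4, S-B-G
cost_greedy, path_greedy = greedy_best_first(GRAPH, H, 'S', 'G')  # → 14, S-A-G
```

Here node A sits "closer" to the goal by the heuristic (h = 1), so greedy best-first routes through it and pays the expensive A-G edge; A* expands B first because f(B) = 1 + 3 = 4 beats f(A) = 4 + 1 = 5, mirroring the cost advantage the source paper reports on its weighted sensor-node graph.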