Multi-Class Classification of Urban Regeneration Using a Siamese Network: An Analysis with Real-World Data from Portland, Oregon
Mar 2, 2025
Yang Yang

Abstract
Urban regeneration is a critical phenomenon as cities worldwide adapt to increasing populations and evolving economic, social, and policy landscapes. Traditional methods—relying on ground surveys, permit records, and medium-resolution remote sensing data—often fail to capture the subtle, incremental changes characteristic of urban regeneration. This study proposes a novel multi-class classification framework using Siamese networks to detect and categorize nuanced urban regeneration processes from high-resolution aerial imagery. We constructed an urban change dataset comprising 2,000 residential parcels from Portland, Oregon—leveraging verified permit records as baseline ground truth—and implemented several Siamese network architectures with popular convolutional neural network backbones including ResNet, U-Net, and YOLO, as well as a U-Net variant enhanced with a local similarity attention module. Our experiments, conducted using five-fold cross-validation, reveal that the U-Net architecture achieves the highest and most stable overall accuracy (exceeding 85%) in distinguishing between four classes: No Change, New Development, Redevelopment, and Demolition. Although challenges remain—particularly in differentiating between unchanged and redeveloped parcels—the proposed method demonstrates significant potential for supplementing or even replacing traditional urban change detection techniques. This approach offers a scalable, cost-effective tool for urban planners and researchers, facilitating rapid and reliable assessments of urban regeneration dynamics even in data-limited environments. Future research will explore the viability of transfer learning to adapt the model for use in diverse urban settings, further broadening its practical applicability in the field of urban planning.
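To make the core idea concrete, the sketch below illustrates the Siamese comparison scheme the abstract describes: a shared encoder maps the "before" and "after" parcel images to feature vectors, and a classifier operates on their element-wise difference to predict one of the four change classes. This is a minimal toy in NumPy with random placeholder weights, not the trained U-Net model from the paper; the function names, dimensions, and layer shapes are all illustrative assumptions.

```python
import numpy as np

# Four change classes used in the study.
CLASSES = ["No Change", "New Development", "Redevelopment", "Demolition"]

rng = np.random.default_rng(0)
W_enc = rng.normal(size=(64 * 64, 32))       # toy shared-encoder weights (hypothetical)
W_cls = rng.normal(size=(32, len(CLASSES)))  # toy classifier weights (hypothetical)

def encode(img: np.ndarray) -> np.ndarray:
    """Shared encoder: both images pass through the SAME weights (Siamese)."""
    return np.maximum(img.reshape(-1) @ W_enc, 0.0)  # flatten, project, ReLU

def classify_change(before: np.ndarray, after: np.ndarray) -> str:
    """Compare the two encodings and return the most likely change class."""
    diff = np.abs(encode(before) - encode(after))  # Siamese comparison branch
    logits = diff @ W_cls                          # 4-way classification head
    return CLASSES[int(np.argmax(logits))]

before = rng.random((64, 64))  # stand-in for a "before" aerial image patch
after = rng.random((64, 64))   # stand-in for an "after" aerial image patch
print(classify_change(before, after))
```

Because the classifier sees only the feature difference, an unchanged parcel yields a near-zero comparison vector, which is the intuition behind the No Change class; in the actual study this encoder would be a convolutional backbone such as U-Net, optionally with the local similarity attention module.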
This work was presented at the AAAI 2024 AI for Urban Workshop.
Click the Slides button above to see the highlights of this paper.