Most current algorithms for reconstructing three-dimensional urban scenes derive the underlying geometry from aerial imagery and then texture the resulting three-dimensional model with that same imagery. Small errors, however, often occur both in constructing the three-dimensional meshes and in applying the textures. Because humans readily recognize patterns and symmetry, they can quickly spot errors that break the regularity of building facades. By applying lattice detection algorithms and other principles of computational symmetry to reconstructed scenes, such errors can be identified and corrected automatically. This thesis explores an error-correction step based on texture regularity for the reconstruction of urban scenes.