Abstract: Neural operators have become an effective framework for learning mappings between function spaces, yet most existing architectures realize operators within a single representational domain, such as physical, spectral, or latent space. In this work, we introduce UFO (Domain-Unification-Free Operator), a cross-domain neural operator framework that realizes operators through adaptive, jointly conditioned interactions among representations defined on distinct domains. UFO enables discretization decoupling: the input function can be observed at resolutions or locations different from those used during training, while the solution can be queried at arbitrary output resolutions. Across four complementary benchmarks covering discontinuous inputs, irregular sampling with spectral mismatch, nonlinear dynamics, and stochastic high-frequency fields, UFO delivers accurate, robust, and physically coherent predictions under distribution shifts. These results establish cross-domain, phase-modulated realization as a powerful framework for discretization-decoupled neural operator learning.
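The abstract does not specify UFO's architecture, but its central property, discretization decoupling, can be illustrated with a much simpler construction: a kernel-integral operator that accepts input samples at arbitrary locations and can be queried at arbitrary output points. The sketch below is only a minimal stand-in for that property; the function names, the Gaussian kernel, and the uniform quadrature weights are all assumptions, not part of UFO.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    # Gaussian kernel between coordinate sets a (p, d) and b (q, d).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls ** 2))

def kernel_operator(x_in, u_in, x_out, ls=0.2):
    # Evaluate a kernel-integral operator (Ku)(y) ~ (1/n) * sum_i K(y, x_i) u(x_i)
    # at arbitrary query points x_out, given samples u_in of the input function
    # at arbitrary locations x_in. Input and output discretizations are independent.
    K = rbf(x_out, x_in, ls)
    return K @ u_in / len(x_in)

# Input observed at 50 random locations, solution queried on a dense grid of 200.
x_in = np.random.rand(50, 1)
u_in = np.sin(2 * np.pi * x_in[:, 0])
x_out = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
out = kernel_operator(x_in, u_in, x_out)
```

Because the query coordinates appear only inside the kernel evaluation, nothing ties the output resolution to the input sampling, which is the behaviour the abstract calls discretization decoupling.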




Abstract: In this paper, we propose Discriminative Principal Component Analysis (Discriminative PCA), which enhances the separability of PCA by means of Linear Discriminant Analysis (LDA). The proposed method performs feature extraction by determining a linear projection that captures the most scattered discriminative information. The main innovation of Discriminative PCA is that it performs PCA on a discriminative matrix rather than on the original sample matrix. To compute the required discriminative matrix at low complexity, we apply LDA to a converted matrix and thereby obtain the within-class and between-class scatter matrices. During this computation, we use direct linear discriminant analysis (DLDA) to solve the small sample size (SSS) problem that arises. To evaluate Discriminative PCA for face recognition, we compare it with DLDA and PCA on four well-known facial databases: PIE, FERET, YALE, and ORL. Accuracy and running time obtained with a nearest-neighbour classifier are compared for different numbers of training images per person. Discriminative PCA not only shows superior recognition rates but also achieves comparable running times.
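The abstract does not define the discriminative matrix precisely, so the sketch below should be read as one plausible reading, not the paper's method: compute the standard LDA within-class and between-class scatter matrices, project the samples onto the generalized eigenvectors that maximize between-class relative to within-class scatter, and then run ordinary PCA on that projected ("discriminative") matrix. The Tikhonov regularization of the within-class scatter is a stand-in for the paper's DLDA treatment of the SSS problem; all names here are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def scatter_matrices(X, y):
    # X: (n_samples, n_features), y: integer class labels.
    mean = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))
    Sb = np.zeros_like(Sw)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)          # within-class scatter
        d = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (d @ d.T)              # between-class scatter
    return Sw, Sb

def discriminative_pca(X, y, n_components):
    Sw, Sb = scatter_matrices(X, y)
    # Regularize Sw so the generalized eigenproblem is well-posed even when
    # n_samples < n_features (the paper addresses this via DLDA instead).
    evals, W = eigh(Sb, Sw + 1e-6 * np.eye(X.shape[1]))
    W = W[:, ::-1]                             # most discriminative directions first
    D = X @ W                                  # assumed "discriminative matrix"
    # Ordinary PCA on D: center, then project onto top right singular vectors.
    Dc = D - D.mean(axis=0)
    U, s, Vt = np.linalg.svd(Dc, full_matrices=False)
    return Dc @ Vt[:n_components].T

Z = discriminative_pca(np.random.randn(30, 10), np.repeat([0, 1, 2], 10), 3)
```

The resulting features Z would then feed a nearest-neighbour classifier, matching the evaluation protocol the abstract describes.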




Abstract: Two problems must be addressed for Non-negative Matrix Factorization (NMF): choosing a suitable rank for the factorization and providing a good initialization for NMF algorithms. This paper aims to solve both problems using the Singular Value Decomposition (SVD). First, we take the number of main components as the rank; this method is inspired by [1, 2]. Second, we use the singular values and their vectors to initialize the NMF algorithm. In 2008, Boutsidis and Gallopoulos [3] proposed a method called NNDSVD to improve the initialization of NMF algorithms. They extracted the positive section and the respective singular triplet information of the unit matrices $\{C^{(j)}\}_{j=1}^{k}$ obtained from the singular vector pairs. This strategy uses the positive section to cope with negative elements of the singular vectors, but in our experiments we found that simply replacing negative elements by their absolute values yields better results than NNDSVD. Hence, we propose another SVD-based initialization method for NMF algorithms (SVD-NMF). Numerical experiments on the two face databases ORL and YALE [16, 17] show that our method outperforms NNDSVD.
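The initialization the abstract describes, taking absolute values of the leading singular vectors rather than NNDSVD's positive sections, can be sketched directly. The rank-selection criterion below (a cumulative singular-value energy threshold) and the sqrt singular-value scaling are assumptions standing in for the details the abstract attributes to [1, 2]; only the absolute-value step is stated in the text.

```python
import numpy as np

def svd_nmf_init(A, energy=0.9):
    # A: non-negative data matrix of shape (m, n).
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Rank = number of "main components"; here chosen as the smallest k whose
    # singular values carry a given fraction of the total energy (assumed criterion).
    k = int(np.searchsorted(np.cumsum(s ** 2) / np.sum(s ** 2), energy)) + 1
    # Replace negative entries of the singular vectors by their absolute values,
    # splitting each singular value sqrt-evenly between the two factors.
    W0 = np.abs(U[:, :k]) * np.sqrt(s[:k])
    H0 = np.sqrt(s[:k])[:, None] * np.abs(Vt[:k])
    return W0, H0, k

A = np.random.rand(20, 15)
W0, H0, k = svd_nmf_init(A)
```

W0 and H0 are entrywise non-negative by construction, so they are valid starting factors for any standard NMF update scheme.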