We propose a novel approach that connects machine learning to causal structure learning via the Jacobian matrix of a neural network with respect to its input variables. In this paper, we extend the Jacobian-based approach to physical systems, the setting in which humans explore and reason about the world and the highest level of causality. By fitting the dynamics with a Neural ODE, we can read the causal structure out of the fitted functions. The method also enforces an important acyclicity constraint on the continuous adjacency matrix of the graph nodes and significantly reduces the computational complexity of searching the space of graphs.
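As a minimal sketch of the Jacobian read-out idea (not the paper's implementation): given a fitted function `f`, estimate the Jacobian of each output with respect to each input, here by finite differences, and average the absolute entries over samples to score candidate causal edges. The network weights and batch here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
# stand-in weights for a "trained" two-layer network
W1, W2 = rng.normal(size=(16, d)), rng.normal(size=(d, 16))

def f(x):
    # stand-in for a fitted neural network R^d -> R^d
    return W2 @ np.tanh(W1 @ x)

def jacobian(f, x, eps=1e-6):
    # J[j, i] = df_j / dx_i, estimated by central differences
    J = np.zeros((d, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        J[:, i] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

# Average |J| over a batch; a large A[j, i] suggests x_i influences x_j,
# so thresholding A yields a candidate directed graph over the variables.
X = rng.normal(size=(64, d))
A = np.mean([np.abs(jacobian(f, x)) for x in X], axis=0)
print(A.shape)  # (3, 3)
```

In an autodiff framework the finite-difference step would be replaced by the exact Jacobian of the trained network.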
In this paper, we propose a score-based normalizing-flow method called DAG-NF to learn the dependencies among input observation variables. Inspired by Grad-CAM in computer vision, we treat the Jacobian matrix of the output with respect to the input as a measure of causal relationships. This approach generalizes to any neural network, and in particular to flow-based generative models such as Masked Autoregressive Flow (MAF) and Continuous Normalizing Flow (CNF), which are trained with a log-likelihood loss measuring the divergence between the distribution of the input data and the target distribution. The method extends NOTEARS, which enforces an important acyclicity constraint on the continuous adjacency matrix of the graph nodes and significantly reduces the computational complexity of searching the space of graphs.
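The NOTEARS acyclicity constraint can be illustrated with a short sketch: the penalty h(W) = tr(exp(W ∘ W)) − d is zero exactly when the weighted adjacency matrix W describes a DAG, and positive otherwise, which turns the combinatorial DAG search into a continuous constraint. The matrix exponential below is computed with a truncated power series for self-containedness; this is an illustration, not the paper's code.

```python
import numpy as np

def expm_series(M, terms=30):
    # exp(M) ~= sum_{k=0}^{terms} M^k / k!  (adequate for small matrices)
    E = np.eye(M.shape[0])
    P = np.eye(M.shape[0])
    for k in range(1, terms + 1):
        P = P @ M / k
        E = E + P
    return E

def notears_h(W):
    # h(W) = tr(exp(W ∘ W)) - d, zero iff W is the adjacency of a DAG
    d = W.shape[0]
    return np.trace(expm_series(W * W)) - d

# A strictly upper-triangular adjacency (a DAG) gives h = 0;
# adding a 2-cycle makes h strictly positive.
dag = np.array([[0.0, 1.0], [0.0, 0.0]])
cyc = np.array([[0.0, 1.0], [1.0, 0.0]])
print(notears_h(dag))  # 0.0
print(notears_h(cyc))  # > 0
```

During training, h(W) is added to the score (e.g. the flow's negative log-likelihood) as a differentiable penalty, typically via an augmented Lagrangian, driving the learned adjacency toward a DAG.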