The exponential growth of scientific literature requires effective management and extraction of valuable insights. While existing scientific search engines excel at delivering search results based on relational databases, they often neglect the analysis of collaborations between scientific entities, the evolution of ideas, and the in-depth analysis of content within scientific publications. Representing heterogeneous graphs, and effectively measuring, analyzing, and mining them, pose significant challenges. To address these challenges, we present AceMap, an academic system designed for knowledge discovery through academic graphs. We present advanced database construction techniques to build the comprehensive AceMap database from large-scale academic publications containing rich visual, textual, and numerical information. AceMap also employs innovative visualization, quantification, and analysis methods to explore associations and logical relationships among academic entities. AceMap introduces large-scale academic network visualization techniques centered on nebular graphs, providing a comprehensive view of academic networks from multiple perspectives. In addition, AceMap proposes a unified metric based on structural entropy to quantitatively measure the knowledge content of different academic entities. Moreover, AceMap provides advanced analysis capabilities, including tracing the evolution of academic ideas through citation relationships and concept co-occurrence, and generating concise summaries informed by this evolutionary process. Finally, AceMap uses machine reading methods to generate potential new ideas at the intersection of different fields. Exploring the integration of large language models and knowledge graphs is a promising direction for future research on idea evolution. Please visit \url{https://www.acemap.info} for further exploration.
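The structural-entropy idea mentioned above can be illustrated in its simplest, one-dimensional form, $H = -\sum_i \frac{d_i}{2m}\log_2\frac{d_i}{2m}$, where $d_i$ is the degree of node $i$ and $m$ the number of edges. The sketch below is a generic illustration of that formula, not AceMap's implementation:

```python
import math
from collections import Counter

def structural_entropy_1d(edges):
    """One-dimensional structural entropy of an undirected graph:
    H = -sum_i (d_i / 2m) * log2(d_i / 2m)."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    two_m = sum(deg.values())  # sum of degrees equals 2m
    return -sum((d / two_m) * math.log2(d / two_m) for d in deg.values())

# Toy example: a triangle plus one pendant node
edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]
print(round(structural_entropy_1d(edges), 3))
```

Higher-dimensional structural entropy additionally minimizes over hierarchical partitions of the graph; the one-dimensional case above is the base quantity.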
Compared with CINE phase contrast MRI (CINE-PC), echo-planar imaging phase contrast (EPI-PC) can achieve real-time quantification of blood flow, albeit with a lower SNR. In this study, a realistic pulsatile model simulating the cerebral vasculature was used to verify the accuracy of EPI-PC. The imaging time of EPI-PC was 62 ms per image at a 100$\times$60 spatial resolution. The reconstructed EPI-PC flow curve was extracted with in-house post-processing software. Comparison with the CINE-PC flow curve showed that EPI-PC can provide the average flow with less than 3% error and yields a flow curve similar in shape to that of CINE-PC.
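The validation above amounts to comparing two flow curves on mean flow and on shape. A minimal sketch of such a comparison follows; the curves and numbers below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

# Hypothetical flow curves (mL/s) over one cardiac cycle:
# cine_pc is the reference, epi_pc the real-time measurement.
t = np.linspace(0, 1, 50)
cine_pc = 5 + 2 * np.sin(2 * np.pi * t)
epi_pc = cine_pc * 1.02 + np.random.default_rng(0).normal(0, 0.05, t.size)

# Mean-flow error (percent) and shape similarity (Pearson correlation)
mean_err = 100 * abs(epi_pc.mean() - cine_pc.mean()) / cine_pc.mean()
shape_corr = np.corrcoef(cine_pc, epi_pc)[0, 1]
print(f"mean flow error: {mean_err:.1f}%  shape correlation: {shape_corr:.3f}")
```

Pearson correlation is one simple proxy for "similar in shape"; the study's actual shape comparison may differ.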
How breathing interacts with cerebrospinal fluid (CSF) is still debated. A new phase contrast MRI sequence based on echo-planar imaging (EPI-PC) can now produce a velocity map continuously over several minutes, roughly every 100 ms. We did not find in the literature any quantitative evaluation of the change in CSF stroke volume during breathing. The aim of this work is to quantify changes in CSF dynamics in the aqueduct and in the spinal canal over the breathing and cardiac cycles using EPI-PC.
Cerebral arterial blood flow (CABF) can be investigated in a few seconds, without any synchronization, by real-time phase contrast MRI. Significant changes in CABF were found between expiration and inspiration during normal breathing in healthy volunteers. Real-time phase contrast MRI was applied to investigate CABF during normal breathing in healthy volunteers. We developed a novel time-domain analysis method to quantify the effect of normal breathing on several CABF parameters. We found a delay between the respiratory signal recorded by the belt sensor and the breathing-frequency component present in the reconstructed arterial blood flows. During expiration, the mean flow rate of CABF increased by 4.4$\pm$1.7%, the stroke volume of CABF increased by 9.8$\pm$3.1%, and the duration of the cardiac period of CABF increased by 8.1$\pm$3%.
Flow 2.0 is an end-to-end, easy-to-use software package that allows us to quickly, robustly, and accurately perform batch processing of real-time phase contrast data and multivariate analysis of the effect of respiration on cerebral fluid circulation. Real-time phase contrast sequences (RT-PC) have potential value as a scientific and clinical tool for quantifying the effects of respiration on cerebral circulation. To simplify their complicated post-processing, we developed the Flow 2.0 software, which provides a complete post-processing workflow including DICOM data conversion, image segmentation, image processing, data extraction, background field correction, anti-aliasing filtering, signal processing and analysis, and a novel time-domain method for quantifying the effect of respiration on the cerebral circulation. This end-to-end software allows us to quickly, robustly, and accurately batch-process RT-PC data and perform multivariate analysis of the effects of respiration on cerebral circulation.
Loss functions are an essential part of modern data-driven approaches, such as bi-level training schemes and machine learning. In this paper we propose a loss function built from $r$-order (an)isotropic total variation semi-norms $TV^r$, $r\in \mathbb{R}^+$, defined via the Riemann-Liouville (R-L) fractional derivative. We focus on key theoretical properties of such loss functions, such as lower semi-continuity and compactness with respect to both the function and the order of derivative $r$.
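For reference, a standard statement of the left Riemann-Liouville fractional derivative of order $r$ on an interval $(a,b)$, from which such semi-norms are typically built; the notation here is illustrative, and the paper's precise definitions may differ:

```latex
% Left Riemann-Liouville fractional derivative of order r, with n = \lceil r \rceil:
\[
  (D^{r}_{a+} f)(x) \;=\; \frac{1}{\Gamma(n - r)}\,
  \frac{d^{n}}{dx^{n}} \int_{a}^{x} \frac{f(t)}{(x - t)^{\,r - n + 1}}\, dt ,
  \qquad n = \lceil r \rceil .
\]
% An r-order total variation semi-norm can then be taken, roughly, as
\[
  TV^{r}(f) \;=\; \int_{a}^{b} \bigl| (D^{r}_{a+} f)(x) \bigr| \, dx .
\]
```

For integer $r$ this reduces to the classical derivative, and $TV^1$ recovers the usual total variation of a smooth function.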
Divide-and-conquer is a general strategy for dealing with large-scale problems. It is typically applied to generate ensemble instances, which potentially limits the problem size it can handle. Additionally, the data are often divided by random sampling, which may be suboptimal. To address these concerns, we propose the $DC^2$ algorithm. Instead of ensemble instances, we produce structure-preserving signature pieces to be assembled and conquered. $DC^2$ achieves the efficiency of sampling-based large-scale kernel methods while enabling parallel multicore or clustered computation. The data partition and subsequent compression are unified by recursive random projections. Empirically, dividing the data by random projections induces smaller mean squared approximation errors than conventional random sampling. The power of $DC^2$ is demonstrated by our clustering algorithm $rpfCluster^+$, which is as accurate as some of the fastest approximate spectral clustering algorithms while maintaining a running time close to that of K-means clustering. Analysis of $DC^2$ applied to spectral clustering shows that the loss in clustering accuracy due to data division and reduction is upper bounded by the data approximation error, which vanishes with recursive random projections. Owing to its easy implementation and flexibility, we expect $DC^2$ to be applicable to general large-scale learning problems.
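Partitioning data by recursive random projections can be sketched in the style of a random-projection tree: project onto a random direction, split at the median, and recurse. This is an illustrative reconstruction, not the authors' exact $DC^2$ procedure:

```python
import numpy as np

def rp_partition(X, max_leaf=100, seed=None):
    """Recursively split the rows of X by projecting onto a random
    direction and cutting at the median projection (a random-projection-
    tree-style partition; a sketch, not the exact DC^2 procedure)."""
    rng = np.random.default_rng(seed)

    def split(ids):
        if len(ids) <= max_leaf:
            return [ids]
        w = rng.standard_normal(X.shape[1])   # random projection direction
        proj = X[ids] @ w
        med = np.median(proj)
        left, right = ids[proj <= med], ids[proj > med]
        if len(left) == 0 or len(right) == 0:  # degenerate projection: stop
            return [ids]
        return split(left) + split(right)

    return split(np.arange(len(X)))

X = np.random.default_rng(0).standard_normal((1000, 5))
pieces = rp_partition(X, max_leaf=200, seed=1)
print(len(pieces), sum(len(p) for p in pieces))
```

Median cuts keep the pieces balanced, so each piece can then be compressed and conquered independently, e.g. on separate cores.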