Abstract: Row-Column Arrays (RCAs) offer an attractive alternative to fully wired 2D arrays for 3D ultrasound, due to their greatly simplified wiring. However, conventional RCAs face challenges related to their long elements. These include an inability to image beyond the shadow of the aperture and an inability to focus in both transmit and receive for desired scan planes. To address these limitations, we recently developed bias-switchable RCAs, also known as Top Orthogonal to Bottom Electrode (TOBE) arrays. These arrays provide novel opportunities to read out from every element of the array and achieve high-quality images. While TOBE arrays and their associated imaging schemes have shown promise, they have not yet been directly compared experimentally to conventional RCA imaging techniques. This study aims to provide such a comparison, demonstrating superior B-scan and volumetric images from two electrostrictive relaxor TOBE arrays, using a method called Fast Orthogonal Row-Column Electronic Scanning (FORCES), compared to conventional RCA imaging schemes, including Tilted Plane Wave (TPW) compounding and Virtual Line Source (VLS) imaging. The study quantifies resolution and generalized contrast-to-noise ratio (gCNR) in phantoms, and also demonstrates volumetric acquisitions in phantom and animal models.
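As context for the quantitative comparison mentioned above, the gCNR metric is commonly defined as one minus the overlap between the amplitude distributions of a target region and the background. The snippet below is a minimal sketch of that computation in NumPy; the function name `gcnr`, the bin count, and the ROI variables are illustrative placeholders, not code from the study.

```python
import numpy as np

def gcnr(inside, outside, n_bins=256):
    """Generalized contrast-to-noise ratio between two pixel populations.

    gCNR = 1 - OVL, where OVL is the overlap of the empirical probability
    densities of the samples taken inside a target (e.g. a cyst) and in the
    surrounding background speckle.
    """
    lo = min(inside.min(), outside.min())
    hi = max(inside.max(), outside.max())
    bins = np.linspace(lo, hi, n_bins + 1)

    # Histograms normalized to sum to 1 approximate the two densities.
    p_in, _ = np.histogram(inside, bins=bins)
    p_out, _ = np.histogram(outside, bins=bins)
    p_in = p_in / p_in.sum()
    p_out = p_out / p_out.sum()

    ovl = np.minimum(p_in, p_out).sum()
    return 1.0 - ovl

# Hypothetical usage on an envelope-detected image with boolean ROI masks:
# score = gcnr(envelope_img[cyst_mask], envelope_img[background_mask])
```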
Abstract: Active Learning (AL) and Few-Shot Learning (FSL) are two label-efficient methods that have achieved excellent results recently. However, most prior work in both learning paradigms fails to exploit the wealth of available unlabelled data. In this study, we address this issue in the scenario where the annotation budget is very limited, yet a large amount of unlabelled data for the target task is available. We frame this work in the context of histopathology, where labelling is prohibitively expensive. To this end, we introduce an active few-shot learning framework, Myriad Active Learning (MAL), comprising a contrastive-learning encoder, pseudo-label generation, and novel query sample selection in the loop. Specifically, we propose to mine the unlabelled data in a self-supervised manner, where the obtained data representations and clustering knowledge form the basis for activating the AL loop. With feedback from the oracle in each AL cycle, the pseudo-labels of the unlabelled data are refined by optimizing a shallow task-specific net on top of the encoder. These updated pseudo-labels serve to inform and improve the active learning query selection process. Furthermore, we introduce a novel recipe to combine existing uncertainty measures and utilize the entire uncertainty list to reduce sample redundancy in AL. Extensive experiments on two public histopathology datasets show that MAL achieves superior test accuracy, macro F1-score, and label efficiency compared to prior works, and can achieve test accuracy comparable to a fully supervised algorithm while labelling only 5% of the dataset.
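To make the described loop concrete, the sketch below illustrates the general pattern the abstract outlines: frozen self-supervised features, cluster-based seeding, a shallow task-specific classifier, and uncertainty-driven queries to an oracle. It is a minimal sketch under stated assumptions, not the authors' implementation; `features`, `oracle_label`, and the use of plain entropy (instead of the paper's combined uncertainty recipe and redundancy reduction) are simplifying placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def active_fsl_loop(features, oracle_label, n_classes, rounds=5, query_per_round=8):
    """Illustrative active few-shot loop over frozen contrastive-encoder features.

    features:     (N, D) array of embeddings from a pretrained encoder (assumed given).
    oracle_label: callable index -> class label, standing in for the human annotator.
    """
    labelled, labels = [], []

    # Clustering the self-supervised features seeds the loop: one sample
    # nearest each cluster centre is sent to the oracle first.
    km = KMeans(n_clusters=n_classes, n_init=10).fit(features)
    for c in range(n_classes):
        idx = int(np.argmin(np.linalg.norm(features - km.cluster_centers_[c], axis=1)))
        labelled.append(idx)
        labels.append(oracle_label(idx))

    clf = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        # Shallow task-specific head on top of the frozen encoder, retrained
        # each cycle on the labelled pool; its predictions act as refined
        # pseudo-labels for the remaining unlabelled data.
        clf.fit(features[labelled], labels)
        proba = clf.predict_proba(features)

        # Entropy as a single stand-in uncertainty measure; the most uncertain
        # unlabelled samples are queried next.
        entropy = -(proba * np.log(proba + 1e-12)).sum(axis=1)
        entropy[labelled] = -np.inf
        for idx in np.argsort(entropy)[-query_per_round:]:
            labelled.append(int(idx))
            labels.append(oracle_label(int(idx)))

    return clf
```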