Abstract: This paper introduces a novel physical annotation system designed to generate training data for automated optical inspection. The system uses pointer-based in-situ interaction to transfer the expertise of trained inspection personnel directly into a machine learning (ML) training pipeline. Unlike conventional screen-based annotation methods, our system captures physical trajectories and contours directly on the object, providing a more intuitive and efficient way to label data. The core technology uses calibrated, tracked pointers to record user input accurately and transforms these spatial interactions into standardised annotation formats compatible with open-source annotation software. Additionally, a simple projector-based interface projects visual guidance onto the object to assist users during annotation, improving accuracy and consistency. The proposed concept bridges the gap between human expertise and automated data generation, enabling non-IT experts to contribute to the ML training pipeline and preventing the loss of valuable training samples. Preliminary evaluation results confirm the feasibility of capturing detailed annotation trajectories and demonstrate that integration with CVAT streamlines the workflow for subsequent ML tasks. This paper details the system architecture, calibration procedures and interface design, and discusses its potential contribution to future ML data generation for automated optical inspection.
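To make the export path concrete, the following is a minimal Python sketch of how a tracked 3D pointer trajectory could be projected into image coordinates and written out as a CVAT-style polygon annotation. The camera parameters (K, R, t), the label, the image name, and the trajectory values are illustrative placeholders, not values from the system described above; only the general shape of CVAT's image-annotation XML (an image element containing polygon elements with a semicolon-separated points attribute) is taken as given, and metadata fields a real import would need are omitted.

import numpy as np
import xml.etree.ElementTree as ET

# Hypothetical calibration results; real values would come from the
# pointer/camera calibration procedure described in the paper.
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])        # camera intrinsics (assumed)
R = np.eye(3)                          # rotation: tracking frame -> camera frame
t = np.array([0.0, 0.0, 0.5])          # translation in metres (assumed)

def project(points_3d):
    """Project tracked 3D pointer samples into pixel coordinates."""
    cam = (R @ points_3d.T).T + t      # transform into the camera frame
    uvw = (K @ cam.T).T                # apply the pinhole model
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide

def to_cvat_polygon(pixels, label="defect", image_name="part_0001.png"):
    """Wrap a projected contour as a CVAT-style image-annotation XML tree."""
    root = ET.Element("annotations")
    image = ET.SubElement(root, "image", id="0", name=image_name)
    pts = ";".join(f"{u:.2f},{v:.2f}" for u, v in pixels)
    ET.SubElement(image, "polygon", label=label, points=pts, occluded="0")
    return ET.ElementTree(root)

# Example: a contour traced on the object with the tracked pointer (metres).
trajectory = np.array([[0.10, 0.05, 1.0],
                       [0.12, 0.05, 1.0],
                       [0.12, 0.08, 1.0],
                       [0.10, 0.08, 1.0]])
to_cvat_polygon(project(trajectory)).write("annotations.xml")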
Abstract: This paper presents the design and evaluation of a physical support structure for the OptiTrack X22 tracking system, constructed from carbon fiber-reinforced polymer (CFRP) and Invar steel. These materials were chosen for their low thermal expansion, providing the geometric stability and rigidity necessary for accurate spatial measurements. The support system is scalable and adaptable to various applications and setups. The study further investigates the effects of camera placement and separation in near-parallel configurations on measurement accuracy and precision. Experimental results show a significant correlation between camera distance and measurement precision: closer camera setups yield higher precision. The optimized camera arrangement allowed the prototype to achieve accuracies of +/-0.74 mm along the camera's line of sight and +/-0.12 mm in orthogonal directions. The experiments also show that the standard deviation of the noise within a single measurement plane orthogonal to the camera's line of sight varies between 0.02 mm and 0.07 mm, indicating that the measurement noise is not constant across every point of that plane in the measurement space. Details of the system's design and validation are provided to enhance reproducibility and encourage further development in areas such as industrial automation and medical device tracking. By delivering a modular solution with validated accuracy, this work aims to promote innovation and practical application in precision tracking technology, facilitating broader adoption and iterative improvements. This approach enhances the accessibility and versatility of high-precision tracking technology, supporting future progress in the field.
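As an illustration of how the reported per-axis precision figures can be derived, the following Python sketch separates the noise of repeated static-marker measurements into a line-of-sight component and an orthogonal component. The function name, the assumed viewing direction, and the synthetic sample data are hypothetical stand-ins; with real tracking samples in millimetres, the two printed standard deviations would correspond to the depth and in-plane precision quoted above.

import numpy as np

def axis_precision(samples, view_dir):
    """Split measurement noise into line-of-sight and orthogonal components.

    samples:  repeated 3D positions (mm) of a stationary marker, shape (N, 3).
    view_dir: unit vector along the camera pair's line of sight (assumed
              known from the extrinsic calibration).
    """
    view_dir = view_dir / np.linalg.norm(view_dir)
    centred = samples - samples.mean(axis=0)
    along = centred @ view_dir                    # depth component per sample
    ortho = centred - np.outer(along, view_dir)   # residual in the image plane
    return along.std(), np.linalg.norm(ortho, axis=1).std()

# Synthetic noise roughly matching the reported magnitudes (+/-0.74 mm along
# the line of sight, +/-0.12 mm orthogonal); real data would come from the
# tracking system, not from a random generator.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, [0.12, 0.12, 0.74], size=(1000, 3))
sigma_depth, sigma_ortho = axis_precision(samples, np.array([0.0, 0.0, 1.0]))
print(f"line of sight: {sigma_depth:.2f} mm, orthogonal: {sigma_ortho:.2f} mm")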