Selecting learning machines such as classifiers is an important task when using AI in the clinic. K-fold cross-validation is a practical technique that allows simple inference about such machines. However, the standard recipe generates many models and provides no means of determining the best one. In this paper, a modified recipe is presented that generates more consistent machines with similar average performance, lower extra-sample loss variance, and less feature bias. A use case is provided by applying the recipe to the atrial flutter localization problem.
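As an illustration of the starting point of the abstract, the sketch below implements plain K-fold cross-validation in self-contained Python (the paper's modified recipe is not reproduced; the fitting and loss functions are toy placeholders). It makes the stated problem concrete: the procedure yields K fitted models and an average extra-sample loss, but no rule for choosing a single best model among the K.

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) for each of K contiguous folds."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    indices = list(range(n_samples))
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

def cross_validate(fit, loss, X, y, k=5):
    """Fit one model per fold; return the K models and their held-out losses."""
    models, losses = [], []
    for train, test in k_fold_splits(len(X), k):
        model = fit([X[i] for i in train], [y[i] for i in train])
        losses.append(loss(model, [X[i] for i in test], [y[i] for i in test]))
        models.append(model)
    return models, losses

# Toy example: a "mean predictor" with squared-error loss on 1-D data.
fit = lambda X, y: sum(y) / len(y)
loss = lambda m, X, y: sum((yi - m) ** 2 for yi in y) / len(y)
X = list(range(10))
y = [2.0 * xi for xi in X]

models, losses = cross_validate(fit, loss, X, y, k=5)
print(len(models))               # K models are produced, one per fold
print(sum(losses) / len(losses)) # their average extra-sample loss
```

Note that `models` contains K distinct machines with different training folds; averaging `losses` estimates performance but does not select among them, which is the gap the modified recipe targets.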
The use of motion capture has increased over the last decade across a varied spectrum of applications, such as film special effects, game and robot control, rehabilitation systems, and animation. Current human motion capture techniques rely on markers, structured environments, and high-resolution cameras in a dedicated setting. Because of rapid movement, elbow angle estimation is regarded as one of the most difficult problems in human motion capture. In this paper, we take elbow angle estimation as our research subject and propose a novel, markerless, and cost-effective solution that uses an RGB camera to estimate the elbow angle in real time using part affinity fields. We recruited five (5) participants to perform a cup-to-mouth movement while the angle was measured simultaneously with both the RGB camera and a Microsoft Kinect. The experimental results show that the markerless, cost-effective RGB camera achieves median RMS errors of 3.06° and 0.95° in the sagittal and coronal planes, respectively, compared to the Microsoft Kinect.
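The geometry behind the abstract can be sketched as follows: given 2-D shoulder, elbow, and wrist keypoints, such as those produced by a part-affinity-field pose estimator (e.g., OpenPose), the elbow angle is the angle between the upper-arm and forearm vectors, and the comparison against a reference device reduces to an RMS error over the two angle time series. This is an assumed reconstruction for illustration; the paper's exact angle convention and per-plane projection are not specified here.

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    """Elbow angle in degrees between the upper arm and the forearm,
    computed from 2-D (x, y) keypoints via the dot product."""
    ux, uy = shoulder[0] - elbow[0], shoulder[1] - elbow[1]  # elbow -> shoulder
    fx, fy = wrist[0] - elbow[0], wrist[1] - elbow[1]        # elbow -> wrist
    dot = ux * fx + uy * fy
    norm = math.hypot(ux, uy) * math.hypot(fx, fy)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def rms_error(estimates, references):
    """RMS error (degrees) between two angle time series of equal length."""
    n = len(estimates)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(estimates, references)) / n)

# A fully extended arm gives ~180 deg; a right angle at the elbow gives 90 deg.
print(elbow_angle((0, 2), (0, 1), (0, 0)))  # 180.0
print(elbow_angle((0, 1), (0, 0), (1, 0)))  # 90.0
```

In a setup like the paper's, `estimates` would hold the per-frame RGB-camera angles and `references` the simultaneously recorded Kinect angles, with `rms_error` computed once per plane and per participant before taking the median.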