Abstract: Ambient field suppression is critical for accurate magnetic field measurements, and a requirement for certain low-field sensors to operate. The difference in magnitude between noise and signal (up to $10^9$) makes the problem challenging, and solutions such as passive shielding, post-hoc processing, and most active shielding designs do not address it completely. Zero field active shielding (ZFS) achieves accurate field suppression with a feed-forward structure in which correction coils are fed by reference sensors via a matrix found using data-driven methods. The requirements are a sufficient number of correction coils and reference sensors to span the ambient field at the sensors, and to zero out the coil-to-reference sensor coupling. The solution assumes instantaneous propagation and mixing, but it can be extended to handle convolutional effects. Precise calculations based on sensor and coil geometries are not necessary, other than to improve efficiency and usability. The solution is simulated here but not implemented in hardware.
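The feed-forward idea in this abstract can be illustrated with a small numerical sketch. The simulation below is my own illustrative construction, not the paper's code: sensor counts, couplings, and the two-step estimation (regress target readings on references, then invert the coil-to-target coupling) are assumptions consistent with the instantaneous-mixing model the abstract describes.

```python
# Minimal sketch of data-driven feed-forward field cancellation (ZFS-style).
# All matrices and dimensions here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_ref, n_coils, n_tgt, n_samples = 4, 3, 2, 1000

# Ambient field modelled as 3 spatial modes, mixed instantaneously
# into reference sensors and target sensors.
A_ref = rng.standard_normal((n_ref, 3))        # modes -> reference sensors
A_tgt = rng.standard_normal((n_tgt, 3))        # modes -> target sensors
C_tgt = rng.standard_normal((n_tgt, n_coils))  # coil -> target coupling

ambient = rng.standard_normal((3, n_samples))
ref = A_ref @ ambient   # reference sensor readings (span the ambient field)
tgt = A_tgt @ ambient   # uncorrected field at the target sensors

# Data-driven calibration: choose a matrix M so that coil currents M @ ref
# cancel the ambient field at the targets, i.e. C_tgt @ M @ ref ~ -tgt.
B, *_ = np.linalg.lstsq(ref.T, tgt.T, rcond=None)  # tgt ~ B.T @ ref
M = -np.linalg.pinv(C_tgt) @ B.T                   # feed-forward matrix

residual = tgt + C_tgt @ (M @ ref)
print(np.abs(residual).max())   # near machine precision in this toy model
```

With enough reference sensors to span the ambient modes and enough coils to span the target field, the residual vanishes up to numerical precision; with fewer, the least-squares fit degrades gracefully.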
Abstract: Data are rapidly growing in size and importance for society, a trend motivated by their enabling power. The accumulation of new data, sustained by progress in technology, leads to a boundless expansion of stored data, in some cases with an exponential increase in the accrual rate itself. Massive data are hard to process, transmit, store, and exploit, and it is particularly hard to keep abreast of the data store as a whole. This paper distinguishes three phases in the life of data: acquisition, curation, and exploitation. Each involves a distinct process that may be separated from the others in time, with a different set of priorities. The function of the second phase, curation, is to maximize the future value of the data given limited storage. I argue that this requires that (a) the data take the form of summary statistics and (b) these statistics follow an endless process of rescaling. The summary may be more compact than the original data, but its data structure is more complex and it requires an ongoing computational process that is much more sophisticated than mere storage. Rescaling results in dimensionality reduction that may be beneficial for learning, but that must be carefully controlled to preserve relevance. Rescaling may be tuned based on feedback from usage, with the proviso that our memory of the past serves the future, the needs of which are not fully known.
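One concrete way to read "summary statistics under an endless process of rescaling" is a fixed-budget store whose oldest entries are repeatedly merged into coarser summaries. The sketch below is my own illustration under that assumption (the merge-oldest policy and the count/mean/variance summary are not from the paper); the merge step uses the standard exact pairwise update for mean and squared deviations.

```python
# Illustrative sketch: curation as endless rescaling of summary statistics
# under a fixed storage budget. Policy and data structure are assumptions.
from dataclasses import dataclass

@dataclass
class Bucket:
    n: int        # number of raw samples summarized
    mean: float   # their mean
    m2: float     # sum of squared deviations (for variance)

def merge(a: Bucket, b: Bucket) -> Bucket:
    """Combine two summaries exactly (pairwise mean/variance update)."""
    n = a.n + b.n
    delta = b.mean - a.mean
    mean = a.mean + delta * b.n / n
    m2 = a.m2 + b.m2 + delta * delta * a.n * b.n / n
    return Bucket(n, mean, m2)

def curate(stream, budget=8):
    """Keep at most `budget` buckets: recent data stay fine-grained,
    older data are merged into progressively coarser summaries."""
    store = []
    for x in stream:
        store.append(Bucket(1, float(x), 0.0))
        if len(store) > budget:
            store[0:2] = [merge(store[0], store[1])]  # rescale the past
    return store

store = curate(range(100), budget=8)
total = store[0]
for b in store[1:]:
    total = merge(total, b)
print(len(store), total.n, total.mean)  # 8 buckets summarizing all 100 samples
```

The store never exceeds its budget, yet global statistics remain exactly recoverable; what is lost, by design, is fine temporal detail about the distant past, which is the trade-off the abstract argues must be controlled to preserve relevance.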