The libraries also read the ADC output data and convert it into voltage values.
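As a rough sketch of that conversion, the ADS1262/3 reports each sample as a 32-bit two's-complement code that is scaled by the reference voltage and the PGA gain to recover the input voltage. The helper below is hypothetical and not part of the piadcs API; it assumes the ADC's internal 2.5 V reference unless another value is supplied.

```python
# Hypothetical helper illustrating the code-to-voltage conversion;
# not part of the piadcs API.

def code_to_voltage(code: int, vref: float = 2.5, gain: float = 1.0) -> float:
    """Convert a raw 32-bit two's-complement ADC code to volts.

    Assumes the ADS1262/3 internal 2.5 V reference unless vref is overridden,
    and that the PGA gain used for the measurement is passed in as gain.
    """
    # Interpret the raw 32-bit value as a signed integer.
    if code & 0x80000000:
        code -= 1 << 32
    # Full-scale range is +/- vref/gain, spread over 2^31 positive codes.
    return code * vref / (gain * (1 << 31))


# Example: a mid-scale positive code with the PGA set to 32 V/V.
print(code_to_voltage(0x40000000, gain=32))  # approximately 0.0390625 V
```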

For example, Adafruit sells ADCs with resolutions ranging from 8 to 16 bits and includes open-source software libraries and hardware designs for interfacing them with a Raspberry Pi or other single-board computer. Some sensing applications, however, such as thermocouple psychrometry and load cell measurement, involve detecting small changes within a large measurement range. These applications may require better than 16-bit resolution, necessitating a higher-resolution ADC along with low-noise, low-drift electronic components. The DAQ system described here provides high resolution at a significantly lower cost than commercial laboratory DAQ systems with similar specifications. It uses the ADS1262 ADC, which offers 32-bit resolution, very low noise and drift, and many built-in features. The ADS1262 and ADS1263 are identical except that the ADS1263 includes one additional, independently controlled 24-bit ADC. The ADS1262 was used in the system described here, but the ADS1263 can also be used and will perform identically. These features allow the ADC to be used with many different types of sensors; however, the manufacturer does not provide a software library for easily interfacing it with a Linux computer such as the Raspberry Pi. To address this, we describe here the open-source software libraries we developed to provide this interface, called piadcs, and the electronic system design required to use this ADC with a Raspberry Pi to make ultra-high-resolution measurements.

The DAQ system consists of a relatively simple hardware design based around a Raspberry Pi and the ADS1262/3, together with the piadcs software libraries, which enable users to easily configure the ADS1262/3 and to collect, convert, and store the output data.

The ADS1262/3 is a good choice for a custom DAQ system because of its extremely high resolution and the many features that make it flexible. Functions to configure these features are found in both piadcs libraries, including functions for reading data either continuously or on command. Communication between the Raspberry Pi and the ADS1262/3 uses a combination of SPI and GPIO. The Go version of the library uses the Periph library to control the SPI and GPIO interfaces, and the Python version uses Spidev and RPi.GPIO. Other ADCs on the market use programming highly similar to that of the ADS1262/3, and the piadcs libraries are extendable to such ADCs. The libraries also contain documentation and examples that provide a template for programming other ADCs.

There are two functionally equivalent versions of the piadcs library: one written in Python and the other in Go. The two versions offer different advantages and disadvantages due to differences between the two languages. Python is one of the most widely used programming languages, but as an interpreted language it runs more slowly and lacks Go's built-in support for concurrency. Go is less commonly used but still among the top 20 most-used programming languages. Go is a compiled language, which gives it performance advantages over Python, and it is simpler to read than many other compiled languages. Both versions of the piadcs library are available on GitHub. Both are installed as packages, and detailed instructions for installation and usage can be found in the README file on GitHub. In brief, the Python library is installed from the command line using the “pip3” command and the Go library using the “go get” command. The “Examples” folders in the libraries contain code examples showing how to use the different functions to change ADC settings and collect data from the ADC.
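To illustrate the pattern those functions implement under the hood, the following sketch reads a single conversion on command directly with Spidev and RPi.GPIO rather than through the piadcs API. The opcode and register values reflect our reading of the ADS1262 datasheet and the GPIO pin number is only an example; both should be verified against the datasheet and your own wiring.

```python
# Minimal read-on-command sketch against spidev and RPi.GPIO.
# Opcode/register values are assumptions taken from the ADS1262 datasheet.

import spidev
import RPi.GPIO as GPIO

DRDY_PIN = 22          # data-ready output of the ADC (example pin only)

CMD_START1 = 0x08      # start ADC1 conversions (datasheet opcode, verify)
CMD_RDATA1 = 0x12      # read latest ADC1 result (datasheet opcode, verify)

GPIO.setmode(GPIO.BCM)
GPIO.setup(DRDY_PIN, GPIO.IN)

spi = spidev.SpiDev()
spi.open(0, 0)          # SPI bus 0, chip-select 0
spi.mode = 0b01         # the ADS1262/3 communicates in SPI mode 1
spi.max_speed_hz = 1_000_000

# Example register write: WREG (0x40 | address) starting at INPMUX (0x06),
# one register, selecting AIN0 as the positive and AIN1 as the negative input.
spi.xfer2([0x40 | 0x06, 0x00, 0x01])

spi.xfer2([CMD_START1])             # begin conversions

while GPIO.input(DRDY_PIN):         # DRDY goes low when a result is ready
    pass

# With the default interface settings, RDATA1 returns a status byte, four data
# bytes, and a checksum byte; raw[0] is clocked out during the command itself.
raw = spi.xfer2([CMD_RDATA1] + [0x00] * 6)
code = int.from_bytes(bytes(raw[2:6]), "big", signed=True)
print(code)

spi.close()
GPIO.cleanup()
```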

These libraries were designed to run on a Raspberry Pi model 4B or 3B+ running Raspberry Pi OS. It may also be possible to run them on other models provided they are running an up-to-date version of the same operating system, but we have not tested this. There are many helpful guides available on how to get started with a Raspberry Pi and the Raspberry Pi OS; for an official source, the documentation section of the Raspberry Pi website provides a detailed guide.

The DAQ system can be built using the materials listed in Table 1 for about US$100. The wiring diagram shown in Figure 1 illustrates how the components are connected. The ADS1262/3 can be connected to the Raspberry Pi using one SPI bus and three GPIO pins. In our setup, they are connected via the Raspberry Pi's SPI_0 and GPIO pins 4, 22, and 27, but any of the available SPI interfaces and GPIO pins could be used. These connections must be specified in the code. Several breakout boards are available for connecting the ADS1262/3 to solderless breadboards for prototyping. A breakout board from ProtoCentral was used in the development of these libraries, but an alternative from Olimex would also be suitable. A Raspberry Pi–specific “HAT” for the ADS1263 is available from Waveshare. It is compatible with the piadcs libraries, but it is not suitable for low-noise measurements because the Raspberry Pi and the ADC share the same power supply and therefore cannot be electrically isolated; moreover, this board places heat-generating components near the ADC's temperature sensor. High-resolution measurements require low system noise. In our DAQ system, low noise is achieved by electrically isolating the ADS1262/3 from the Raspberry Pi, which requires separate power supplies. The Raspberry Pi is powered by a standard USB-C wall adapter, which is usually sold with the computer, whereas the ADS1262/3 is powered by 9-V batteries connected to linear voltage regulators. Batteries are preferable to wall supplies because they do not generate any 60 Hz AC noise. The ADS1262/3 draws less than 6.5 mA, so a battery lasts for several weeks. The digital communication channels between the ADS1262/3 and the Raspberry Pi are also isolated using the ADuM4151 7-channel SPIsolator.
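As noted above, the SPI bus and GPIO pin assignments from the wiring diagram must be declared in the code. The snippet below is one way to do so for the wiring we describe; which of GPIO 4, 22, and 27 serves which ADC control line depends on your wiring, so the roles assigned here are only assumptions.

```python
# Sketch of declaring the SPI bus and GPIO pins described in Figure 1.
# Pin roles (DRDY/START/PWDN) are assumptions; match them to your wiring.

import spidev
import RPi.GPIO as GPIO

SPI_BUS, SPI_DEVICE = 0, 0   # SPI_0 with chip-select 0
DRDY_PIN = 22                # data-ready from the ADC (assumed role)
START_PIN = 27               # conversion start/stop control (assumed role)
PWDN_PIN = 4                 # power-down/reset control (assumed role)

GPIO.setmode(GPIO.BCM)
GPIO.setup(DRDY_PIN, GPIO.IN)
GPIO.setup(START_PIN, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(PWDN_PIN, GPIO.OUT, initial=GPIO.HIGH)  # hold the ADC out of reset

spi = spidev.SpiDev()
spi.open(SPI_BUS, SPI_DEVICE)
spi.mode = 0b01              # ADS1262/3 uses SPI mode 1
spi.max_speed_hz = 1_000_000
```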

All the components for this custom DAQ system can be wired onto a solderless breadboard or made into a printed circuit board. A solderless breadboard setup was used for the test measurement shown in Figure 2.

It is possible to make very low-noise measurements with this DAQ system as long as the aforementioned electrical considerations are addressed. System performance was tested with and without sensors connected. We first measured baseline system noise with no sensors connected and found that, at slower data rates, the system noise is remarkably low and the system has better than 1 ppm precision. As the data rate increases, the noise increases and precision decreases somewhat but is still very good. The ADS1262/3 has 16 different data rates available, ranging from 2.5 to 38,400 samples/second, but our DAQ system only performs well at data rates up to 14,400 samples/second. This is due to a breakdown in serial communication that occurs at higher speeds and could potentially be solved in future releases. Although the ADC used in this system has a nominal resolution of 32 bits, the actual system precision is lower, especially as the data rate increases. Noise remaining in the system, along with the inherent tradeoff between ADC speed and resolution, causes the effective resolution of an ADC to be lower than its nominal resolution. The noise floor, and therefore the effective resolution, of our system is very close to that specified in the ADS1262 datasheet for the data rates and digital filters tested, and is a major improvement over other existing open-source DAQ systems for the Raspberry Pi. System performance with a connected sensor was tested by measuring the output of a K-type thermocouple submerged in an ice bath using our DAQ system. This setup was able to measure the ice bath temperature with a noise level of less than ±0.01°C; this was achieved using only the analog front end provided on the ADS1262, with the PGA set to its maximum setting of 32 V/V. An external ultra-low-noise amplifier set to a higher gain could be used instead of the onboard PGA to further decrease noise for applications requiring very low-level measurements. Our DAQ system has significantly better noise performance than other Raspberry Pi thermocouple DAQ systems. For example, the MCC 134 Thermocouple DAQ HAT for Raspberry Pi has greater than 0.5°C measurement error with the same type of thermocouple, likely due in part to large thermal gradients caused by placing the DAQ board on top of the heat-generating components of the Raspberry Pi.

Two assumptions have guided the study of concept learning ever since Hull. The first is that category learning amounts to learning a common label for sets of objects. This assumption is explicit in the ubiquitous supervised classification task, in which people receive feedback when classifying visually presented stimuli. This paradigm has been used to determine, for example, whether prototype models are superior to exemplar models. Over the years, researchers have taught people to group objects into sets and have examined the resulting representations. A second assumption has been that information about a category learned in one context should not transfer well to another. Consider the goal of distinguishing roses from raspberry bushes. If the most diagnostic feature is the presence of berries, then people will learn that the berry feature should receive the most attention weight.
However, when one later has to distinguish raspberry from cranberry bushes, thorns suddenly become diagnostic, because while both have red berries, only the raspberry bush has thorns.

The problem is that optimizing attention for one category contrast is not always optimal for another. The consequence of ignoring dimensions that are irrelevant for one set of category contrasts is that the learner has to re-attend to those dimensions when familiar categories are contrasted in novel ways. That is, the learner has to relearn about raspberries. In this manner, the heralded powers of selective attention assumed by present theories are predicted to harm performance when previously irrelevant dimensions become relevant. The mechanisms of attention allocation in many computational models of category learning suggest that people learn to attend to only the information needed to distinguish the two categories being acquired. The problem we raise is that after learning one classification in which, say, cue A is most diagnostic, people should have trouble learning a second classification in which B is the good cue, because prior classifications have taught people to ignore it. We ask two questions in this study. First, how rigid are learners' representations across different learning tasks? Second, can attention provide an explanatory variable for differences in what is learned between tasks? We speculate that flexible category representations are necessary for everyday classification, since particular category contrasts are not always known ahead of time by the categorizer. Previous research points to inference as a likely candidate for producing flexible representations. To the extent that inference but not classification produces flexible category representations, it may reflect a more ecologically valid task for studying the kinds of concepts that people use every day.

Other tasks, where the goal is not to classify but to learn about the properties of categories, may yield a flexible representation that can handle novel contrasts. Research that has expanded the array of concept acquisition tasks led us to consider a task that may produce flexible conceptual representations. Whereas classification involves predicting the category label from features, feature inference learning involves predicting a missing feature from the other features and the category label. So rather than determining that a plant is a raspberry bush, the inference task asks learners to determine whether a raspberry bush has thorns, or some other property. Comparisons of the feature inference task with supervised classification are of current interest, with evidence that inference produces different representations. It has been found that inference produces increased sensitivity to within-category correlations of features, increased sensitivity to nondiagnostic, prototypical features, more prototypical-feature inferences, and faster learning of linearly separable categories. Thus, despite inference and classification tasks being formally identical, it is possible that the resulting flexibility of category representations can also differ. The above-cited evidence suggests that whereas classification learning may foster attention to the diagnostic dimensions that serve to distinguish between categories, inference learning may focus categorizers on within-category information. Our hypothesis is that because the within-category information acquired by inference learners is not tied to any particular set of contrast categories, such knowledge yields a more general and flexible representation. As a consequence, with respect to novel contrasts, inference learners may be at an advantage over classification learners.