It also provides standard OS features such as data logging and directory structures.

All of the architectures they review are modular, with the low-level control functions deployed on a microcontroller and higher-level tasks performed by other systems, either ROS or a custom-designed middleware. In two cases the real-time control uses a HAL; the others do not. Within the Autonomous Systems Laboratory there is an important related work, the Santa Cruz Low-Cost UAV GNC Subsystem, an early autopilot intended for UAVs. It was programmed using the embedded code generation capability of MATLAB and Simulink. This project, though successful, was never widely adopted. One reason is that MATLAB and Simulink are expensive commercial software packages with a significant learning curve. Another is that the project was not open source, limiting access for vehicle developers and potential contributors. These limitations motivate this work, which is open source in both hardware and firmware.

This dissertation is organized as follows. Chapter 2 is a review of the requirements for the real-time controller. Chapter 3 details the hardware development process, architecture, electrical design, and the final controller. Chapter 4 details the software design, architecture, and development process. In Chapter 5 we propose a benchmark algorithm to quantify the performance of real-time controllers. Chapter 6 shows the results of running the benchmark on four different processors: three real-time systems using microcontrollers and one SBC running Debian Linux. Chapter 7 details the development of an AGV, from the chassis to autonomous navigation, used as the main test platform for the controller. Chapter 8 covers three new use cases of different autonomous systems that use this architecture or are in the process of deploying it. Chapter 9 describes several areas where this work can be extended. Finally, Chapter 10 provides the conclusions.

There are two supplementary appendices. Appendix A is a derivation of the complementary filter, the basic algorithm behind the benchmark. Appendix B details the application of the autoregression with external inputs method for system identification, using a simple motor model as an example.

The primary motivation for this work is to provide a means for an autonomous vehicle developer to rapidly prototype the real-time guidance, navigation, and control system of a new vehicle. To enable this we decided at the inception of the project to make the design and firmware open source, and named it the Open Source Autonomous Vehicle Controller, or OSAVC for short. The advantages that open source methodologies provide to a would-be vehicle developer are that the designs and algorithms are freely accessible, easily modifiable, and can be sourced from numerous vendors. From the project side, open sourcing allows contributors from all over the globe to add new functionality, to test and provide feedback, and to iterate on the designs. Indeed, during the development of this project we were able to participate in the Google Summer of Code program, which funds student interns to contribute to open source projects. Over the course of two summer sessions, the OSAVC project received over 45 applications from students all over the globe and was able to provide six paid internships. Additionally, the OSAVC prototype was used by two other graduate researchers, for an ASV and a quadcopter, an early demonstration of the capability and modularity of the project.

The OSAVC provides real-time control of the vehicle actuators and measurement of the sensors. In the context of control systems, real-time refers to a fixed output period of the controller, i.e., the period between updates of the vehicle actuators is deterministic and constant. All modern control systems are digital; that is, they do not provide continuous outputs but instead update the outputs on a fixed clock cycle. The systems under control, however, exist in the continuous real world.

If the controller clock cycle, τc, is short compared to the dynamics of the vehicle, then the digital controller approximates a continuous controller. In other words, if the rise time of a system in response to a step change in input is τv, then the digital control system approximates the continuous case when τc << τv. In the frequency domain, a usual figure of merit is an update rate of 20 to 30 times the bandwidth of the system (a brief numerical example appears below). A finite clock cycle introduces a lag in the response of the controlled system. In addition to keeping this lag as small as is practical, it is also important to minimize the variation in the clock cycle. Small variations are inevitable, but larger variations can lead to poor control or, in the worst case, instability.

Not all tasks in an autonomous system require real-time control. For example, logging of telemetry data for post-mission use does not require a fixed clock; the system can buffer the data until it is convenient to store it. Thus, when designing a distributed control system it is natural to divide tasks into real-time and non-real-time categories. Real-time tasks are the domain of the real-time controller, and all others belong to a processor with an OS. Note that some control systems attempt to handle all tasks within the same processor by using an RTOS. We discuss the tradeoffs of this type of OS in Section 2.6.1.

Strictly real-time tasks are those that support the control of the vehicle. These comprise the measurement of all the inputs and the computation of the algorithms necessary to produce the actuator outputs. They include any sensor that is used to estimate the vehicle's state. A non-comprehensive list of these sensors includes wheel encoders, airspeed sensors, GPS, gyroscopes, accelerometers, magnetometers, barometers, etc. Accordingly, the navigation tasks, which include state estimation, are also real-time as they provide the input to the controller.
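As a brief numerical illustration of the update-rate guideline above (the 5 Hz bandwidth is an assumed example value, not a figure drawn from any particular vehicle in this work):

\[
f_{bw} = 5~\mathrm{Hz}
\quad\Rightarrow\quad
f_{update} \approx 20\text{--}30 \times f_{bw} = 100\text{--}150~\mathrm{Hz}
\quad\Rightarrow\quad
\tau_c = 1/f_{update} \approx 7\text{--}10~\mathrm{ms}.
\]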

Although mapping of the environment is not a strictly real-time task, simultaneous localization and mapping is, as it provides vehicle state information. Communications generally do not have real-time requirements, with the exception of remote control, when the system is operated manually using a radio. Finally, the controller itself must operate on a fixed cycle. All other tasks are non-real-time. Note that this does not imply that they are not time-sensitive. For example, a range sensor might be used for obstacle avoidance, or a camera may be used to identify landmarks in the environment. In both cases, these data are needed to support guidance of the vehicle. Guidance is the process of determining the vehicle trajectory and therefore must be done quickly, but not necessarily on a fixed clock. Generally, guidance tasks are not real-time but are typically time sensitive. Other tasks, such as logging of data, measurement of sensors not used for state estimation, and general communications, are non-real-time. A generic taxonomy of the tasks of an autonomous system is summarized in Table 2.1, showing the classification of real-time versus non-real-time tasks as discussed above.

From a high level, the OSAVC should support the main hardware and communication interfaces used by commercially available components. The components needed for a vehicle include sensors, radios, and actuators. Additionally, as a real-time subsystem, the OSAVC module should communicate efficiently and quickly with the larger vehicle control system. For example, a vehicle may have an onboard guidance and navigation SBC that requires periodic communication of the vehicle state variables in order to plan a route. The module should support the real-time requirements of latency and speed for a vehicle. That is, it should be able to operate as quickly as needed by the vehicle and meet the deterministic latency requirement for the control algorithms. A final requirement is that the OSAVC provide power, signal conditioning, and power management for the sensors and processor, and optionally for any external peripherals or modules.

The most common hardware interfaces are the inter-integrated circuit (I2C) interface, the serial peripheral interface (SPI), the universal serial bus (USB), and the universal asynchronous receiver-transmitter (UART), commonly known as a serial port. Less common, but used extensively in automobiles and robotics, is the controller area network (CAN) bus. Other peripherals needed for the OSAVC are hardware timers, input capture modules, used for decoding incoming digital signals, and output compare modules, used for generating the pulse-width modulated signals needed for motor and servo actuator control. Finally, general-purpose input-output pins are useful for turning on LEDs, setting switches, and other requirements. The OSAVC should have most or all of these hardware peripherals in order to support the broadest range of applications; while tradeoffs can be made with respect to capability versus size, weight, and power, we typically fall on the side of greater capability in this work.

The firmware is written entirely in the C programming language for speed and efficiency. The most important requirement of the firmware is modularity. Each sensor driver, estimation algorithm, and control algorithm is modular.
Each module has a header file that provides the interface to its public methods and a source file containing the implementation; each source file has a built-in test harness to allow module troubleshooting outside of an application.
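As an illustration of this module pattern, a hypothetical header for an I2C IMU driver is sketched below. The file name and every identifier in it (imu.h, imu_data_t, IMU_Init, and so on) are invented for this example and do not correspond to the actual OSAVC firmware library.

/* imu.h -- hypothetical sensor driver header (illustrative only; these
 * names are not taken from the OSAVC firmware library). */
#ifndef IMU_H
#define IMU_H

#include <stdint.h>
#include <stdbool.h>

typedef struct {
    int16_t accel[3];  /* raw accelerometer counts, x/y/z */
    int16_t gyro[3];   /* raw gyroscope counts, x/y/z */
    int16_t mag[3];    /* raw magnetometer counts, x/y/z */
} imu_data_t;

bool IMU_Init(void);                 /* configure the I2C peripheral and the sensor */
void IMU_StartRead(void);            /* begin a non-blocking, interrupt-driven read */
bool IMU_DataReady(void);            /* true once the ISR has buffered a new sample */
void IMU_GetData(imu_data_t *data);  /* copy the latest measurement to the caller */

#endif /* IMU_H */

The matching imu.c would contain the implementation and, guarded by a compile-time flag, a small main() that exercises these functions, which is one way the built-in test harness described above could be realized.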

To maximize efficient utilization of the processor, each sensor module is non-blocking. This requirement forces the use of vectored interrupts, and each sensor is assigned a unique interrupt priority. The firmware library contains a selection of commonly used sensors needed for vehicle autonomy, both to provide a minimum of functionality for a new developer and to serve as a template for new sensor drivers. The application itself should be as simple as possible to achieve the desired control. An example application for an AGV will be provided as a template for other applications. The application is a bare metal application consisting of an infinite loop with a hardware timer for task scheduling (a minimal sketch of this pattern is shown at the end of this section). This avoids the complexity and overhead of an RTOS. This requirement merits some discussion and is covered in Section 2.6.1.

An RTOS is a piece of software that brings real-time and non-real-time tasks into the same controller. It uses a HAL that allows programmers with no expertise in embedded programming to communicate with various sensors without any understanding of the underlying hardware. The difficulty in using an RTOS lies in modifying the real-time tasks. Even with well-structured code, debugging the latency of a new real-time algorithm can be challenging. Autopilot firmware running on an RTOS typically implements the algorithms with tunable parameters. This may work well for many cases, but modifying an algorithm for faster operation or for different parameters is challenging, and developing a new algorithm is even more difficult. Of course, any OS increases the complexity of an application as well as consuming more computational resources.

A bare metal application, on the other hand, allows for strict control over the sensing and control elements of an autonomous system. Measuring the latency is straightforward using a hardware timer of the processor, as is managing processor bandwidth. Debugging is similarly simplified. The challenge in programming on bare metal is the need to understand the hardware in order to configure it properly, although device manufacturers often offer configuration applications or software development kits to ease this process. Another benefit of bare metal programming is efficiency due to the much lower overhead. The main downside of bare metal applications is the challenge of duplicating the common features of a standard OS. For this reason, a bare metal application is better suited to a distributed architecture where the real-time and non-real-time tasks are strictly segregated between the real-time core and the SBC, as discussed earlier.

The design of the OSAVC PCB consisted of two development efforts. As we had no prior experience designing a PCB, we decided to start with a 'daughter board', a simpler I/O PCB that could interface with a commercially available development board. The first daughter board was hand-wired to determine an initial layout and to kick-start the development process. This allowed us to develop PCB layout and assembly skills while also acting as a platform to test sensor driver modules. The I/O board, a simple two-layer PCB, required two revisions before its performance was satisfactory. We also selected a vehicle platform for the AGV test bed on which to validate the OSAVC. To complement the AGV, we purchased a suite of sensors widely used for small autonomous vehicles, developed a library of sensor drivers, and integrated the sensors onto the vehicle.
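To make the bare-metal loop structure referenced above concrete, the sketch below shows one way such an application could be organized. It is illustrative only: the helper functions (Timer_PeriodElapsed, Timer_Now_us, Sensors_Update, Controller_Step, Actuators_Write) are hypothetical placeholders and are not names from the OSAVC firmware.

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical hardware abstraction points; on a real microcontroller these
 * would wrap the timer, sensor, and output-compare peripherals. */
extern bool     Timer_PeriodElapsed(void);   /* true once per control period */
extern uint32_t Timer_Now_us(void);          /* free-running microsecond timer */
extern void     Sensors_Update(void);        /* service the non-blocking sensor drivers */
extern void     Controller_Step(void);       /* state estimation and control law */
extern void     Actuators_Write(void);       /* push PWM commands to the actuators */

int main(void)
{
    uint32_t worst_case_us = 0;  /* simple latency bookkeeping via the hardware timer */

    for (;;) {                   /* infinite loop; no RTOS scheduler */
        if (Timer_PeriodElapsed()) {
            uint32_t start = Timer_Now_us();

            Sensors_Update();    /* gather the latest interrupt-buffered measurements */
            Controller_Step();   /* compute the actuator commands for this cycle */
            Actuators_Write();   /* outputs are updated on a fixed, deterministic period */

            uint32_t elapsed = Timer_Now_us() - start;
            if (elapsed > worst_case_us) {
                worst_case_us = elapsed;  /* track worst-case loop execution time */
            }
        }
        /* Remaining time in each period is available for lower-priority work. */
    }
    return 0;
}

Tracking the worst-case execution time with a free-running hardware timer, as in this sketch, is the same technique described above for verifying latency and processor bandwidth on bare metal.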
The discussion of the software design is found in Chapter 4.