UAV (Unmanned Aerial Vehicle) remote sensing differs in several ways from ‘classical’ satellite or manned airborne remote sensing. One such difference is the predominant use of snapshot cameras, and the specific processing through structure-from-motion (SfM) software associated with them. During each flight, each object is typically recorded 10–40 times, and each of these observations occurs under a different viewing angle relative to the sun and sensor positions. SfM software typically uses only one observation of each object to construct orthophotos. Yet the way viewing angle influences reflectance holds information on plant structure and composition. In this project, we want to make use of the full information contained in all observations of multispectral and thermal imagery acquired under different sun-sensor viewing angles. Through the use of advanced radiative transfer models, we want to replace traditional vegetation indices with more tangible and interpretable quantities, such as transpiration, leaf area and nutrient levels in canopies. This can lead to a range of applications in precision agriculture and forest ecology. We will furthermore investigate how this information can be used to improve orthophoto generation, through standardisation and seamline correction.
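
As a minimal sketch of the sun-sensor geometry involved, the snippet below computes, for a single ground point seen from one camera position, the view zenith angle and the relative azimuth between sun and sensor. The coordinate frame (local east/north/up in metres), the function name and the example numbers are all illustrative assumptions, not part of any particular SfM pipeline.

```python
import math

def viewing_geometry(cam_xyz, point_xyz, sun_azimuth_deg):
    """Sun-sensor geometry for one observation of one ground point.

    cam_xyz, point_xyz: (east, north, up) coordinates in metres in a
    hypothetical local frame. sun_azimuth_deg: solar azimuth, degrees
    clockwise from north. Returns (view zenith, relative azimuth), in degrees.
    """
    de = cam_xyz[0] - point_xyz[0]
    dn = cam_xyz[1] - point_xyz[1]
    du = cam_xyz[2] - point_xyz[2]
    horiz = math.hypot(de, dn)                      # horizontal distance
    view_zenith = math.degrees(math.atan2(horiz, du))
    view_azimuth = math.degrees(math.atan2(de, dn)) % 360.0  # clockwise from north
    # Fold the sun-sensor azimuth difference into [0, 180] degrees
    rel_azimuth = abs(((sun_azimuth_deg - view_azimuth) + 180.0) % 360.0 - 180.0)
    return view_zenith, rel_azimuth

# Same ground point, two camera positions (illustrative values only):
vz1, ra1 = viewing_geometry((10.0, 0.0, 50.0), (0.0, 0.0, 0.0), 180.0)
vz2, ra2 = viewing_geometry((-10.0, 0.0, 50.0), (0.0, 0.0, 0.0), 180.0)
```

In a real multi-angular workflow this computation would be repeated per pixel and per image, using the camera poses estimated by the SfM software, to attach a sun-sensor geometry to each of the 10–40 observations of every object.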