We report flow statistics and visualizations from molecular-gas-dynamics simulations using the direct simulation Monte Carlo (DSMC) method for turbulent Couette flow in a minimal domain where the lower wall is replaced by an idealized permeable fibrous substrate representative of thermal-protection-system materials, for which the Knudsen number is O(10⁻¹). Comparisons are made with smooth-wall DSMC simulations and smooth-wall direct numerical simulations (DNS) of the Navier-Stokes equations for the same conditions. Roughness, permeability, and noncontinuum effects are assessed. In the range of Reynolds numbers considered herein, the scalings of the skin friction on the permeable substrate and of the mean flow within the substrate suggest that they are dominated by viscous effects. While the regenerative cycle characteristic of smooth-wall turbulence remains intact for all cases considered, we observe that the near-wall velocity fluctuations are modulated by the permeable substrate with a wavelength equal to the pore spacing. Additionally, the flow within the substrate shows significant rarefaction effects, resulting in an apparent permeability that is 13% larger than the intrinsic permeability. In contrast, the smooth-wall DSMC and DNS simulations exhibit remarkably good agreement for the statistics examined, despite the Knudsen number based on the viscous length scale being as large as O(10⁻¹). This latter result is at variance with classical estimates for the breakdown of the continuum assumption and calls for further investigation into the interaction of noncontinuum effects and turbulence.
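For orientation, a minimal sketch of the two nondimensional quantities invoked above, using standard definitions (the specific length scales and the slip model are assumptions here, not taken from the study): the Knudsen number compares the molecular mean free path $\lambda$ to a characteristic flow length, and a Klinkenberg-type slip correction is one common way rarefaction can make the apparent permeability exceed the intrinsic one,
\[
\mathrm{Kn} = \frac{\lambda}{\ell}, \qquad \ell \in \{\, d_{\mathrm{pore}},\ \delta_\nu = \nu/u_\tau \,\},
\qquad
k_{\mathrm{app}} \approx k_{\mathrm{int}}\left(1 + \frac{b}{\bar{p}}\right),
\]
where $d_{\mathrm{pore}}$ is the pore spacing, $\delta_\nu$ the viscous length scale, and $b$ an empirical slip coefficient; whether this particular correction accounts for the reported 13% difference is not established by the summary above.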
Evolutionary algorithms have been shown to be an effective method for training (or configuring) spiking neural networks. There are, however, challenges to developing accessible, scalable, and portable solutions. We present an extension to Fugu that wraps the NEAT framework, bringing evolutionary algorithms to Fugu. This approach provides a flexible and customizable platform for optimizing network architectures, independent of fitness functions and input data structures. We leverage Fugu's computational-graph approach to evaluate all members of a population in parallel. Additionally, because Fugu is platform-agnostic, this population can be evaluated in simulation or on neuromorphic hardware. We demonstrate our extension on several classification and agent-based tasks; one task illustrates how the Fugu integration allows spiking pre-processing to reduce the dimensionality of the search space. We also provide benchmark results using the Intel Loihi platform.
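As a rough illustration of the kind of evolutionary loop such an extension builds on (this is a generic neat-python skeleton, not the Fugu extension's actual API; the evaluation step is a hypothetical placeholder standing in for converting genomes to spiking graphs and dispatching the population to a simulator or neuromorphic backend, and the config file name is assumed):

import neat

def evaluate_population(genomes, config):
    # Placeholder evaluation: in a Fugu-backed workflow, each genome would be
    # turned into a spiking computational graph and the whole population
    # evaluated in parallel on a simulator or on neuromorphic hardware.
    for genome_id, genome in genomes:
        net = neat.nn.FeedForwardNetwork.create(genome, config)
        output = net.activate([0.5, 0.5])             # toy input
        genome.fitness = 1.0 - abs(output[0] - 1.0)   # toy fitness score

# 'neat_config.cfg' is an assumed name for a standard neat-python config file.
config = neat.Config(neat.DefaultGenome, neat.DefaultReproduction,
                     neat.DefaultSpeciesSet, neat.DefaultStagnation,
                     'neat_config.cfg')
population = neat.Population(config)
population.add_reporter(neat.StdOutReporter(True))
best_genome = population.run(evaluate_population, 30)  # evolve for 30 generations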
In turbulent flows, kinetic energy is transferred from the largest scales to progressively smaller scales, until it is ultimately converted into heat. The Navier-Stokes equations are almost universally used to study this process. Here, by comparing with molecular-gas-dynamics simulations, we show that the Navier-Stokes equations do not describe turbulent gas flows in the dissipation range because they neglect thermal fluctuations. We investigate decaying turbulence produced by the Taylor-Green vortex and find that in the dissipation range the molecular-gas-dynamics spectra grow quadratically with wave number due to thermal fluctuations, in agreement with previous predictions, while the Navier-Stokes spectra decay exponentially. Furthermore, the transition to quadratic growth occurs at a length scale much larger than the gas molecular mean free path, namely in a regime that the Navier-Stokes equations are widely believed to describe. In fact, our results suggest that the Navier-Stokes equations are not guaranteed to describe the smallest scales of gas turbulence for any positive Knudsen number.
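To make the contrast explicit, the two dissipation-range behaviors being compared can be written schematically (standard forms; the prefactors and the exact crossover location are not specified here): the Navier-Stokes energy spectrum decays roughly exponentially beyond the Kolmogorov scale, while equipartition of thermal fluctuations among Fourier modes produces a spectrum that grows as the square of the wave number,
\[
E_{\mathrm{NS}}(k) \sim \exp(-\beta k \eta) \quad (k\eta \gtrsim 1),
\qquad
E_{\mathrm{th}}(k) \propto k^{2},
\]
so the molecular-gas-dynamics spectrum departs from the Navier-Stokes result at the wave number where the exponentially decaying turbulent spectrum drops below the thermal-fluctuation floor.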
Image-based simulation, the use of 3D images to calculate physical quantities, relies on image segmentation for geometry creation. However, this process introduces image segmentation uncertainty, because different segmentation tools (both manual and machine-learning-based) will each produce a unique and valid segmentation. First, we demonstrate that these variations propagate into the physics simulations, compromising the resulting physics quantities. Second, we propose a general framework for rapidly quantifying segmentation uncertainty. By creating and sampling segmentation uncertainty probability maps, we systematically and objectively construct uncertainty distributions of the physics quantities. We show that these uncertainty distributions can follow a normal distribution but, for more complicated physics simulations, can be surprisingly nontrivial. We establish that bounding segmentation uncertainty can fail in these nontrivial situations. While our work does not eliminate segmentation uncertainty, it improves simulation credibility by making visible the previously unrecognized segmentation uncertainty plaguing image-based simulation.
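As a minimal sketch of the sampling idea (the probability map and the physics quantity below are synthetic placeholders; how the framework actually constructs the maps and which simulations it drives are not specified in this summary):

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-voxel segmentation probability map p(x) in [0, 1], e.g. built
# from several manual or machine-learning segmentations; synthetic here.
prob_map = rng.uniform(0.0, 1.0, size=(64, 64, 64))

def sample_segmentation(p, rng):
    # Draw one candidate binary segmentation by thresholding uniform noise.
    return rng.uniform(size=p.shape) < p

def physics_quantity(seg):
    # Placeholder for an image-based simulation; here just the segmented volume fraction.
    return seg.mean()

samples = [physics_quantity(sample_segmentation(prob_map, rng)) for _ in range(200)]
print("mean:", np.mean(samples), "std:", np.std(samples))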
This project combines several new concepts to create a boundary layer transition prediction capability that is suitable for analyzing modern hypersonic flight vehicles. The first new concept is the use of "optimization" methods to detect the hydrodynamic instabilities that cause boundary layer transition; this approach removes the need for many of the limiting assumptions of other methods and enables quantification of the interactions between boundary layer instabilities and the flow field imperfections that generate them. The second new concept is the execution of transition analysis within a conventional hypersonics CFD code, using the same mesh and numerical schemes for the transition analysis and the laminar flow simulation. This feature enables rapid execution of transition analysis, with less user oversight required and no interpolation steps needed.
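For context, one generic way such an "optimization" approach can be posed (an illustrative optimal-gain formulation, not necessarily the formulation adopted in this project) is to seek the forcing that the equations linearized about the laminar base flow amplify most strongly,
\[
G(\omega) = \max_{\hat{f} \neq 0} \frac{\|\hat{u}\|^{2}}{\|\hat{f}\|^{2}},
\qquad (-i\omega I - L)\,\hat{u} = \hat{f},
\]
where $L$ is the Navier-Stokes operator linearized about the laminar solution; the optimal forcing $\hat{f}$ then characterizes the flow field imperfections to which the boundary layer is most receptive, and the associated response $\hat{u}$ the instability they seed.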