Data-driven subgrid-scale parameterization of turbulence in the small-data limit

In this work, we develop a data-driven subgrid-scale (SGS) model for large eddy simulation (LES) of turbulence using a fully convolutional neural network (CNN). We first conduct a direct numerical simulation (DNS) and obtain training, validation, and testing data sets by applying a Gaussian spatial filter to the DNS solution. With the filtered DNS (FDNS) data in hand, we train the CNN with the filtered state variables, i.e., vorticity and stream function, as inputs and the nonlinear SGS term as the output. A priori analysis shows that the CNN-predicted SGS term accurately captures the inter-scale energy transfer. A posteriori analysis indicates that LES coupled with the CNN-based SGS model (LES-CNN) outperforms physics-based models in both short-term prediction and long-term statistics [1,2].

Although the CNN-based model is promising for predicting the SGS term, it requires a large amount of training data to perform satisfactorily. In the small-data limit, the LES-CNN develops artificial instabilities and produces unphysical results. We propose three remedies for the small-data limit: (i) data augmentation and (ii) a group convolutional neural network (GCNN), both of which leverage the rotational equivariance of the SGS term, and (iii) a physical constraint on the SGS enstrophy transfer. The SGS term is both translationally and rotationally equivariant in a square periodic flow field. While a standard CNN captures translational equivariance by construction, rotational equivariance can be accounted for either by including rotated snapshots in the training data set or by a GCNN that enforces the symmetry as a hard constraint. Additionally, the SGS enstrophy transfer constraint can be implemented in the loss function of the CNN. A priori and a posteriori analyses show that the CNN/GCNN with the rotational-equivariance and SGS enstrophy-transfer constraints enhances the SGS model and allows the data-driven model to work stably and accurately in the small-data limit.
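As a concrete illustration of the training setup described above, the sketch below applies a Gaussian spatial filter to a 2D DNS field on a periodic domain and defines a small fully convolutional network mapping the filtered vorticity and stream function to the SGS term. The filter width, layer count, kernel size, and channel widths are illustrative assumptions, not the configuration used in [1,2].

```python
# Minimal sketch (not the authors' exact architecture): a fully convolutional
# network mapping filtered vorticity and stream function to the SGS term.
import torch
import torch.nn as nn
from scipy.ndimage import gaussian_filter

def filter_dns(field, sigma=2.0):
    """Apply a Gaussian spatial filter to a 2D DNS field (periodic domain)."""
    return gaussian_filter(field, sigma=sigma, mode="wrap")

class SGSCNN(nn.Module):
    """Fully convolutional net: inputs (omega_bar, psi_bar), output the SGS term."""
    def __init__(self, hidden=64, layers=5):
        super().__init__()
        chans = [2] + [hidden] * (layers - 1)
        blocks = []
        for cin, cout in zip(chans[:-1], chans[1:]):
            # circular padding respects the periodic boundary conditions
            blocks += [nn.Conv2d(cin, cout, 5, padding=2, padding_mode="circular"),
                       nn.ReLU()]
        blocks += [nn.Conv2d(chans[-1], 1, 5, padding=2, padding_mode="circular")]
        self.net = nn.Sequential(*blocks)

    def forward(self, omega_bar, psi_bar):
        x = torch.stack([omega_bar, psi_bar], dim=1)  # (batch, 2, ny, nx)
        return self.net(x).squeeze(1)                 # (batch, ny, nx)
```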
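The data-augmentation remedy can be as simple as rotating each training sample by multiples of 90 degrees, applied consistently to the inputs and the SGS output; the sketch below assumes 2D scalar fields on a square periodic grid. A GCNN instead builds the same symmetry directly into the convolution layers as a hard constraint.

```python
# A sketch of rotational data augmentation (assumed 90-degree rotations on a
# square periodic grid): each pair of filtered inputs and SGS output is rotated
# together, exploiting the rotational equivariance of the SGS term.
import numpy as np

def augment_with_rotations(omega_bar, psi_bar, pi_sgs):
    """Yield the original sample plus its 90/180/270-degree rotations."""
    for k in range(4):
        yield (np.rot90(omega_bar, k), np.rot90(psi_bar, k), np.rot90(pi_sgs, k))
```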
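The enstrophy-transfer constraint can be imposed as a penalty term in the training loss. The sketch below assumes the domain-averaged product of the filtered vorticity and the SGS term as the transfer measure and uses a hypothetical weight lam; the exact form and weighting in the actual model may differ.

```python
# A sketch of a physics-constrained loss (assumed form): mean-squared error on
# the SGS term plus a penalty on the mismatch in the domain-averaged SGS
# enstrophy transfer, taken here as <omega_bar * Pi>.
import torch

def constrained_loss(pi_pred, pi_true, omega_bar, lam=1.0):
    mse = torch.mean((pi_pred - pi_true) ** 2)
    # domain-averaged enstrophy transfer for predicted and reference SGS terms
    transfer_pred = torch.mean(omega_bar * pi_pred, dim=(-2, -1))
    transfer_true = torch.mean(omega_bar * pi_true, dim=(-2, -1))
    penalty = torch.mean((transfer_pred - transfer_true) ** 2)
    return mse + lam * penalty
```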