
Sample size adaptive strategy for time-dependent Monte Carlo particle transport simulation

2023-07-05

Nuclear Science and Techniques, 2023, Issue 4

Dan-Hua ShangGuan ·Wei-Hua Yan ·Jun-Xia Wei ·Zhi-Ming Gao ·Yi-Bing Chen ·Zhi-Cheng Ji

Abstract When multiphysics coupling calculations contain time-dependent Monte Carlo particle transport simulations, these simulations often account for the largest part of the calculation time, which is insufferable in certain important cases. This study proposes an adaptive strategy for automatically adjusting the sample size to achieve more reasonable simulations. It is realized based on an extension of the Shannon entropy concept and is essentially different from the popular methods used in time-independent Monte Carlo particle transport simulations, such as controlling the sample size according to the relative error of a target tally or by experience. The results of two models show that this strategy can yield nearly identical results while significantly reducing the calculation time. Considering efficiency, the sample size should not be increased blindly if the efficiency cannot be enhanced further. The strategy proposed herein satisfies this requirement.

Keywords Time-dependent Monte Carlo particle transport simulation ·Shannon entropy ·Adaptive strategy

1 Introduction

Multiphysics processes have been extensively studied in several important fields. Owing to their complexity, an analytical solution is hardly possible, and numerical calculations are treated as efficient tools instead of expensive experiments. When time-dependent processes governed by different physical rules are coupled, one apparent solution is to divide the entire calculation into many steps, in which the different control equations are solved individually. Some data must be transmitted from one process to another to mimic the coupling effect. In certain important cases, time-dependent particle transport is an indispensable part of the multiphysics process.

The coupling between fluid mechanics and neutron transport is used as an example to demonstrate the sample size adaptive strategy. In many cases, other physical processes must also be considered; however, considering only these two processes is not trivial and is meaningful in certain important situations. The governing equations for fluid mechanics are as follows:
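As a minimal sketch, assuming the standard compressible-flow conservation laws with an energy source from the nuclear reactions (the exact form, notation, and source terms of the original equations are assumptions here):

\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho \mathbf{u}) = 0,
\frac{\partial (\rho \mathbf{u})}{\partial t} + \nabla\cdot(\rho \mathbf{u}\mathbf{u}) + \nabla p = 0,
\frac{\partial (\rho E)}{\partial t} + \nabla\cdot\big[(\rho E + p)\,\mathbf{u}\big] = Q_{\mathrm{n}},

where \rho, \mathbf{u}, p, and E denote the density, velocity, pressure, and specific total energy of the fluid, and Q_n stands for the nuclear energy deposition provided by the Monte Carlo transport simulation (the per-cell source item described in Sect. 2.2).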

When neutrons fly in a medium, nuclear energy is released by reactions between the neutrons and the nuclei. These reactions change the temperature, density, and velocity of the medium. Conversely, because these reactions rely on the relative velocities, the thermal and hydrodynamic motions of the nuclei influence the neutron transport process. Therefore, the correct transport equation should include the distribution function of the nuclei in addition to the neutron flux, each determined by its corresponding transport equation. This creates a complex problem involving a group of simultaneous transport equations. Fortunately, because the local thermal equilibrium hypothesis is suitable, using the Maxwell distribution to describe the thermal motion of the nuclei is sufficiently precise. After averaging, the coefficients of the neutron transport equation rely only on the relative velocity of the neutron and the hydrodynamic motion of the medium. Therefore, the neutron transport equation that considers the effect of fluid motion is given by [1]
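As a minimal sketch, assuming the standard time-dependent Boltzmann transport equation with cross sections averaged over the Maxwellian thermal motion of the nuclei and evaluated relative to the local fluid velocity (all notation here is an assumption):

\frac{1}{v}\frac{\partial \psi}{\partial t} + \mathbf{\Omega}\cdot\nabla\psi + \bar{\Sigma}_t(\mathbf{r},E;\mathbf{u},T)\,\psi = \iint \bar{\Sigma}_s(\mathbf{r},E'\!\to\!E,\mathbf{\Omega}'\!\to\!\mathbf{\Omega};\mathbf{u},T)\,\psi(\mathbf{r},E',\mathbf{\Omega}',t)\,\mathrm{d}E'\,\mathrm{d}\mathbf{\Omega}' + S,

where \psi(\mathbf{r},E,\mathbf{\Omega},t) is the angular neutron flux, the barred cross sections depend on the local fluid velocity \mathbf{u} and temperature T through the averaging described above, and S collects the fission and external sources.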

Owing to the accuracy of the Monte Carlo method in geometry [2] and physics modeling, among other merits, this method is often used to simulate particle transport [3]; however, it is extremely time-consuming. When a problem involves multiple materials with large deformations and the model's grid number is large, many steps are required for a reliable calculation, and each step contains a Monte Carlo simulation of a large model. Therefore, in this case, the calculation time is insufferable even with large-scale parallelization on a powerful supercomputer. As in the traditional Monte Carlo simulation of time-independent particle transport problems, the core objective is to achieve the highest possible efficiency. This means finding a method with high calculation accuracy that is not overly time-consuming. In other words, when the efficiency flattens with an increase in sample size, there is no need to perform more calculations if efficiency, not calculation accuracy, is the goal. For the Monte Carlo method, it is well known that the calculation accuracy can reach any level as the sample size tends to infinity because of the law of large numbers; however, an infinite sample size is impossible in practice.

Apparently, it is difficult to set a different reasonable sample size for each step based on experience because each step is a different particle transport problem. If a fixed, sufficiently large sample size is set for all steps, which is the original sample size setting method of the JUPITER platform, many calculations may be wasted because the efficiency factor flattens once the sample size increases to a certain level. Therefore, experience-based methods are not considered efficient. Moreover, because the Monte Carlo simulation provides several feedback quantities with very different fluctuation levels, a sample size controlled by the relative error of one tally may be insufficient or excessively large for another tally. Therefore, error-based methods [4] are unsuitable. Because the calculation of time-dependent particle transport is expensive, Monte Carlo simulations still need an adaptive strategy to set the sample size scientifically, even with the rapid progress of supercomputers.

The remainder of this paper is organized as follows: The second section describes all aspects of a sample size adaptive strategy based on an extension of the Shannon entropy concept. The third section presents representative numerical results from calculations of two models. The final section provides the concluding remarks.

2 Sample size adaptive strategy

2.1 Research basics

Entropy is a notion that exists in both physics and mathematics. Clausius is considered the father of the concept of entropy, and important contributions to its development were made by Boltzmann, Gibbs, and Planck. Moreover, mathematicians, including Shannon, Kolmogorov, Sinai, Rényi, Holevo, and Tsallis, were also preoccupied with the concept. For the history of the different paths to the entropy concept, see, e.g., [5-7]. Consequently, there are various definitions of entropy. In this study, the entropy concept used is the one proposed by Shannon in [8], which has novel applications in various fields [9, 10]. For example, Ref. [9] showed that Shannon entropy is an efficient dynamic indicator that provides a direct measure of the diffusion rate and, thus, a timescale for the instabilities arising when dealing with chaos.

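As a minimal sketch of the standard definition (the binning and the symbols p_i and S_i here are assumptions, not necessarily the original notation): binning the fission source into N cells and defining

p_i = \frac{S_i}{\sum_{i'=1}^{N} S_{i'}}, \qquad H = -\sum_{i=1}^{N} p_i \log_2 p_i,

where S_i is the number of fission source sites recorded in cell i, and applying this to the fission source distribution of each iteration step,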
we obtain the Shannon entropy H_ori. For the power iteration method of eigenvalue calculation, if there are K iteration steps and H_k, k = 1, …, K are the entropies of each step, then the convergence of this entropy sequence is a necessary condition for the convergence of the fission source distribution. It is primarily used as follows: when the entire calculation has been completed and the Shannon entropy of each step has been obtained, a step number n is selected by judging that the entropy sequence after this step has converged. If the number of the first active step is greater than n, the calculation is considered reasonable. Otherwise, it is judged that the inactive iteration steps are insufficient, and the results are systematically biased.

In recent years, there have been attempts to explore on-the-fly judgment methods [15-18] for the convergence of the fission source distribution, which belongs to the time-independent particle transport problem. It has already been noted [19] that the Shannon entropy concept can be problematic as an indicator of the convergence of the fission source distribution in a slowly converging system. Fortunately, our concerns do not include this type of problem.

2.2 Analysis for a reasonable sample size adaptive strategy

Two important facts should be considered. (1) The surviving particles of the current step must be transferred to the next step as an external source. The attributes of these particles are samples of an unknown distribution that changes with the step number; this distribution can be called the survival particle distribution for convenience. (2) The Monte Carlo simulation of particle transport must provide global tallies (one tally per cell) to the mechanical calculation as a source item. These facts hold in a general sense for many multiphysics coupling processes. It is therefore interesting to develop a sample size adaptive strategy that balances small errors against calculation costs. The basic consideration of this study is as follows: suppose the sufficiently large sample size of one step is divided into batches, each containing a fixed number of samples, and the Monte Carlo simulation would ordinarily finish all batches one at a time. If, after some of these batches have been finished, the number of surviving particles is sufficient and the global tallies have converged according to some judgment rules, the remaining batches can be omitted because the finished batches are sufficient to obtain reasonable results. This can be verified by checking that, when a step stops in advance, the global efficiency of the global tallies, according to some index, is already flat over the batches that have been finished.

To realize these ideas, three questions must be answered. The first is whether the surviving particles are sufficient. The second is by which indicator the global tallies can be judged to have converged sufficiently. The last is which on-the-fly diagnostic can be used to stop the Monte Carlo simulation in advance. The solutions to these problems are presented in the following three subsections; all are based on the Shannon entropy concept.

2.3 Indicator for judging whether surviving particles are sufficient

Based on the definition of Shannon entropy, an indicator can be designed to show whether the number of surviving particles is sufficient. Suppose there are N cells in total for the system and G energy groups, so the total number of phase-space grids is NG. Let V_i (i = 1, …, N) be the volume of cell i and m_ij be the number of surviving particles in cell i and energy group j. If we define
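(The display equation is reconstructed here in the form implied by the surrounding text; the accompanying entropy formula is a standard assumption, and only the label (8) is cited below.)

p_{ij} = \frac{m_{ij}/V_i}{\sum_{i'=1}^{N}\sum_{j'=1}^{G} m_{i'j'}/V_{i'}}, \qquad (8)

H_t = -\sum_{i=1}^{N}\sum_{j=1}^{G} p_{ij}\,\log_2 p_{ij},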

then there are L H_t values in total for L finished batches. Here, H_t is the entropy corresponding to the survival particle distribution. Note that p_ij ≥ 0 and ∑_ij p_ij = 1; thus, p_ij can be considered a discrete probability distribution. When batches are simulated one by one, the H_t values form a sequence. If this sequence converges, it can be said that these L batches have made the surviving particles sufficient. Apparently, finer phase-space grids represent stricter standards. In Eq. (8), dividing by V_i is not trivial because it represents the concept of number density. Note that m_ij in batch k should contain m_ij from all batches k1 satisfying k1 < k; that is, the counts are cumulative across batches.
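As a concrete illustration of Eq. (8) and the cumulative-count convention, here is a minimal Python sketch; it is not the authors' code, and the problem sizes, cell volumes, and the Poisson stand-in for tallied counts are assumptions:

import numpy as np

# Assumed problem sizes; placeholders, not the paper's models.
N, G = 600, 30                       # cells and energy groups
volumes = np.ones(N)                 # V_i, placeholder cell volumes
m_cumulative = np.zeros((N, G))      # cumulative m_ij over finished batches
H_t_sequence = []                    # one H_t value per finished batch

def survival_entropy(m, V):
    """Shannon entropy H_t of the survival particle distribution, Eq. (8):
    counts are divided by cell volume to form number densities, normalized
    into a discrete distribution p_ij, then fed into -sum p log2 p."""
    density = m / V[:, None]         # m_ij / V_i
    total = density.sum()
    if total == 0.0:
        return 0.0                   # no surviving particles yet
    p = density / total              # p_ij >= 0 and sums to 1
    nz = p[p > 0]                    # treat 0 * log2(0) as 0
    return float(-(nz * np.log2(nz)).sum())

rng = np.random.default_rng(0)
for batch in range(3):               # stand-in for the batch loop
    m_batch = rng.poisson(5.0, size=(N, G))   # stand-in for tallied counts
    m_cumulative += m_batch          # m_ij is cumulative across batches
    H_t_sequence.append(survival_entropy(m_cumulative, volumes))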

2.4 Indicator for judging whether global tallies have converged sufficiently

When particle transport is coupled with other processes, the Monte Carlo simulation must provide feedback quantities to those processes as an input. The precision of these inputs, also known as tallies in the Monte Carlo method, significantly influences the confidence of the coupling calculation. In most cases, these quantities can be expressed as integrals of the flux and the corresponding functions. Therefore, if the global flux tallies converge sufficiently, these feedback quantities are credible to a certain degree. Focusing on global flux tallies also avoids the problem of dealing with different types of tallies in different multiphysics processes. If F_i is the flux in cell i, let
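(Reconstructed as the natural analog of Eq. (8); the normalization symbol P_i is an assumption, and the label (11) follows the citation below.)

P_i = \frac{F_i}{\sum_{i'=1}^{N} F_{i'}}, \qquad H_f = -\sum_{i=1}^{N} P_i\,\log_2 P_i. \qquad (11)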

Here, H_f is the entropy corresponding to the global flux. The convergence of the H_f sequence formed by the finished batches indicates the global convergence of the flux tallies. In addition, F_i in batch k should contain the contributions of the particles in all batches k1 that satisfy k1 < k. To the best of our knowledge, this is the first time that an indicator such as Eq. (11) has been introduced into the Monte Carlo simulation of particle transport to indicate the convergence of global tallies. It has the advantage that there is no need to calculate the relative error, which saves significant memory and calculation costs, particularly for large-scale models.

2.5 On-the-fly diagnostic for judging whether an entropy sequence has converged

Based on the stochastic oscillator indicator, which is commonly used in the technical analysis of financial markets, a posterior diagnostic method has been proposed to assess fission source convergence in Monte Carlo criticality calculations [18]. The stochastic oscillator indicator of the nth iteration step of the criticality calculation is defined as follows:
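As a sketch of the standard stochastic oscillator form used for this diagnostic (the look-back window convention and symbols are assumptions):

K_n = \frac{H_n - \min_{\,n-p+1 \le i \le n} H_i}{\max_{\,n-p+1 \le i \le n} H_i - \min_{\,n-p+1 \le i \le n} H_i},

where H_i are the entropy values of the iteration steps and p is the window length; a converged, noise-dominated entropy sequence keeps K_n fluctuating around 0.5, whereas a drifting sequence pins K_n near 0 or 1. The posterior acceptance criterion of [18] then requires (again a sketch)

\left| K_n - \tfrac{1}{2} \right| < \varepsilon \quad \text{and} \quad \left| \frac{1}{m}\sum_{i=n+1}^{n+m} K_i - \tfrac{1}{2} \right| < \varepsilon.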

In other words, this criterion requires that both K_n and its average over the next m steps be within ε of 0.5. If the active cycles start after an iteration step satisfying the aforementioned inequality, the entire calculation should be accepted. However, this rule can be modified into an on-the-fly diagnostic [20] for the convergence of the entropy sequences corresponding to the survival particle distribution and the global flux tally, as explained in the last two subsections. The point is as follows: if we decompose all samples into S batches and simulate the batches one by one, we obtain a sequence of entropy values H_i (i = 1, …, S) and stochastic oscillator indicators K_i (i = p, …, S). When the nth (n ≥ m + p − 1) batch is completed and the following inequality is satisfied:
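(A sketch of the inequality consistent with the verbal description and with the later citation of Eq. (14); the trailing-window form, which matches the condition n ≥ m + p − 1, is an assumption.)

\left| K_n - \tfrac{1}{2} \right| < \varepsilon \quad \text{and} \quad \left| \frac{1}{m}\sum_{i=n-m+1}^{n} K_i - \tfrac{1}{2} \right| < \varepsilon, \qquad (14)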

the surviving particle distribution and the global flux tallies can be perceived to have converged. The parameters p and m were set to 20 and 50, respectively, whereas the value of ε is set by the user. The smaller ε is, the stricter the rule. Therefore, a conservative or a radical strategy can be adopted by choosing different ε values.

2.6 Description of the adaptive strategy

The entire mechanism for automatically adjusting the sample size is summarized as follows: in each step of the time-dependent Monte Carlo particle transport simulation, a sufficiently large sample size is initially set and divided into M batches. After one batch is finished, new H_t and H_f values are obtained. When the number of finished batches exceeds m + p − 1, the on-the-fly diagnostic is invoked to decide whether the Monte Carlo simulation should be stopped in advance according to both entropy sequences. Only when both judgments return a true value is the simulation of this step stopped in advance. If the simulation stops in advance in step L, there will not be enough surviving particles to be used in step L + 1, although the current surviving particles can be seen as sufficiently sampled from the survival particle distribution. In this case, we randomly select from the current surviving particle bank to supply more surviving particles as source particles in step L + 1. This operation is considered reasonable based on probability theory. Note that, in parallel computing, the frequent calculation of these entropies should use the method proposed in [21]; otherwise, the time cost of obtaining the entropies will offset the benefit of the sample size adjustment. In other words, this method uses the local data in each process to calculate a local entropy and takes the average of the entropies of all processes as the final entropy value. Thus, massive data broadcasts can be avoided. It can be verified that when the sample size approaches infinity, this entropy equals the original entropy. Even with a finite sample size, the new entropy can be used to indicate the convergence of the corresponding survival particle distribution or global tallies. Although there is a statistical estimation method for the Shannon entropy [22], we have not adopted it because of its complexity; this will be the focus of our future studies.
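To make this control flow concrete, the following is a minimal Python sketch of one time step under the strategy above. It is not the JUPITER implementation: the batch simulator simulate_batch, the toy entropy model, and all names are assumptions, and the returned entropies stand in for real transport tallies.

import numpy as np

def oscillator(H, p):
    """Trailing stochastic oscillator: position of the latest entropy value
    within the min-max range of the last p values (~0.5 for pure noise)."""
    w = np.asarray(H[-p:])
    lo, hi = w.min(), w.max()
    return 0.5 if hi == lo else float((w[-1] - lo) / (hi - lo))

def converged(K, m, eps):
    """On-the-fly criterion of Eq. (14): the latest K and its trailing
    m-batch average both lie within eps of 0.5."""
    return abs(K[-1] - 0.5) < eps and abs(float(np.mean(K[-m:])) - 0.5) < eps

def run_step(simulate_batch, M, p=20, m=50, eps=0.1):
    """Simulate up to M batches of one time step; stop early once BOTH the
    survival-particle entropy H_t and the flux entropy H_f pass Eq. (14).
    simulate_batch(k) is a stand-in returning the cumulative (H_t, H_f)
    after batch k; it hides the actual transport and tallying."""
    Ht, Hf, Kt, Kf = [], [], [], []
    for k in range(M):
        ht, hf = simulate_batch(k)
        Ht.append(ht)
        Hf.append(hf)
        if len(Ht) >= p:                  # oscillator needs p entropy values
            Kt.append(oscillator(Ht, p))
            Kf.append(oscillator(Hf, p))
        # The diagnostic applies once n >= m + p - 1 batches are finished.
        if len(Kt) >= m and converged(Kt, m, eps) and converged(Kf, m, eps):
            return k + 1                  # stopped in advance
    return M                              # all batches were needed

# Toy usage: entropies that level off, plus small noise (illustrative only).
rng = np.random.default_rng(0)
fake = lambda k: tuple(5.0 - 3.0 / (k + 1) + 0.01 * rng.standard_normal(2))
print(run_step(fake, M=640))

In an MPI setting, each rank would compute survival_entropy and the flux entropy from its local tallies and the ranks would average the results, following [21], so that no global data broadcast is needed.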

It is worthwhile to provide four annotations. (1) When an under-sampling problem [23] exists for a specific model, an insufficient history number causes the Shannon entropy corresponding to the survival particle distribution to fluctuate more drastically. Therefore, the inequality Eq. (14) cannot easily be satisfied, and the calculation will not be stopped in advance. (2) Theoretically, the method in this study may fail for some symmetric and oscillating models. However, for time-dependent particle transport simulations, strictly symmetric and oscillating models are rare; hence, this is not a serious issue. (3) The Fourier fundamental mode coefficient works similarly to the traditional Shannon entropy; therefore, the recently proposed Fourier diagnostic [24] could substitute for the Shannon entropy diagnostic. Because we adopted an efficient method for calculating the Shannon entropy in MPI parallel environments and did not perform a numerical comparison, we do not know which method is better. (4) The recently proposed batch-size growth method [25, 26] is consistent in spirit with our method. However, our work aims to improve the efficiency of time-dependent particle transport simulations within multiphysics calculations, whereas the two aforementioned studies aim to improve the efficiency of time-independent criticality calculations, and the difficulties differ between these two types of problems. Theoretically, the concepts proposed in those studies could be transplanted; their performance can only be compared in future numerical experiments.

3 Numerical results

It is worth mentioning again that our goal in designing this sample size adaptive strategy is to avoid unnecessary calculations once the results are already sufficiently good as evaluated by some standard. In each step, where the original sample size is divided into several batches and each batch is simulated individually according to the original plan, we expect this strategy to stop in advance when the efficiency evaluated by some standard becomes flat. Apparently, because of the inevitable stochastic nature of this strategy, we cannot capture the stop point precisely. However, capturing it approximately can still save considerable calculation time while keeping the results approximately unchanged compared with the original case. The two models discussed in the following sections demonstrate this effect. The Arbitrary Lagrangian-Eulerian (ALE) algorithm was used to perform the mechanical calculations.

3.1 Model one

Model one was a sector piece with an opening angle of 2° and a radius of 6.45 cm. It consists of two concentric layers, each composed of a mixture of 235U and 238U but with different densities. In total, 600 cells were used. The initial sample size was 12,800,000 samples per step, which is sufficiently large for this model. These samples were divided into 640 batches of 20,000 samples each. The calculations used 32 cores in total. We performed two independent calculations with ε = 0.1 and ε = 0.05; all other conditions were identical to those described above. Figure 1 shows how the neutron number (N_p) in the model changes with time for the three cases (with and without sample size adaptation, the former with two different ε values). Table 1 compares the neutron numbers (N_p) at the end of the calculations for these three cases, together with the time costs. It can be observed that the proposed sample size adaptive strategy significantly decreases the calculation time while leaving the result almost unchanged. When the judging rule is stricter, as in the ε = 0.05 case, the result is closer to the original case; however, the reduction in calculation time is smaller. For a more detailed comparison, we randomly chose a step to determine whether the adaptive strategy could stop the calculation in advance when the efficiency flattened. We used the following equation:
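(A sketch of the global figure of merit implied by the description below; the exact form used in [27] is an assumption.)

\mathrm{Eff} = \left( T \cdot \frac{1}{N}\sum_{i=1}^{N} R_i^2 \right)^{-1},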

Fig. 1 Comparison of neutron number

Table 1 Comparison of neutron number (N_p) for the different cases at 0.252 μs

Fig. 2 Global efficiency of all flux tallies in a selected step

to evaluate the efficiency of the global flux tallies [27]. Here, N = 600 is the total cell number, T is the calculation time, and R_i is the relative error of the flux in cell i. In this step, the ε = 0.1 case stops at the 416th batch, whereas the ε = 0.05 case cannot stop in advance; that is, all 640 batches are simulated in that case. From Fig. 2, we see that the sample size adaptive strategy stops in advance when the global efficiency is almost flat with the batch number. Considering efficiency, once the global efficiency becomes flat, there is no need to perform more simulations. Figures 3 and 4 show the two Shannon entropies (H_t and H_f) for the same step. As expected, the sample size adaptive strategy stopped the calculation when these entropies flattened. Figure 5 shows the real sample size simulated in the different cases. The reduction in calculation time was evidently due to the decrease in sample size; nevertheless, the results and the efficiency remained almost unchanged because the simulated sample size was already sufficient. We cannot say which ε is best because of the inevitable randomness; however, it can be said that a small ε yields a conservative strategy, whereas a large ε yields a radical one. This is characteristic of most such algorithms. The results indicate that both ε values maintain the physical quantities of concern. Therefore, if used appropriately, the method proposed in this study can definitely improve the calculation efficiency and provides a tunable tool for different users.

Fig. 3 Shannon entropy H_t in a selected step

Fig. 4 Shannon entropy H_f in a selected step

Fig. 5 The real sample size in the different cases

Table 2 Comparison of λ for the different cases

3.2 Model two

Model two is a cuboid containing 200,000 hexahedral cells. It is divided into three layers that contain four uranium isotopes (234U, 235U, 236U, and 238U) with different densities. The mechanical calculation is performed in one step, and the Monte Carlo particle transport simulation is iterated for 100 steps to calculate λ.

We performed 20 calculations without sample size adaptation but with different random number seeds to obtain the reference answer. Each calculation used 1,600 batches of 30,000 samples each, and 32 cores were utilized. From Table 2, we can see that the sample size adaptive strategy obtains a reasonable λ value while significantly decreasing the calculation time, to only approximately 22% of that of the original case without adaptive sample size adjustment. Note that the number in parentheses is the standard deviation of the 20 λ values, and the λ value of the sample size adaptive strategy lies within the 95% confidence interval obtained from these 20 λ values.

4 Conclusion

Based on an extension of the Shannon entropy concept, two indicators are designed to show whether the samples of the survival particle distribution are sufficient and whether the global flux tallies have converged sufficiently. An on-the-fly diagnostic method based on the stochastic oscillator indicator is proposed for the convergence of these two indicators to avoid manual intervention, thus forming a complete adaptive strategy. This is the first time that it has become possible to adjust the sample size according to definite and reasonable rules. Based on the results of the two models displayed in the previous section, this self-adjustment mechanism maintains reasonable precision while reducing the calculation time. The indicators and mechanism can easily be generalized to other multiphysics calculations that include time-dependent Monte Carlo particle transport simulations.

Author contributions All authors contributed to the study conception and design. Material preparation, data collection, and analysis were performed by Dan-Hua ShangGuan, Wei-Hua Yan, Jun-Xia Wei, Zhi-Ming Gao, Yi-Bing Chen, and Zhi-Cheng Ji. The first draft of the manuscript was written by Dan-Hua ShangGuan, and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Data availability The data that support the findings of this study are openly available in Science Data Bank at https://www.doi.org/10.57760/sciencedb.j00186.00061 and http://resolve.pid21.cn/31253.11.sciencedb.j00186.00061.