Tada, Kenichi; Endo, Tomohiro*
Journal of Nuclear Science and Technology, 60(11), p.1397 - 1405, 2023/11
Times Cited Count:1 Percentile:35.82 (Nuclear Science & Technology)
The probability table method is a well-known method for addressing self-shielding effects in the unresolved resonance region. A long computational time is required to generate the probability table, and an effective way to reduce the generation time is to reduce the number of ladders. The purpose of this study is to estimate the optimal number of ladders using the statistical uncertainty in the probability table. To this end, a statistical uncertainty quantification method for the probability table was developed and the convergence behavior of the statistical uncertainty was investigated. The product of the probability table and the average cross section was taken as the target of the statistical uncertainty. The convergence rate was affected by the average level spacing and the reduced neutron width. The generation time of the probability table was reduced to less than half when the input parameter was changed from the number of ladders to the tolerance value.
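The tolerance-based stopping rule described in this abstract can be illustrated with a short sketch: ladders are generated one at a time, the sample mean and standard error of the per-ladder quantity (the product of the probability table and the average cross section) are updated, and generation stops once the relative standard error falls below the tolerance. The `sample_ladder` routine below is a hypothetical placeholder, not FRENDY's actual ladder generator; the sketch only shows the stopping logic, assuming the per-ladder quantity behaves like an independent random sample.

```python
# Minimal sketch of a tolerance-based stopping rule for probability-table
# generation. ``sample_ladder`` is a hypothetical stand-in for one Monte
# Carlo resonance-ladder realisation; it is NOT FRENDY's actual routine.
import math
import random


def sample_ladder():
    """Return P_i * sigma_i (band probability times band-averaged cross
    section) for one randomly generated resonance ladder (placeholder)."""
    return random.gauss(1.0, 0.1)


def generate_until_converged(tolerance=0.01, min_ladders=10, max_ladders=100000):
    """Accumulate ladders until the relative standard error of the mean of
    the per-ladder quantity falls below ``tolerance``."""
    total, total_sq, n = 0.0, 0.0, 0
    while n < max_ladders:
        x = sample_ladder()
        n += 1
        total += x
        total_sq += x * x
        if n >= min_ladders:
            mean = total / n
            var_mean = (total_sq / n - mean * mean) / (n - 1)  # variance of the mean
            rel_err = math.sqrt(max(var_mean, 0.0)) / abs(mean)
            if rel_err < tolerance:
                break
    return mean, rel_err, n


if __name__ == "__main__":
    mean, rel_err, n_ladders = generate_until_converged(tolerance=0.005)
    print(f"converged after {n_ladders} ladders: mean={mean:.4f} (rel. err. {rel_err:.2%})")
```

With such a rule, the number of ladders adapts to how quickly the statistical uncertainty converges, which is why a tolerance input can cut the generation time compared with a fixed, conservatively large ladder count.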
Tada, Kenichi; Yamamoto, Akio*; Kunieda, Satoshi; Nagaya, Yasunobu
JAEA-Data/Code 2022-009, 208 Pages, 2023/02
A nuclear data processing code plays an important role in connecting evaluated nuclear data libraries and neutronics calculation codes. The Japan Atomic Energy Agency (JAEA) has been developing the nuclear data processing code FRENDY since 2013 to generate cross-section files from evaluated nuclear data libraries such as JENDL, ENDF/B, JEFF, and TENDL. The first version of FRENDY was released in 2019. FRENDY version 1 generates ACE files, which are used by continuous-energy Monte Carlo codes such as PHITS, Serpent, and MCNP. FRENDY version 2 generates multi-group neutron cross-section files from ACE files. The other major improvements are as follows: (1) uncertainty quantification for the probability tables of the unresolved resonance cross section; (2) perturbation of the ACE file for uncertainty quantification using a continuous-energy Monte Carlo code; (3) modification of ENDF-6 formatted nuclear data files. This report describes an overview of the nuclear data processing methods and the input instructions for FRENDY.
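As background for the multi-group generation step mentioned above, the sketch below shows the generic flux-weighted group-collapse formula, sigma_g = integral(sigma(E) * phi(E) dE) / integral(phi(E) dE) over each group. It is a textbook illustration on an assumed pointwise grid with a toy weighting spectrum, not FRENDY's actual algorithm or file handling.

```python
# Illustrative sketch of the flux-weighted group collapse that underlies
# multi-group cross-section generation: sigma_g = int(sigma*phi)/int(phi)
# over each group. This is a generic textbook formula, not FRENDY's actual
# implementation or file formats.
import numpy as np


def trapezoid(y, x):
    """Trapezoidal integration of y(x) on an ascending grid."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))


def collapse_to_groups(energy, sigma, flux, group_bounds):
    """Collapse a pointwise cross section into group constants by flux weighting.

    energy, sigma, flux : 1-D arrays on a common ascending energy grid
    group_bounds        : ascending group boundaries [E_0, E_1, ..., E_G]
    """
    sigma_g = []
    for e_lo, e_hi in zip(group_bounds[:-1], group_bounds[1:]):
        mask = (energy >= e_lo) & (energy <= e_hi)
        e, s, phi = energy[mask], sigma[mask], flux[mask]
        num = trapezoid(s * phi, e)  # group-integrated reaction rate
        den = trapezoid(phi, e)      # group-integrated flux
        sigma_g.append(num / den)
    return np.array(sigma_g)


if __name__ == "__main__":
    e = np.logspace(-5, 7, 10001)             # energy grid [eV]
    sigma = 1.0 + 10.0 / np.sqrt(e)           # toy 1/v-like cross section [b]
    flux = 1.0 / e                             # toy 1/E weighting spectrum
    bounds = np.array([1e-5, 1.0, 1e3, 1e7])   # three coarse groups
    print(collapse_to_groups(e, sigma, flux, bounds))
```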
Fujimoto, Nozomu*; Tada, Kenichi; Ho, H. Q.; Hamamoto, Shimpei; Nagasumi, Satoru; Ishitsuka, Etsuo
Annals of Nuclear Energy, 158, p.108270_1 - 108270_8, 2021/08
Times Cited Count:3 Percentile:35.51 (Nuclear Science & Technology)
Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio
EPJ Web of Conferences, 146, p.02028_1 - 02028_5, 2017/09
Times Cited Count:4 Percentile:89.21 (Nuclear Science & Technology)
JAEA has started to develop a new nuclear data processing system, FRENDY (FRom Evaluated Nuclear Data libralY to any application). This presentation outlines the development of FRENDY and demonstrates its functions and performance through the generation and validation of continuous-energy cross-section data libraries for the MVP, PHITS, and MCNP codes.
Tonoike, Kotaro; Izawa, Naoki; Okazaki, Shuji; Sugikawa, Susumu; Takeshita, Isao
ICNC 95: 5th Int. Conf. on Nuclear Criticality Safety, Vol. II, p.10.25 - 10.32, 1995/00
no abstracts in English
JAERI-M 92-207, 645 Pages, 1992/12
no abstracts in English
Iida, Hiromasa
JAERI-M 8818, 21 Pages, 1980/04
no abstracts in English
JAERI-M 8349, 68 Pages, 1979/08
no abstracts in English
Tada, Kenichi
no journal
This presentation explains recent development work on the nuclear data processing code FRENDY. It shows the major new functions in FRENDY version 2, which was released in January 2022, and gives an overview of the new functions implemented in the revised version released in November 2022.
Guo, Z.; Nishida, Akemi; Choi, B.; Nakajima, Norihiro
no journal
In the field of seismic analysis of nuclear facilities, large-scale parallel analyses using numerical models with several hundred million degrees of freedom (DOFs) are becoming possible thanks to recent advances in high-performance parallel computing technologies. When dealing with such three-dimensional time-series data, post-processing is often more difficult than the seismic response simulation itself. The purpose of the current study is to develop a parallel visualization application that can effectively visualize large-scale simulation results (distributed time-series data). In this report, we show an approach that increases the efficiency of parallel visualization by more than 200 times by applying appropriate pre-processing to this kind of large-scale distributed time-series data.
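A schematic of the kind of pre-processing described above: each distributed partition of the time-series results is thinned independently (here by simple temporal and nodal striding) before visualization, so the renderer reads only a fraction of the raw data. The file layout, names, and reduction factors below are hypothetical and stand in for the authors' actual pipeline.

```python
# Schematic example of pre-processing distributed time-series results before
# visualisation: each partition file is reduced (temporal decimation plus a
# nodal subset) so the renderer touches far less data. File layout, names,
# and reduction factors are hypothetical, not the authors' actual pipeline.
from pathlib import Path

import numpy as np


def preprocess_partition(in_file, out_file, time_stride=10, node_stride=100):
    """Thin one partition of the distributed result (time steps x nodes)."""
    data = np.load(in_file)                       # shape: (n_steps, n_nodes)
    reduced = data[::time_stride, ::node_stride]  # keep every k-th step/node
    np.save(out_file, reduced)
    return reduced.shape


def preprocess_all(result_dir, out_dir):
    """Process every partition independently; trivially parallel per rank."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    for part in sorted(Path(result_dir).glob("rank_*.npy")):
        preprocess_partition(part, out_dir / part.name)


if __name__ == "__main__":
    preprocess_all("results", "results_reduced")
```

Because each partition is processed independently, this step parallelizes across the same ranks that produced the data, which is one way such a pre-processing pass can yield large end-to-end visualization speed-ups.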