Kaku Deta Nyusu (Internet), (122), p.9 - 21, 2019/02
This paper provides Japanese researchers with an overview of the IAEA technical meeting on nuclear data processing. The meeting covered the current status of nuclear data processing codes and their verification.
Robutsuri No Kenkyu (Internet), (71), 13 Pages, 2019/02
Nuclear data processing is essential to bridge evaluated nuclear data libraries and particle transport calculation codes. However, many nuclear engineers are not familiar with it. This paper gives an overview of nuclear data processing and of our nuclear data processing code FRENDY, and also lists references on nuclear data processing.
Tada, Kenichi; Kunieda, Satoshi; Nagaya, Yasunobu
JAEA-Data/Code 2018-014, 106 Pages, 2019/01
A new nuclear data processing code, FRENDY, has been developed to process the evaluated nuclear data library JENDL. The development of FRENDY helps to disseminate JENDL and various nuclear calculation codes. FRENDY is designed not only to process evaluated nuclear data files but also to allow its functions to be implemented in other calculation codes. Users can easily use many functions, e.g., reading, writing, and processing evaluated nuclear data files, in their own codes by incorporating the FRENDY classes. FRENDY is coded with maintainability, modularity, portability, and flexibility in mind. Its processing method is similar to that of NJOY. The current version of FRENDY treats the ENDF-6 format and generates ACE files, which are used by continuous-energy Monte Carlo codes such as PHITS and MCNP. This report describes the nuclear data processing methods and input instructions of FRENDY.
Yamashita, Susumu; Tada, Kenichi; Yoshida, Hiroyuki; Suyama, Kenya
Nippon Genshiryoku Gakkai Wabun Rombunshi, 17(3/4), p.99 - 105, 2018/12
In order to reveal the melt relocation behaviors of core internals phenomenologically and to reduce the uncertainties of melt relocation analysis in existing severe accident (SA) analysis codes, JAEA has developed JUPITER, a numerical simulation code for melt relocation and accumulation behaviors based on computational fluid dynamics. In this paper, to examine a method for estimating the fuel debris composition and its re-criticality, we performed melt accumulation and spreading simulations in the pedestal region with JUPITER, and re-criticality analyses with MVP (Monte Carlo Codes for Neutron Transport Calculations based on Continuous Energy and Multi-group Methods) using the detailed fuel debris composition data obtained by JUPITER. The coupled JUPITER-MVP analysis of the fuel debris distribution gave prospects for a detailed evaluation of the re-criticality of fuel debris based on a detailed debris distribution.
Iwamoto, Osamu; Iwamoto, Nobuyuki; Kimura, Atsushi; Yokoyama, Kenji; Tada, Kenichi
Kaku Deta Nyusu (Internet), 120, p.35 - 46, 2018/06
We report on the 30th WPEC meeting and the associated expert group and subgroup meetings held in Paris on May 14-18, 2018.
Proceedings of Reactor Physics Paving the Way Towards More Efficient Systems (PHYSOR 2018) (USB Flash Drive), p.2929 - 2939, 2018/04
JAEA is developing a new nuclear data processing system, FRENDY. We investigated the processing methods, focusing on probability table generation with the ladder method, which is adopted in the PURR module of NJOY. To improve probability table generation, more sophisticated methods were introduced for calculating the chi-squared random numbers and the complex error function. We also investigated the appropriate number of ladders. To assess the impact of the complex error function calculation method, the k values of benchmark experiments computed with probability tables from both methods were compared. The results indicated that the appropriate ladder number is 100 and that the choice of calculation method for the chi-squared random numbers and the complex error function has no significant impact on the neutronics calculation.
Kikuchi, Takeo; Tada, Kenichi; Sakino, Takao; Suyama, Kenya
JAEA-Research 2017-021, 56 Pages, 2018/03
The criticality management of fuel debris is one of the most important research issues in Japan. Current criticality management adopts the fresh fuel assumption. Adopting the fresh fuel assumption for the criticality control of fuel debris is difficult because the k of the fuel debris could exceed 1.0 in most cases in which the debris contains water and does not contain neutron absorbers such as gadolinium. Therefore, the adoption of burnup credit is being considered. Sufficient prediction accuracy for the isotopic composition of used nuclear fuel is required to adopt burnup credit in the treatment of fuel debris. JAEA developed the burnup calculation code SWAT4.0 to obtain reference calculation results for the isotopic composition of used nuclear fuel; this code is used to evaluate the composition of fuel debris. To investigate the prediction accuracy of SWAT4.0, we analyzed the post-irradiation examination (PIE) data of BWR fuel obtained from 2F2DN23.
Tada, Kenichi; Kosako, Kazuaki*; Yokoyama, Kenji; Konno, Chikara
Nippon Genshiryoku Gakkai-Shi, 60(3), p.168 - 172, 2018/03
Neutronics calculation codes cannot treat evaluated nuclear data files directly; nuclear data processing is required before the nuclear data files can be used in these codes. Nuclear data processing is not a mere format conversion but comprises many processes that evaluate the physical quantities needed by neutronics calculation codes. In this paper, we give an overview of nuclear data processing and the validation of nuclear data.
Tada, Kenichi; Kikuchi, Takeo*; Sakino, Takao; Suyama, Kenya
Journal of Nuclear Science and Technology, 55(2), p.138 - 150, 2018/02
The criticality safety of the fuel debris in the Fukushima Daiichi Nuclear Power Plant is one of the most important issues, and the adoption of burnup credit is desired for the criticality analysis. The assay data of used nuclear fuel irradiated in 2F2 were evaluated to validate SWAT4.0 for BWR fuel burnup problems. The calculation results revealed that the number densities of many heavy nuclides and fission products (FPs) agree well with the experimental data, except for U, Np, Pu, and Sm isotopes. The differences are attributed to assumptions about the initial number density and void ratio and to an overestimation of the Np capture cross section. The C/E-1 values do not depend on the fuel rod type (UO2 or UO2-Gd2O3) and are similar to those for PWR fuel. These results indicate that SWAT4.0 appropriately analyzes the isotopic composition of BWR fuel and has sufficient accuracy to be adopted in the burnup credit evaluation of fuel debris.
Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio
EPJ Web of Conferences (Internet), 146, p.02028_1 - 02028_5, 2017/09
JAEA has started to develop a new nuclear data processing system, FRENDY (FRom Evaluated Nuclear Data librarY To any application). In this presentation, the outline of the development of FRENDY is given, and its functions and performance are demonstrated through the generation and validation of continuous-energy cross-section data libraries for the MVP, PHITS, and MCNP codes.
Konno, Chikara; Tada, Kenichi; Kwon, Saerom*; Ota, Masayuki*; Sato, Satoshi*
EPJ Web of Conferences (Internet), 146, p.02040_1 - 02040_4, 2017/09
So far we have pointed out that the KERMA factors and DPA cross-section data of many nuclides in the official ACE files differ among nuclear data libraries for the following reasons: (1) incorrect nuclear data, (2) NJOY bugs, (3) huge helium production cross-section data, (4) MF6 MT102 data, and (5) missing secondary particle data (energy-angular distribution data). Here we compare the KERMA factors and DPA cross-section data included in the official ACE files of JENDL-4.0, ENDF/B-VII.1, and JEFF-3.2 in more detail. As a result, we identified new causes of the differences among the KERMA factors and DPA cross-section data in the three libraries. These causes are categorized as missing secondary charged particle data, missing secondary data, wrong secondary spectra, wrong production yields, and MF12-15 MT3 data for the capture reaction, some of which appear not to be supported by NJOY. The ACE files of JENDL-4.0, ENDF/B-VII.1, and JEFF-3.2 with these problems should be revised based on this study.
Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio
Journal of Nuclear Science and Technology, 54(7), p.806 - 817, 2017/07
JAEA has developed the evaluated nuclear data library JENDL and several nuclear analysis codes such as MARBLE2, SRAC, MVP, and PHITS. Although JENDL and these computer codes have been widely used in many countries, no nuclear data processing system to generate data libraries for application programs had been developed in Japan, and foreign processing systems, e.g., NJOY and PREPRO, were used instead. To process new libraries for JAEA's computer codes promptly and independently, JAEA started developing the new nuclear data processing system FRENDY in 2013. In this paper, the outline, functions, and verification of FRENDY are described.
Kaku Deta Nyusu (Internet), (117), p.23 - 29, 2017/06
Validation of a nuclear data library using criticality and nuclear reactor experiments, i.e., integral experiments, is one of the most important processes. Analyses of integral experiments become ever more important as the accuracy of the library improves. This validation is mainly carried out by reactor physics specialists because it is too complicated for nuclear data evaluators, and it takes a long time and much effort even for the specialists. To realize an efficient nuclear data validation cycle for the next version of JENDL, the automatic nuclear data validation system VACANCE (Validation Environment for Comprehensive and Automatic Neutronics Calculation Execution) has been developed. In this presentation, the outline and functions of VACANCE are demonstrated in detail, and examples of a new nuclear data evaluation and the subsequent integral experiment analyses are shown.
Harada, Hideo; Iwamoto, Osamu; Kimura, Atsushi; Yokoyama, Kenji; Tada, Kenichi
Kaku Deta Nyusu (Internet), (117), p.36 - 51, 2017/06
no abstracts in English
Tada, Kenichi; Suyama, Kenya
Proceedings of 2017 International Congress on Advances in Nuclear Power Plants (ICAPP 2017) (CD-ROM), 4 Pages, 2017/04
JAEA provides the evaluated nuclear data library JENDL. Usually, integral experiments are used for its validation. Since this validation process takes a long time and much effort, an automated system has been desired. Such a system requires nuclear data processing, analysis of the integral experiments, and editing of the calculation results. For the nuclear data processing, JAEA has started to develop the new nuclear data processing system FRENDY, with which nuclear data can be processed automatically. Taking advantage of FRENDY, we developed the automatic nuclear data validation system VACANCE. VACANCE has many functions, e.g., searching and modifying input files, support for parallel and restart calculations, and editing of calculation results. The combination of FRENDY and VACANCE enables an efficient nuclear data validation cycle. In this presentation, the outline and functions of VACANCE are demonstrated.
Kaku Deta Nyusu (Internet), (113), p.7 - 23, 2016/02
This paper reports on the IAEA Consultants Meeting (CM) held on October 5-9, 2015, titled "The New Evaluated Nuclear Data File Processing Capabilities".
Kaku Deta Nyusu (Internet), (113), p.41 - 45, 2016/02
The author received the 2015 incentive award of the Nuclear Data Division of the Atomic Energy Society of Japan. This report gives an overview of the award-winning work.
Suyama, Kenya; Sugawara, Takanori; Tada, Kenichi; Chiba, Go*; Yamamoto, Akio*
JAEA-Conf 2014-003, 76 Pages, 2015/03
The Japan Atomic Energy Agency organized PHYSOR 2014, an international conference on reactor physics, one of the basic research fields of nuclear engineering, in cooperation with the Research Reactor Institute of Kyoto University. PHYSOR is the world's largest international conference in the reactor physics field. It originates from the Physics of Reactors Topical Meeting of the reactor physics division of the American Nuclear Society, originally organized in the United States every two years, beginning with the conference held in Marseille, France, in 1990. More than 500 papers were submitted, and 472 papers were finally presented at the conference after the paper review process. This report contains the presented papers that the PHYSOR organizing committee decided to publish in an official JAEA report with the permission of the authors, except for several selected papers to be published in the Journal of Nuclear Science and Technology of the Atomic Energy Society of Japan.
Tada, Kenichi; Hagura, Naoto*
Robutsuri No Kenkyu (Internet), (67), 105 Pages, 2015/03
At the 2014 AESJ autumn meeting, a session titled "Current condition and future status of human resources in the reactor physics field" was held. For this session, the employment status was surveyed and a questionnaire was designed. This paper describes the detailed results of the employment survey and the questionnaire.