Search Results: Records 1-20 displayed on this page of 68

Journal Articles

Investigation of appropriate ladder number on probability table generation

Tada, Kenichi

Proceedings of International Conference on the Physics of Reactors; Transition To A Scalable Nuclear Future (PHYSOR 2020) (USB Flash Drive), 8 Pages, 2020/03

The probability table is widely used in continuous energy Monte Carlo calculation codes to treat the self-shielding effect in the unresolved resonance region. The ladder method is used to calculate the probability table: it generates many pseudo resonance structures from random numbers based on the averaged resonance parameters. The probability table therefore depends on the ladder number, i.e., the number of pseudo resonance structures, and the ladder number has a large impact on the generation time of the cross section library. In this study, the appropriate ladder number is investigated using the probability tables of all nuclides prepared in JENDL-4.0. The comparison results indicate that the differences in the probability tables are sufficiently small when the ladder number is 100.
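
As a rough illustration of the ladder method described above, the following Python sketch generates pseudo resonance ladders from hypothetical averaged resonance parameters and builds an equal-probability table from the sampled cross sections. It is not FRENDY code; the parameter values, the simplified line shape, and the exponential level spacing are assumptions made only for the example.

```python
# Illustrative sketch of the ladder method (not FRENDY code).
# Averaged resonance parameters, line shape, and grid below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

MEAN_SPACING = 50.0   # eV, assumed average level spacing
MEAN_GAMMA_N = 0.1    # eV, assumed average neutron width
MEAN_GAMMA_G = 0.05   # eV, assumed average radiation width

def sample_ladder(n_res):
    """Generate one pseudo resonance ladder from the averaged parameters."""
    spacings = rng.exponential(MEAN_SPACING, n_res)   # simplification; Wigner spacing in practice
    energies = 100.0 + np.cumsum(spacings)
    gamma_n = MEAN_GAMMA_N * rng.chisquare(1, n_res)  # Porter-Thomas (1 degree of freedom)
    gamma_g = np.full(n_res, MEAN_GAMMA_G)
    return energies, gamma_n, gamma_g

def total_xs(e_grid, energies, gamma_n, gamma_g):
    """Crude Breit-Wigner-like total cross section for one ladder (arbitrary units)."""
    xs = np.zeros_like(e_grid)
    for e0, gn, gg in zip(energies, gamma_n, gamma_g):
        gamma = gn + gg
        xs += gn * gamma / ((e_grid - e0) ** 2 + (gamma / 2.0) ** 2)
    return xs

def probability_table(n_ladders, n_bands=8, n_res=50):
    """Sample many ladders and tabulate equal-probability band boundaries."""
    e_grid = np.linspace(100.0, 2500.0, 500)
    samples = np.concatenate([total_xs(e_grid, *sample_ladder(n_res))
                              for _ in range(n_ladders)])
    return np.quantile(samples, np.linspace(0.0, 1.0, n_bands + 1))

# The band boundaries converge as the ladder number grows; the study above
# reports that about 100 ladders is sufficient for the JENDL-4.0 nuclides.
for n_ladders in (10, 100):
    print(n_ladders, probability_table(n_ladders))
```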

Journal Articles

Nuclear data processing code FRENDY

Tada, Kenichi

JAEA-Conf 2019-001, p.29 - 34, 2019/11

JAEA has developed a new nuclear data processing code, FRENDY (FRom Evaluated Nuclear Data librarY to any application), to generate cross-section data libraries from the evaluated nuclear data library JENDL. In this presentation, the author explains how to generate a cross-section data library and gives an overview of the features of FRENDY.

Journal Articles

Report of 31st Meeting of the Working Party on International Nuclear Data Evaluation Co-operation (WPEC)

Iwamoto, Osamu; Iwamoto, Nobuyuki; Kimura, Atsushi; Yokoyama, Kenji; Tada, Kenichi

Kaku Deta Nyusu (Internet), (124), p.23 - 34, 2019/10

The 31st annual meeting and the subgroup meetings of the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) under the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA) were held at the OECD/NEA headquarters in Boulogne-Billancourt near Paris from 24 to 28 June 2019. Activities on nuclear data measurement and evaluation in each region or country were reported at the annual meeting, and the subgroup activities were discussed at the subgroup meetings. A summary of these meetings is reported.

Journal Articles

Investigation of the impact of the prediction accuracy of the burn-up code system SWAT4.0 on neutronics calculation

Tada, Kenichi; Sakino, Takao*

Proceedings of 11th International Conference on Nuclear Criticality Safety (ICNC 2019) (Internet), 9 Pages, 2019/09

Criticality safety of fuel debris is one of the most important issues, and the adoption of burnup credit is desired. To adopt burnup credit, validation of the burnup calculation codes is required. In this study, assay data of used nuclear fuel (2F2DN23, 2F1ZN2, and 2F1ZN3) are evaluated to validate the SWAT4.0 code. The calculation results revealed that the number densities of many heavy nuclides and fission products show good agreement with the experimental data. To investigate the applicability of SWAT4.0 to the criticality safety evaluation of fuel debris, we evaluated the effect of isotopic composition differences on $$k_{eff}$$. The differences in the number densities of U-235, Pu-239, Pu-241, and Sm-149 have a large impact on $$k_{eff}$$; however, the reactivity uncertainty related to the burnup analysis was less than 3%. SWAT4.0 appropriately analyzes the isotopic composition of BWR fuel and has sufficient accuracy to be adopted in the burnup credit evaluation of fuel debris.
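
To make the quoted reactivity effect concrete, here is a minimal sketch of converting a change in $$k_{eff}$$ caused by an isotopic-composition difference into a reactivity difference. The formula uses the standard definition rho = (k - 1)/k; the numerical values are hypothetical and are not taken from the paper.

```python
# Minimal sketch (values are hypothetical, not from the paper): converting a
# change in k_eff caused by an isotopic-composition difference into a
# reactivity difference, using rho = (k - 1)/k.
def reactivity_difference(k_ref, k_perturbed):
    """Delta-rho = 1/k_ref - 1/k_perturbed (dimensionless)."""
    return 1.0 / k_ref - 1.0 / k_perturbed

k_nominal = 0.95                 # assumed reference k_eff of a debris configuration
k_with_composition_bias = 0.96   # assumed k_eff with perturbed number densities
d_rho = reactivity_difference(k_nominal, k_with_composition_bias)
print(f"reactivity difference: {d_rho:.5f} ({d_rho * 1.0e5:.0f} pcm)")
```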

Journal Articles

FRENDY; Nuclear data processing system

Tada, Kenichi

Nuclear Data Newsletter (Internet), (67), p.2, 2019/07

This article introduces our nuclear data processing system FRENDY in the Nuclear Data Newsletter published by the IAEA Nuclear Data Section.

Journal Articles

Development of a handy criticality analysis tool for fuel debris

Tada, Kenichi

Proceedings of 27th International Conference on Nuclear Engineering (ICONE-27) (Internet), 4 Pages, 2019/05

The decommissioning of the Fukushima Daiichi Nuclear Power Plant is one of the most important issues in Japan. The criticality safety of fuel debris is imperative to prevent exposure of workers. The criticality monitoring system currently under investigation cannot detect criticality of fuel debris quickly, so an estimation of the criticality of fuel debris is required for fuel debris retrieval. Although expert knowledge of reactor physics is necessary to estimate the criticality of fuel debris, many people who plan fuel debris retrieval may not be familiar with criticality analysis. We developed a handy criticality analysis tool, HAND, to quickly estimate the criticality of fuel debris without expert knowledge of reactor physics. Since the input data of HAND are simple and users can intuitively understand the calculation results, this tool is expected to be an effective tool for estimating the criticality of fuel debris.

Journal Articles

Report on the IAEA Technical Meeting "Nuclear Data Processing"

Tada, Kenichi

Kaku Deta Nyusu (Internet), (122), p.9 - 21, 2019/02

This paper reports an overview of the IAEA technical meeting on nuclear data processing to Japanese researchers. At this technical meeting, the current status of nuclear data processing codes and their verification was presented.

Journal Articles

Development of next generation nuclear data processing code FRENDY

Tada, Kenichi

Robutsuri No Kenkyu (Internet), (71), 13 Pages, 2019/02

Nuclear data processing is very important as the link between the evaluated nuclear data library and the particle transport calculation codes. However, many nuclear engineers are not familiar with nuclear data processing. This paper gives an overview of nuclear data processing and our nuclear data processing code FRENDY, and lists references on nuclear data processing.

JAEA Reports

Nuclear data processing code FRENDY version 1

Tada, Kenichi; Kunieda, Satoshi; Nagaya, Yasunobu

JAEA-Data/Code 2018-014, 106 Pages, 2019/01


A new nuclear data processing code, FRENDY, has been developed in order to process the evaluated nuclear data library JENDL. The development of FRENDY helps to disseminate JENDL and various nuclear calculation codes. FRENDY is developed not only to process the evaluated nuclear data file but also to allow its functions to be implemented in other calculation codes. Users can easily use many functions, e.g., reading, writing, and processing an evaluated nuclear data file, in their own codes by incorporating the FRENDY classes. FRENDY is coded with consideration of maintainability, modularity, portability, and flexibility. The processing method of FRENDY is similar to that of NJOY. The current version of FRENDY treats the ENDF-6 format and generates the ACE file, which is used for continuous energy Monte Carlo codes such as PHITS and MCNP. This report describes the nuclear data processing methods and the input instructions for FRENDY.

Journal Articles

ACE library of JENDL-4.0/HE

Matsuda, Norihiro; Kunieda, Satoshi; Okamoto, Tsutomu*; Tada, Kenichi; Konno, Chikara

Progress in Nuclear Science and Technology (Internet), 6, p.225 - 229, 2019/01

Journal Articles

Implementation of random sampling for ACE-format cross sections using FRENDY and application to uncertainty reduction

Kondo, Ryoichi*; Endo, Tomohiro*; Yamamoto, Akio*; Tada, Kenichi

Proceedings of International Conference on Mathematics and Computational Methods applied to Nuclear Science and Engineering (M&C 2019) (CD-ROM), p.1493 - 1502, 2019/00

A perturbation capability for ACE-format cross section files was developed using the modules of FRENDY. Uncertainty quantification using MCNP was carried out for the Godiva critical experiment by the random sampling (RS) method. We verified the results of the RS method by comparing them with those obtained by conventional sensitivity analyses. Moreover, uncertainty reduction using the bias factor method with the RS technique was applied to a kinetic parameter, i.e., the neutron generation time.
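
The following is a conceptual sketch of the random sampling workflow the abstract refers to, not the authors' implementation: cross sections are perturbed according to an assumed covariance matrix, a placeholder stands in for the transport calculation with a perturbed ACE file, and the spread of the sampled $$k_{eff}$$ values gives the uncertainty. The covariance matrix, sensitivities, and sample size are hypothetical.

```python
# Conceptual sketch of the random sampling (RS) workflow, not the authors'
# implementation: perturb cross sections according to an assumed covariance,
# run a (placeholder) transport calculation per sample, and take the spread
# of k_eff as the uncertainty. Covariance, sensitivities, and sample size
# are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def sample_perturbation_factors(rel_cov, n_samples):
    """Draw multiplicative perturbation factors, 1 + relative perturbation."""
    mean = np.zeros(rel_cov.shape[0])
    return 1.0 + rng.multivariate_normal(mean, rel_cov, size=n_samples)

def run_transport(factors):
    """Placeholder for a transport run with a perturbed ACE file.
    A linear response model stands in so the sketch is runnable."""
    assumed_sensitivities = np.array([0.30, -0.10, 0.05])
    return 1.0 + assumed_sensitivities @ (factors - 1.0)

# Hypothetical 3-group relative covariance of one cross section.
rel_cov = np.array([[4.0e-4, 1.0e-4, 0.0],
                    [1.0e-4, 2.5e-4, 5.0e-5],
                    [0.0,    5.0e-5, 1.0e-4]])

samples = sample_perturbation_factors(rel_cov, n_samples=200)
k_eff = np.array([run_transport(f) for f in samples])
print(f"k_eff = {k_eff.mean():.5f} +/- {k_eff.std(ddof=1):.5f} (1 sigma)")
```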

Journal Articles

Coupled analysis of fuel debris distribution and recriticality by both multiphase/multicomponent flow and continuous energy neutron transport Monte Carlo simulations

Yamashita, Susumu; Tada, Kenichi; Yoshida, Hiroyuki; Suyama, Kenya

Nippon Genshiryoku Gakkai Wabun Rombunshi, 17(3/4), p.99 - 105, 2018/12

In order to reveal the melt relocation behavior of core internals phenomenologically and to reduce the uncertainties of the melt relocation analysis in existing severe accident analysis codes, JAEA has developed JUPITER, a numerical simulation code for melt relocation and accumulation behavior based on computational fluid dynamics. In this paper, to establish an estimation method for fuel debris composition and its re-criticality, we performed a simulation of melt accumulation and spreading in the pedestal region with JUPITER, and also performed a re-criticality analysis with MVP (Monte Carlo Codes for Neutron Transport Calculations based on Continuous Energy and Multi-group Methods) using the detailed fuel debris composition data obtained by JUPITER. From the coupled analysis of the fuel debris distribution by JUPITER and MVP, we obtained prospects for evaluating the possibility of re-criticality of fuel debris with a detailed fuel debris distribution.

Journal Articles

Report on the 30th Meeting of Working Party on International Nuclear Data Evaluation Co-operation

Iwamoto, Osamu; Iwamoto, Nobuyuki; Kimura, Atsushi; Yokoyama, Kenji; Tada, Kenichi

Kaku Deta Nyusu (Internet), (120), p.35 - 46, 2018/06

We report on the 30th WPEC meeting and the associated expert group and subgroup meetings held in Paris on May 14-18, 2018.

Journal Articles

Improvement of probability table generation using ladder method for a new nuclear data processing system FRENDY

Tada, Kenichi

Proceedings of Reactor Physics Paving the Way Towards More Efficient Systems (PHYSOR 2018) (USB Flash Drive), p.2929 - 2939, 2018/04

JAEA is developing a new nuclear data processing system, FRENDY. We investigated all processing methods and focused on probability table generation using the ladder method, which is adopted in the PURR module of NJOY. To improve the probability table generation, more sophisticated methods were introduced for the calculation of the chi-squared random numbers and the complex error function. We also investigated the appropriate ladder number. To investigate the impact of the difference between the complex error function calculation methods, the $$k_{eff}$$ values of benchmark experiments calculated with the probability tables from both methods were compared. The calculation results indicated that the appropriate ladder number is 100 and that the difference between the calculation methods of the chi-squared random numbers and the complex error function has no significant impact on the neutronics calculation.
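
As a hedged illustration of the two numerical ingredients named in the abstract, the sketch below samples chi-squared-distributed resonance widths (the Porter-Thomas case with one degree of freedom) and evaluates a Doppler-broadened line shape through the complex error (Faddeeva) function via scipy.special.wofz. It does not reproduce FRENDY's or PURR's algorithms, and all parameter values are invented for the example.

```python
# Illustrative sketch of the two numerical ingredients named in the abstract
# (not FRENDY's or PURR's algorithms): chi-squared random widths and the
# complex error (Faddeeva) function for a Doppler-broadened line shape.
# All parameter values are invented for the example.
import numpy as np
from scipy.special import wofz   # Faddeeva function w(z) = exp(-z^2) * erfc(-iz)

rng = np.random.default_rng(42)

def sample_widths(mean_width, dof, n):
    """Chi-squared-distributed widths with `dof` degrees of freedom,
    normalized so that the sample mean tends to the averaged width."""
    return mean_width * rng.chisquare(dof, n) / dof

def doppler_line_shape(x, sigma, gamma):
    """Voigt profile evaluated through the Faddeeva function."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

neutron_widths = sample_widths(mean_width=0.1, dof=1, n=5)   # Porter-Thomas case
print("sampled neutron widths [eV]:", neutron_widths)
print("line shape at the resonance peak:", doppler_line_shape(0.0, sigma=0.5, gamma=0.05))
```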

JAEA Reports

Analysis of post irradiation examination of used BWR fuel with SWAT4.0

Kikuchi, Takeo; Tada, Kenichi; Sakino, Takao; Suyama, Kenya

JAEA-Research 2017-021, 56 Pages, 2018/03


The criticality management of fuel debris is one of the most important research issues in Japan. The current criticality management adopts the fresh fuel assumption. Adopting the fresh fuel assumption for the criticality control of fuel debris is difficult because the $$k_{eff}$$ of the fuel debris could exceed 1.0 in most cases in which the fuel debris contains water and does not contain neutron absorbers such as gadolinium. Therefore, the adoption of burnup credit is considered. Sufficient prediction accuracy of the isotopic composition of used nuclear fuel is required to adopt burnup credit for the treatment of fuel debris. JAEA developed the burnup calculation code SWAT4.0 to obtain reference calculation results of the isotopic composition of used nuclear fuel. This code is used to evaluate the composition of fuel debris. In order to investigate the prediction accuracy of SWAT4.0, we analyzed post irradiation examination (PIE) data of BWR fuel obtained from 2F2DN23.

Journal Articles

Cutting-edge studies on nuclear data for continuous and emerging need, 6; Processing and validation of nuclear data

Tada, Kenichi; Kosako, Kazuaki*; Yokoyama, Kenji; Konno, Chikara

Nippon Genshiryoku Gakkai-Shi, 60(3), p.168 - 172, 2018/03

Neutronics calculation codes cannot treat evaluated nuclear data files directly; nuclear data processing is required to use a nuclear data file in neutronics calculation codes. Nuclear data processing is not just format conversion but also comprises many processes that evaluate the physical quantities needed by the neutronics calculation codes. In this paper, we describe an overview of nuclear data processing and the validation of nuclear data.

Journal Articles

Analysis of used BWR fuel assay data with the integrated burnup code system SWAT4.0

Tada, Kenichi; Kikuchi, Takeo*; Sakino, Takao; Suyama, Kenya

Journal of Nuclear Science and Technology, 55(2), p.138 - 150, 2018/02

Times Cited Count: 1; Percentile: 72.34 (Nuclear Science & Technology)

The criticality safety of fuel debris in the Fukushima Daiichi Nuclear Power Plant is one of the most important issues, and the adoption of burnup credit is desired for the criticality analysis. Assay data of used nuclear fuel irradiated in 2F2 are evaluated to validate SWAT4.0 for the BWR fuel burnup problem. The calculation results revealed that the number densities of many heavy nuclides and FPs show good agreement with the experimental data, except for $$^{235}$$U, $$^{237}$$Np, $$^{238}$$Pu, and Sm isotopes. The causes of the differences are the assumptions on the initial number density and void ratio and an overestimation of the capture cross section of $$^{237}$$Np. The C/E-1 values do not depend on the type of fuel rod (UO$$_{2}$$ or UO$$_{2}$$-Gd$$_{2}$$O$$_{3}$$) and are similar to those for PWR fuel. These results indicate that SWAT4.0 appropriately analyzes the isotopic composition of BWR fuel and has sufficient accuracy to be adopted in the burnup credit evaluation of fuel debris.
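
For readers unfamiliar with the C/E-1 measure used above, the small sketch below computes it for a few number densities; the nuclide values are hypothetical and are not taken from the assay data.

```python
# Small sketch of the C/E - 1 measure quoted above: the relative deviation of
# a calculated (C) number density from the measured (E) one.
# The nuclide values are hypothetical and are not taken from the assay data.
def c_over_e_minus_1(calculated, experimental):
    return calculated / experimental - 1.0

calc = {"U-235": 1.02e-4, "Pu-239": 5.1e-5, "Sm-149": 1.9e-6}   # assumed C, atoms/(barn*cm)
meas = {"U-235": 1.00e-4, "Pu-239": 5.0e-5, "Sm-149": 2.0e-6}   # assumed E, atoms/(barn*cm)

for nuclide in calc:
    print(f"{nuclide}: C/E-1 = {c_over_e_minus_1(calc[nuclide], meas[nuclide]):+.1%}")
```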

Journal Articles

FRENDY; A new nuclear data processing system being developed at JAEA

Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio

EPJ Web of Conferences (Internet), 146, p.02028_1 - 02028_5, 2017/09

Times Cited Count: 0; Percentile: 100

JAEA has started to develop a new nuclear data processing system, FRENDY (FRom Evaluated Nuclear Data librarY to any application). In this presentation, the outline of the development of FRENDY is presented, and the functions and performance of FRENDY are demonstrated by the generation and validation of continuous energy cross section data libraries for the MVP, PHITS, and MCNP codes.

Journal Articles

Important comments on KERMA factors and DPA cross-section data in ACE files of JENDL-4.0, JEFF-3.2 and ENDF/B-VII.1

Konno, Chikara; Tada, Kenichi; Kwon, Saerom*; Ota, Masayuki*; Sato, Satoshi*

EPJ Web of Conferences (Internet), 146, p.02040_1 - 02040_4, 2017/09

Times Cited Count: 0; Percentile: 100

We have so far pointed out that the KERMA factors and DPA cross-section data of many nuclides in the official ACE files differ among nuclear data libraries for the following reasons: (1) incorrect nuclear data, (2) NJOY bugs, (3) huge helium production cross-section data, (4) mf6 mt102 data, and (5) no secondary particle data (energy-angular distribution data). Here we compare the KERMA factors and DPA cross-section data included in the official ACE files of JENDL-4.0, ENDF/B-VII.1, and JEFF-3.2 in more detail. As a result, we find new reasons for the differences among the KERMA factors and DPA cross-section data in the three nuclear data libraries. The reasons are categorized as no secondary charged particle data, no secondary $$\gamma$$ data, wrong secondary $$\gamma$$ spectra, wrong production yields, and mf12-15 mt3 data for the capture reaction, some of which seem to be unsupported by NJOY. The ACE files of JENDL-4.0, ENDF/B-VII.1, and JEFF-3.2 with these problems should be revised based on this study.
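
To illustrate why missing secondary-particle data distorts a KERMA factor, the sketch below assembles one by the energy-balance idea (locally deposited energy equals the available energy minus the energy carried away by secondary neutrons and photons, summed over reactions weighted by their cross sections) for a single hypothetical reaction, with and without secondary photon data. The numbers and the simplified bookkeeping are assumptions for the example, not the actual ACE processing.

```python
# Conceptual sketch (not the actual ACE processing) of why missing secondary
# particle data distorts a KERMA factor obtained by energy balance:
# locally deposited energy = available energy - energy carried away by
# secondary neutrons and photons, summed over reactions weighted by their
# cross sections. All numbers are hypothetical.
def kerma_factor(reactions):
    """reactions: iterable of (sigma_barn, e_available, e_neutron_out, e_photon_out)
    with energies in eV; returns the KERMA factor in eV*barn."""
    return sum(sigma * (e_avail - e_n - e_g)
               for sigma, e_avail, e_n, e_g in reactions)

# The same hypothetical reaction, with and without secondary photon data.
with_photon_data    = [(2.0, 5.0e6, 3.0e6, 1.5e6)]
without_photon_data = [(2.0, 5.0e6, 3.0e6, 0.0)]   # e.g. missing mf12-15 data

print("KERMA with photon data    [eV*barn]:", kerma_factor(with_photon_data))
print("KERMA without photon data [eV*barn]:", kerma_factor(without_photon_data))
```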

Journal Articles

Development and verification of a new nuclear data processing system FRENDY

Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio

Journal of Nuclear Science and Technology, 54(7), p.806 - 817, 2017/07


Times Cited Count: 6; Percentile: 22.67 (Nuclear Science & Technology)

JAEA has developed the evaluated nuclear data library JENDL and several nuclear analysis codes such as MARBLE2, SRAC, MVP, and PHITS. Although JENDL and these computer codes have been widely used in many countries, a nuclear data processing system to generate the data libraries for application programs had not been developed in Japan, and foreign nuclear data processing systems, e.g., NJOY and PREPRO, were used. To process new libraries for JAEA's computer codes immediately and independently, JAEA started to develop the new nuclear data processing system FRENDY in 2013. In this paper, the outline, functions, and verification of FRENDY are described.
