Training Neural Network Potentials for Atomistic Calculations on Carbon Materials: An initial study on diamond structures
Magnusson, Jens
Luleå University of Technology, Department of Engineering Sciences and Mathematics.
2018 (English). Independent thesis Advanced level (professional degree), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Machine Learning (ML), and in particular Neural Networks (NNs), is growing in popularity across numerous application areas. One such area is the use of a trained NN as an interatomic potential in Atomistic Simulations (AS); a NN applied in this manner is referred to as a Neural Network Potential (NNP).

A well-established method for atomistic calculations is the first-principles Density Functional Theory (DFT). DFT can model the properties of nanomaterials very precisely, but for large systems of atoms it is not feasible because of its heavy computational load. NNPs enable accurate simulations of large systems of atoms at a reasonably low computational cost.

Previous work by students at Luleå University of Technology (LTU), in which NNs were trained on fullerenes and carbon nanotubes (CNTs), showed promising results. The NNs, trained with the Atomistic Machine-Learning Package (AMP), predicted energies with considerable accuracy (100 meV/atom), but the force predictions were problematic and did not reach the desired accuracy (the force RMSE reached was 6 eV/Å). Attempts to run AS such as Molecular Dynamics (MD) and Geometry Optimization (GO) were unsuccessful, likely due to the poor representation of forces.

This work aims to improve the performance of NNs on carbon materials by studying diamond structures using AMP, so that working AS can be achieved. This was done in two stages: first, a feasibility study was carried out to find appropriate hyperparameters; then NNs were trained with the hyperparameters found. Two types of feature-mapping descriptors were considered, Gaussian and Zernike. The trained NNs were used as NNPs to perform MD and GO simulations as a means of evaluation. The NNPs were also used to calculate the phonon dispersion curve of diamond.

The NNPs trained in this work managed to perform AS and calculate the phonon dispersion curve with varying success. The best-performing NN, trained on 333 super-cells of diamond, reached an accuracy of 120 meV/atom when predicting energies and 640 meV/Å when predicting forces. An NNP trained with Gaussian descriptors turned out to be 10 times faster than the reference DFT calculation when performing a single step of a GO. The phonon dispersion curve produced by the Gaussian NNP showed a striking resemblance to the DFT reference, whereas the curves produced by the Zernike NNP were distorted and contained a great deal of imaginary frequencies, although the correct amplitude was reached.

The Gaussian NNPs trained in this work turned out to be faster and better in almost all regards than the Zernike alternative. The only case where Zernike outperformed Gaussian descriptors was the total energy reached in a GO simulation applying the NNPs from the study: compared to the DFT result, the Zernike error was 0.26 eV (0.05%) and the Gaussian error was 0.855 eV (0.17%). MD simulations using the NNPs worked well for the Gaussian variant but not for the Zernike one.

With the AS up and running (at least for the Gaussian NNP), the next step is either to improve the performance on diamond structures or to include more carbon materials in the studies, such as CNTs and fullerenes.
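To make the workflow above concrete, the sketch below shows how an NNP of this kind can be trained with AMP and then used as an ASE calculator in a geometry optimization. The cutoff, hidden-layer sizes, convergence targets, file names and lattice constant are illustrative assumptions, not the settings used in the thesis; for the Zernike variant, the Gaussian descriptor would be swapped for AMP's Zernike descriptor with the rest of the workflow unchanged.

    # A minimal sketch, assuming illustrative hyperparameters and file names
    # (cutoff, hidden-layer sizes, convergence targets, training file and
    # lattice constant are NOT taken from the thesis).
    from amp import Amp
    from amp.descriptor.gaussian import Gaussian
    from amp.model.neuralnetwork import NeuralNetwork
    from amp.model import LossFunction
    from ase.build import bulk
    from ase.optimize import BFGS

    # 1) Train an AMP calculator on DFT-labelled structures (energies and forces).
    calc = Amp(descriptor=Gaussian(cutoff=6.5),               # assumed cutoff (Angstrom)
               model=NeuralNetwork(hiddenlayers=(10, 10)),    # assumed network size
               label='diamond')
    calc.model.lossfunction = LossFunction(
        convergence={'energy_rmse': 0.01, 'force_rmse': 0.1}) # assumed targets
    calc.train(images='diamond_dft_images.traj')              # assumed training file

    # 2) Use the trained NNP as a calculator in an ASE geometry optimization.
    atoms = bulk('C', 'diamond', a=3.57)                      # 2-atom primitive diamond cell
    atoms.set_calculator(Amp.load('diamond.amp'))             # parameters saved under the label
    BFGS(atoms).run(fmax=0.05)                                 # relax until max force < 0.05 eV/Angstrom
    print(atoms.get_potential_energy())

The same trained calculator can be attached to larger supercells for MD runs or phonon calculations; only the descriptor choice and training images change between the Gaussian and Zernike studies.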

Place, publisher, year, edition, pages
2018, p. 40
Keywords [en]
Machine Learning, Neural Networks, Atomistic Simulations, Neural Network Potentials, Density Functional Theory.
National Category
Condensed Matter Physics; Other Computer and Information Science
Identifiers
URN: urn:nbn:se:ltu:diva-70100
OAI: oai:DiVA.org:ltu-70100
DiVA, id: diva2:1233205
Educational program
Engineering Physics and Electrical Engineering, master's level
Available from: 2018-08-14. Created: 2018-07-16. Last updated: 2018-08-14. Bibliographically approved.

Open Access in DiVA

No full text in DiVA
