


PHYSICAL REVIEW LETTERS 120, 145301 (2018)

Crystal Graph Convolutional Neural Networks for an Accurate
and Interpretable Prediction of Material Properties
Tian Xie and Jeffrey C. Grossman
Department of Materials Science and Engineering, Massachusetts Institute of Technology,
Cambridge, Massachusetts 02139, USA
(Received 18 October 2017; revised manuscript received 15 December 2017; published 6 April 2018)

The use of machine learning methods for accelerating the design of crystalline materials usually requires manually constructed feature vectors or complex transformations of atom coordinates to input the crystal structure, which either constrains the model to certain crystal types or makes it difficult to provide chemical insights. Here, we develop a crystal graph convolutional neural networks framework to directly learn material properties from the connection of atoms in the crystal, providing a universal and interpretable representation of crystalline materials. Our method provides highly accurate predictions of density functional theory calculated values for eight different properties of crystals with various structure types and compositions after being trained with 10^4 data points. Further, our framework is interpretable because one can extract the contributions from local chemical environments to global properties. Using an example of perovskites, we show how this information can be utilized to discover empirical rules for materials design.

DOI: 10.1103/PhysRevLett.120.145301
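The "connection of atoms" mentioned in the abstract is encoded as a multigraph: nodes carry atom feature vectors, and the same pair of atoms may be linked by several parallel edges through periodic images. As a rough, hypothetical sketch of that data structure only (our illustration, not the CGCNN implementation; all names are ours):

```python
from collections import defaultdict

def build_crystal_graph(atom_features, bonds):
    """Toy crystal graph: nodes hold per-atom feature vectors, and a
    multigraph edge map allows several bonds (k = 0, 1, ...) between
    the same pair of atoms, as happens under periodic boundary conditions."""
    graph = {"nodes": list(atom_features), "edges": defaultdict(list)}
    for i, j, bond_feature in bonds:
        # Store the k-th bond between atoms i and j (undirected: both directions).
        graph["edges"][(i, j)].append(bond_feature)
        graph["edges"][(j, i)].append(bond_feature)
    return graph

# Two atoms in a unit cell connected twice through different periodic images:
g = build_crystal_graph(
    atom_features=[[1.0, 0.0], [0.0, 1.0]],   # per-atom feature vectors v_i
    bonds=[(0, 1, [0.5]), (0, 1, [0.9])],     # two distinct bonds (i, j)_k
)
print(len(g["edges"][(0, 1)]))  # -> 2 parallel edges between the same node pair
```

The parallel-edge list is what distinguishes this from an ordinary molecular graph, where a node pair is connected at most once.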

0031-9007/18/120(14)/145301(6)    © 2018 American Physical Society

Machine learning (ML) methods are becoming increasingly popular in accelerating the design of new materials by predicting material properties with accuracy close to ab initio calculations, but with computational speeds orders of magnitude faster [1–3]. The arbitrary size of crystal systems poses a challenge, as they need to be represented as a fixed-length vector in order to be compatible with most ML algorithms. This problem is usually resolved by manually constructing fixed-length feature vectors using simple material properties [1,3–6] or by designing symmetry-invariant transformations of atom coordinates [7–9]. However, the former requires a case-by-case design for predicting different properties, and the latter makes it hard to interpret the models as a result of the complex transformations.

In this Letter, we present a generalized crystal graph convolutional neural networks (CGCNN) framework for representing periodic crystal systems that provides both material property prediction with density functional theory (DFT) accuracy and atomic-level chemical insights. Recent advances in "deep learning" have enabled learning from a very raw representation of data, e.g., pixels of an image, making it possible to build general models that outperform traditionally expert-designed representations [10]. By looking into the simplest form of crystal representation, i.e., the connection of atoms in the crystal, we directly build convolutional neural networks on top of crystal graphs generated from crystal structures. The CGCNN achieves an accuracy relative to DFT calculations that is comparable to the accuracy of DFT relative to experimental data for eight different properties after being trained with data from the Materials Project [11], indicating the generality of this method. We also demonstrate the interpretability of the CGCNN by extracting the energy of each site in the perovskite structure from the total energy, an example of learning the contribution of local chemical environments to the global property. The empirical rules generalized from the results are consistent with the common knowledge for discovering more stable perovskites and can significantly reduce the search space for high throughput screening.

The main idea in our approach is to represent the crystal structure by a crystal graph that encodes both atomic information and bonding interactions between atoms, and then build a convolutional neural network on top of the graph to automatically extract representations that are optimum for predicting target properties by training with DFT calculated data. As illustrated in Fig. 1(a), a crystal graph G is an undirected multigraph defined by nodes representing atoms and edges representing connections between atoms in a crystal (the method for determining atom connectivity is explained in the Supplemental Material [12]). The crystal graph is unlike normal graphs since it allows multiple edges between the same pair of end nodes, a characteristic of crystal graphs due to their periodicity, in contrast to molecular graphs. Each node i is represented by a feature vector v_i, encoding the property of the atom corresponding to node i. Similarly, each edge (i, j)_k is represented by a feature vector u_(i,j)_k corresponding to the kth bond connecting atom i and atom j.

The convolutional neural networks built on top of the crystal graph consist of two major components: convolutional layers and pooling layers. Similar architectures have been used for computer vision [22], natural language


processing [23], molecular fingerprinting [24], and general graph-structured data [25,26], but not, to the best of our knowledge, for crystal property prediction. The convolutional layers iteratively update the atom feature vector v_i by "convolution" with surrounding atoms and bonds through a nonlinear graph convolution function,

    v_i^(t+1) = Conv( v_i^(t), v_j^(t), u_(i,j)_k ),   (i, j)_k ∈ G.   (1)

After R convolutions, the network automatically learns the feature vector v_i^(R) for each atom by iteratively including its surrounding environment. The pooling layer is then used for producing an overall feature vector v_c for the crystal, which can be represented by a pooling function,

    v_c = Pool( v_0^(0), v_1^(0), ..., v_N^(0), ..., v_N^(R) )   (2)

that satisfies permutational invariance with respect to atom indexing and size invariance with respect to unit cell choice. In this work, a normalized summation is used as the pooling function for simplicity, but other functions can also be used. In addition to the convolutional and pooling layers, two fully connected hidden layers with depths L1 and L2 are added to capture the complex mapping between crystal structure and property. Finally, an output layer is used to connect the L2 hidden layer to predict the target property ŷ.

The training is performed by minimizing the difference between the predicted property ŷ and the DFT calculated property y, defined by a cost function J(y, ŷ). The whole CGCNN can be considered as a function f parametrized by weights W that maps a crystal C to the target property ŷ. Using backpropagation and stochastic gradient descent (SGD), we can solve the following optimization problem by iteratively updating the weights with DFT calculated data:

    min_W J( y, f(C; W) );   (3)

the learned weights can then be used to predict material properties and provide chemical insights for future materials design.

FIG. 1. Illustration of the crystal graph convolutional neural networks. (a) Construction of the crystal graph. Crystals are converted to graphs with nodes representing atoms in the unit cell and edges representing atom connections. Nodes and edges are characterized by vectors corresponding to the atoms and bonds in the crystal, respectively. (b) Structure of the convolutional neural network on top of the crystal graph. R convolutional layers and L1 hidden layers are built on top of each node, resulting in a new graph with each node representing the local environment of each atom. After pooling, a vector representing the entire crystal is connected to L2 hidden layers, followed by the output layer to provide the prediction.

In the Supplemental Material (SM) [12], we use a simple example to illustrate how a CGCNN composed of one linear convolution layer and one pooling layer can differentiate two crystal structures. With multiple convolution layers, pooling layers, and hidden layers, the CGCNN can extract any structure differences based on the atom connections and discover the underlying relations between structure and property.

To demonstrate the generality of the CGCNN, we train the model using calculated properties from the Materials Project [11]. We focus on two types of generality in this work: (1) the structure types and chemical compositions to which our model can be applied and (2) the number of properties that our model can accurately predict.

The database we used includes a diverse set of inorganic crystals ranging from simple metals to complex minerals. After removing ill-converged crystals, the full database has 46 744 materials covering 87 elements, 7 lattice systems, and 216 space groups. As shown in Fig. 2(a), the materials consist of as many as seven different elements, with 90% of them binary, ternary, and quaternary compounds. The number of atoms in the primitive cell ranges from 1 to 200, and 90% of the crystals have fewer than 60 atoms (Fig. S2). Considering that most of the crystals originate from the Inorganic Crystal Structure Database [27], this database is a good representation of known stoichiometric inorganic crystals.

The CGCNN is a flexible framework that allows variance in the crystal graph representation, neural network architecture, and training process, resulting in different f in Eq. (3) and different prediction performance. To choose the best model, we apply a train-validation scheme to optimize the prediction of the formation energies of crystals. Each model is trained with 60% of the data and then validated with 20% of the data, and the best-performing model on the validation set is selected. In our study, we find that the neural network architecture, especially the form of the convolution function in Eq. (1), has the largest impact on prediction performance. We start with a simple convolution function,

    v_i^(t+1) = g[ ( Σ_{j,k} ( v_j^(t) ⊕ u_(i,j)_k ) ) W_c^(t) + v_i^(t) W_s^(t) + b^(t) ],   (4)

where ⊕ denotes concatenation of atom and bond feature vectors; W_c^(t), W_s^(t), and b^(t) are the convolution weight matrix, self-weight matrix, and bias of the tth layer, respectively; and g is the activation function for introducing nonlinear coupling between layers. By optimizing the hyperparameters in Table S1, the lowest mean absolute error (MAE) for the validation set is 0.108 eV/atom. One limitation of Eq. (4) is that it uses a shared convolution weight matrix W_c^(t) for all neighbors of i, which neglects the differences in interaction strength between neighbors. To overcome this problem, we design a new convolution function that first concatenates neighbor vectors, z_(i,j)_k^(t) = v_i^(t) ⊕ v_j^(t) ⊕ u_(i,j)_k, and then performs the convolution by

    v_i^(t+1) = v_i^(t) + Σ_{j,k} σ( z_(i,j)_k^(t) W_f^(t) + b_f^(t) ) ⊙ g( z_(i,j)_k^(t) W_s^(t) + b_s^(t) ),   (5)

where ⊙ denotes element-wise multiplication and σ denotes a sigmoid function. In Eq. (5), σ(·) functions as a learned weight matrix that differentiates the interactions between neighbors, and adding v_i^(t) makes learning deeper networks easier [29]. We achieve an MAE on the validation set of 0.039 eV/atom using the modified convolution function, a significant improvement compared to Eq. (4). In Fig. S3, we compare the effects of several other hyperparameters on the MAE, which are much smaller than the effect of the convolution function.

Figures 2(b) and 2(c) show the performance of the two models on 9350 test crystals for predicting the formation energy per atom. We find a systematic decrease of the MAE of the predicted values compared with DFT calculated values for both convolution functions as the number of training data is increased. The best MAEs we achieved with Eqs. (4) and (5) are 0.136 and 0.039 eV/atom, respectively, and 90% of the crystals are predicted within 0.3 and 0.08 eV/atom errors. In comparison, Kirklin et al. [28] report that the MAE of DFT calculations with respect to experimental measurements in the Open Quantum Materials Database is 0.081–0.136 eV/atom, depending on whether the energies of the elemental reference states are fitted, although they also find a large MAE of 0.082 eV/atom between different sources of experimental data. Given this comparison, our CGCNN approach provides a reliable estimation of DFT calculations and can potentially be applied to predict properties calculated by more accurate methods like GW [30] and quantum Monte Carlo calculations [31].

FIG. 2. Performance of CGCNN on the Materials Project database [11]. (a) Histogram representing the distribution of the number of elements in each crystal. (b) Mean absolute error as a function of the number of training crystals for predicting formation energy per atom using different convolution functions. The shaded area denotes the MAEs of DFT calculations compared with experiments [28]. (c) 2D histogram representing the predicted formation energy per atom against the DFT calculated value. (d) Receiver operating characteristic curve visualizing the result of metal-semiconductor classification. It plots the proportion of correctly identified metals (true positive rate) against the proportion of wrongly identified semiconductors (false positive rate) under different thresholds.

After establishing the generality of the CGCNN with respect to the diversity of crystals, we next explore its prediction performance for different material properties. We apply the same framework to predict the absolute energy, band gap, Fermi energy, bulk moduli, shear moduli, and Poisson ratio of crystals using DFT calculated data from the Materials Project [11]. The prediction performance of Eq. (5) is improved compared to Eq. (4) for all six properties (Table S4). We summarize the performance in Table I and the corresponding 2D histograms in Fig. S4. As we can see, the MAEs of our model are close to or higher than the DFT accuracy relative to experiments for most properties when ~10^4 training data are used. For elastic properties, the errors are higher since less data are available, and accuracy comparable to that of DFT relative to experiments can be expected if ~10^4 training data were available (Fig. S5).

TABLE I. Summary of the prediction performance of seven different properties on test sets.

Property           # of train data   Unit       MAE_model   MAE_DFT
Formation energy   28 046            eV/atom    0.039       0.081–0.136 [28]
Absolute energy    28 046            eV/atom    0.072       …
Band gap           16 458            eV         0.388       0.6 [32]
Fermi energy       28 046            eV         0.363       …
Bulk moduli        2041              log(GPa)   0.054       0.050 [13]
Shear moduli       2041              log(GPa)   0.087       0.069 [13]
Poisson ratio      2041              …          0.030       …
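To make the gated convolution of Eq. (5) concrete, here is a minimal NumPy sketch of a single update step. This is our illustration, not the reference implementation: the nonlinearity g is taken to be tanh purely for the example, the weights are random, and all names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv_step_eq5(v, u, edges, W_f, b_f, W_s, b_s):
    """One gated graph-convolution step in the spirit of Eq. (5).

    v: (N, F_a) atom feature vectors; u: dict (i, j, k) -> (F_b,) bond features;
    edges: directed edges (i, j, k); W_f, W_s: (2*F_a + F_b, F_a) weight matrices."""
    v_new = v.copy()
    for i, j, k in edges:
        z = np.concatenate([v[i], v[j], u[(i, j, k)]])  # z = v_i + v_j + u_(i,j)_k concatenated
        gate = sigmoid(z @ W_f + b_f)                   # sigma(.): learned per-neighbor weighting
        core = np.tanh(z @ W_s + b_s)                   # g(.): message (tanh chosen for this sketch)
        v_new[i] += gate * core                         # residual update of Eq. (5)
    return v_new

rng = np.random.default_rng(0)
v = rng.normal(size=(3, 4))                                          # 3 atoms, 4 features each
u = {(0, 1, 0): rng.normal(size=2), (0, 1, 1): rng.normal(size=2)}   # two parallel bonds
edges = [(0, 1, 0), (0, 1, 1)]                                       # only atom 0 receives messages
W_f, W_s = rng.normal(size=(10, 4)), rng.normal(size=(10, 4))
b_f, b_s = np.zeros(4), np.zeros(4)
out = conv_step_eq5(v, u, edges, W_f, b_f, W_s, b_s)
print(out.shape)  # -> (3, 4)
```

Because the sigmoid gate depends on the concatenated pair vector z, each neighbor contributes with its own learned weight, which is exactly the shared-W_c limitation of Eq. (4) that Eq. (5) removes.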
Recently, De Jong et al. [33] developed a statistical learning (SL) framework using multivariate local regression on crystal descriptors to predict elastic properties using the same data from the Materials Project. Using the same number of training data, our model achieves root mean squared errors (RMSE) on the test sets of 0.105 log(GPa) and 0.127 log(GPa) for the bulk and shear moduli, which is similar to the RMSE of SL on the entire data set, 0.0750 log(GPa) and 0.1378 log(GPa). Comparing the two methods, the CGCNN predicts properties by extracting features only from the crystal structure, while SL depends on crystal descriptors like cohesive energy and volume per atom. Recently, 1585 new crystals with elastic properties were uploaded to the Materials Project database. Our model in Table I achieves an MAE of 0.077 log(GPa) for bulk moduli and 0.114 log(GPa) for shear moduli on these crystals, showing good generalization to materials from potentially different crystal groups.

In addition to predicting continuous properties, the CGCNN can also predict discrete properties by changing the output layer. By using a softmax activation function for the output layer and a cross-entropy cost function, we can predict the classification of metals and semiconductors within the same framework. In Fig. 2(d), we show the receiver operating characteristic curve of the prediction on 9350 test crystals. Excellent prediction performance is achieved, with an area under the curve of 0.95. By choosing a threshold of 0.5, we get a metal prediction accuracy of 0.80, a semiconductor prediction accuracy of 0.95, and an overall prediction accuracy of 0.90.

Model interpretability is a desired property for any ML algorithm applied in materials science, because it can provide additional information for material design, which may be more valuable than simply screening a large number of materials. However, nonlinear functions are needed to learn the complex structure-property relations, resulting in ML models that are difficult to interpret. The CGCNN resolves this dilemma by separating the convolution and pooling layers. After the R convolutional and L1 hidden layers, we map the last atom feature vector v_i^(R) to a scalar ṽ_i and perform a linear pooling to predict the target property directly, without the L2 hidden layers (details discussed in SM [12]). Therefore, we can learn the contribution of different local chemical environments, represented by ṽ_i for each atom, to the target property, while maintaining a model with high capacity to ensure the prediction performance.

We demonstrate how this local-chemical-environment information can be used to provide chemical insights and guide material design with a specific example: learning the energy of each site in perovskites from the total energy above hull data. Perovskite is a crystal structure type with the form ABX3, where the site A atom sits at a corner position, the site B atom sits at a body-centered position, and the site X atoms sit at face-centered positions [Fig. 3(a)]. The database [34] we use includes the energy above hull of 18 928 perovskite crystals, in which the A and B sites can be any nonradioactive metals and the X sites can be one or several elements from O, N, S, and F. We use the CGCNN with a linear pooling to predict the total energy above hull of the perovskites in the database, using Eq. (4) as the convolution function. The resulting MAE on 3787 test perovskites is 0.130 eV/atom, as shown in Fig. 3(b), which is slightly higher than using a complete pooling layer and L2 hidden layers (0.099 eV/atom, as shown in Fig. S6) due to the additional constraints introduced by the simplified pooling layer. However, this CGCNN allows us to learn the energy of each site in the crystal while training with the total energy above hull, providing additional insights for material design.

FIG. 3. Extraction of the site energy of perovskites from the total formation energy. (a) Structure of perovskites. (b) 2D histogram representing the predicted total energy above hull against the DFT calculated value. (c),(d) Periodic table with the color of each element representing the mean of the site energy when the element occupies the A site (c) or B site (d).

Figures 3(c) and 3(d) visualize the mean of the predicted site energies when each element occupies the A and B sites, respectively. The most stable elements occupying the A site are those with large radii, due to the space needed for the 12-fold coordination. In contrast, elements with small radii like Be, B, and Si are the most unstable in the A site. For the B site, elements in groups 4, 5, and 6 are the most stable throughout the periodic table. This can be explained by crystal field theory, since the d-electron configurations of these elements favor the octahedral coordination of the B site. Interestingly, the visualization shows that large atoms from groups 13–15 are also stable in the A site, in addition to the well-known region of group 1–3 elements. Inspired by this result, we applied a combinatorial search for stable perovskites using elements from groups 13–15 as the A site and groups 4–6 as the B site. Because of the theoretical inaccuracies of DFT calculations and the possibility of metastable phases that can be stabilized by temperature, defects, and substrates, many synthesizable inorganic crystals have positive calculated energies above hull at 0 K. Some metastable nitrides can even have energies up to 0.2 eV/atom above hull as a result of the strong bonding interactions [35]. In this work, since some of the perovskites are also nitrides, we choose to set the cutoff energy for potential synthesizability at 0.2 eV/atom. We discovered 33 perovskites that fall within this threshold out of 378 in the entire data set, among which 8 are within the cutoff out of 58 in the test set (Table S5). Many of these compounds, like PbTiO3 [36], PbZrO3 [36], SnTaO3 [37], and PbMoO3 [38], have been experimentally synthesized. Note that PbMoO3 has a calculated energy of 0.18 eV/atom above hull, indicating that our choice of cutoff energy is reasonable. In general, chemical insights gained from the CGCNN can significantly reduce the search space for high throughput screening. In comparison, there are only 228 potentially synthesizable perovskites out of 18 928 in our database: the chemical insight increased the search efficiency by a factor of 7.

In summary, the crystal graph convolutional neural networks present a flexible machine learning framework for material property prediction and design knowledge extraction. The framework provides a reliable estimation of DFT calculations using around 10^4 training data for eight properties of inorganic crystals with diverse structure types and compositions. As an example of knowledge extraction, we apply this approach to the design of new perovskite materials and show that the information extracted from the model is consistent with common chemical insights and significantly reduces the search space for high throughput screening. The code for the CGCNN is available from Ref. [39].

This work was supported by Toyota Research Institute. Computational support was provided through the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231, and the Extreme Science and Engineering Discovery Environment, supported by National Science Foundation Grant No. ACI-1053575.

[1] A. Seko, A. Togo, H. Hayashi, K. Tsuda, L. Chaput, and I. Tanaka, Phys. Rev. Lett. 115, 205901 (2015).
[2] F. A. Faber, A. Lindmaa, O. A. von Lilienfeld, and R. Armiento, Phys. Rev. Lett. 117, 135502 (2016).
[3] D. Xue, P. V. Balachandran, J. Hogden, J. Theiler, D. Xue, and T. Lookman, Nat. Commun. 7, 11241 (2016).
[4] O. Isayev, C. Oses, C. Toher, E. Gossett, S. Curtarolo, and A. Tropsha, Nat. Commun. 8, 15679 (2017).
[5] L. M. Ghiringhelli, J. Vybiral, S. V. Levchenko, C. Draxl, and M. Scheffler, Phys. Rev. Lett. 114, 105503 (2015).
[6] O. Isayev, D. Fourches, E. N. Muratov, C. Oses, K. Rasch, A. Tropsha, and S. Curtarolo, Chem. Mater. 27, 735 (2015).
[7] K. T. Schütt, H. Glawe, F. Brockherde, A. Sanna, K. R. Müller, and E. K. U. Gross, Phys. Rev. B 89, 205118 (2014).
[8] F. Faber, A. Lindmaa, O. A. von Lilienfeld, and R. Armiento, Int. J. Quantum Chem. 115, 1094 (2015).
[9] A. Seko, H. Hayashi, K. Nakayama, A. Takahashi, and I. Tanaka, Phys. Rev. B 95, 144110 (2017).
[10] Y. LeCun, Y. Bengio, and G. Hinton, Nature (London) 521, 436 (2015).
[11] A. Jain, S. P. Ong, G. Hautier, W. Chen, W. D. Richards, S. Dacek, S. Cholia, D. Gunter, D. Skinner, G. Ceder et al., APL Mater. 1, 011002 (2013).
[12] See Supplemental Material at http://link.aps.org/supplemental/10.1103/PhysRevLett.120.145301 for further details, which includes Refs. [4,13–21].
[13] M. De Jong, W. Chen, T. Angsten, A. Jain, R. Notestine, A. Gamst, M. Sluiter, C. K. Ande, S. Van Der Zwaag, J. J. Plata et al., Sci. Data 2, 150009 (2015).
[14] R. Sanderson, Science 114, 670 (1951).
[15] R. Sanderson, J. Am. Chem. Soc. 74, 4792 (1952).
[16] B. Cordero, V. Gómez, A. E. Platero-Prats, M. Revés, J. Echeverría, E. Cremades, F. Barragán, and S. Alvarez, Dalton Trans. 21, 2832 (2008).
[17] A. Kramida, Y. Ralchenko, J. Reader et al., Atomic Spectra Database (National Institute of Standards and Technology, Gaithersburg, MD, 2013).
[18] W. M. Haynes, CRC Handbook of Chemistry and Physics (CRC Press, Boca Raton, FL, 2014).
[19] D. Kingma and J. Ba, arXiv:1412.6980.
[20] N. Srivastava, G. E. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, J. Mach. Learn. Res. 15, 1929 (2014).
[21] V. A. Blatov, Crystallogr. Rev. 10, 249 (2004).
[22] A. Krizhevsky, I. Sutskever, and G. E. Hinton, in Advances in Neural Information Processing Systems (MIT Press, Cambridge, MA, 2012), pp. 1097–1105.
[23] R. Collobert and J. Weston, in Proceedings of the 25th International Conference on Machine Learning (ACM, New York, 2008), pp. 160–167.
[24] D. K. Duvenaud, D. Maclaurin, J. Iparraguirre, R. Bombarell, T. Hirzel, A. Aspuru-Guzik, and R. P. Adams, in Advances in Neural Information Processing Systems (MIT Press, Cambridge, MA, 2015), pp. 2224–2232.
[25] M. Henaff, J. Bruna, and Y. LeCun, arXiv:1506.05163.
[26] J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, in Proceedings of the 34th International Conference on Machine Learning, 2017, http://proceedings.mlr.press/v70/gilmer17a.html.
[27] M. Hellenbrandt, Crystallogr. Rev. 10, 17 (2004).
[28] S. Kirklin, J. E. Saal, B. Meredig, A. Thompson, J. W. Doak, M. Aykol, S. Rühl, and C. Wolverton, npj Comput. Mater. 1, 15010 (2015).
[29] K. He, X. Zhang, S. Ren, and J. Sun, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (IEEE, New York, 2016), pp. 770–778.
[30] M. S. Hybertsen and S. G. Louie, Phys. Rev. B 34, 5390 (1986).
[31] W. Foulkes, L. Mitas, R. Needs, and G. Rajagopal, Rev. Mod. Phys. 73, 33 (2001).
[32] A. Jain, G. Hautier, C. J. Moore, S. P. Ong, C. C. Fischer, T. Mueller, K. A. Persson, and G. Ceder, Comput. Mater. Sci. 50, 2295 (2011).
[33] M. De Jong, W. Chen, R. Notestine, K. Persson, G. Ceder, A. Jain, M. Asta, and A. Gamst, Sci. Rep. 6, 34256 (2016).
[34] I. E. Castelli, T. Olsen, S. Datta, D. D. Landis, S. Dahl, K. S. Thygesen, and K. W. Jacobsen, Energy Environ. Sci. 5, 5814 (2012).
[35] W. Sun, S. T. Dacek, S. P. Ong, G. Hautier, A. Jain, W. D. Richards, A. C. Gamst, K. A. Persson, and G. Ceder, Sci. Adv. 2, e1600225 (2016).
[36] G. Shirane, K. Suzuki, and A. Takeda, J. Phys. Soc. Jpn. 7, 12 (1952).
[37] J. Lang, C. Li, X. Wang et al., Mater. Today: Proc. 3, 424 (2016).
[38] H. Takatsu, O. Hernandez, W. Yoshimune, C. Prestipino, T. Yamamoto, C. Tassel, Y. Kobayashi, D. Batuk, Y. Shibata, A. M. Abakumov et al., Phys. Rev. B 95, 155105 (2017).
[39] CGCNN website, https://github.com/txie-93/cgcnn.
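The optimization problem of Eq. (3), minimizing J(y, f(C; W)) by iteratively updating W with stochastic gradients, can be illustrated with a toy example. The sketch below substitutes a linear model for f and mean-squared error for J; it is our simplification for illustration and is unrelated to the actual CGCNN training code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for f(C; W): a linear map from a fixed-length, pooled
# crystal feature vector to a scalar property. J is mean-squared error.
X = rng.normal(size=(256, 8))            # pretend pooled crystal vectors v_c
w_true = rng.normal(size=8)
y = X @ w_true                           # "DFT-calculated" target properties

W = np.zeros(8)
lr = 0.1
for _ in range(500):                     # iteratively update weights (SGD analogue)
    idx = rng.integers(0, 256, size=32)  # random minibatch of crystals
    grad = 2 * X[idx].T @ (X[idx] @ W - y[idx]) / 32
    W -= lr * grad

final_loss = float(np.mean((X @ W - y) ** 2))  # J(y, f(C; W)) after training, near zero
```

The learned W then plays the role of the trained network weights: it is reused at prediction time and can be inspected for insight, which is the same two-step use of Eq. (3) the paper describes.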
