An Introduction to Density Functional Theory without Equations
Chemists regularly use Density Functional Theory (DFT) calculations to understand reaction mechanisms, predict molecular properties, interpret complicated spectra, and design new materials, often revealing insights that would be difficult or impossible to obtain through experiments alone. Section 7 looks at these applications in more detail.
This article explains the background, derivation, and application of DFT without using equations. These “big ideas” can provide a solid foundation for any chemist or serve as the starting point for your journey to becoming a quantum chemist. Are you ready?
1 Understanding Atomic and Molecular Structure
Before we dive into Density Functional Theory (DFT), let’s understand what we’re trying to describe. Virtually every piece of matter we deal with on earth is made up of atoms, and atoms consist of positively charged nuclei surrounded by negatively charged electrons. These electrons are responsible for chemical bonding and most chemical properties we observe.
When atoms come together to form molecules, their electrons interact in complex ways. Understanding these interactions is crucial for predicting chemical behavior, but it presents an enormous challenge. The electrons move around the nuclei, repel each other, and follow the strange rules of quantum mechanics rather than classical physics.
1.1 The Challenge of Describing Electrons
In the everyday world, we can easily describe the motion of objects using Newton’s laws. We can calculate exactly where a ball will land when thrown, or how planets orbit the sun. However, electrons behave differently and cannot be described in this classical picture. We know this because various scientists tried it, and it led to a mismatch between experiment and theory. This failure led to the development of Quantum Mechanics as a framework to describe very small particles such as electrons. Curiously, it is a framework that we are not used to from our everyday experience at all. That’s why it often feels counterintuitive: humans never had a chance to develop an intuition about electrons, but a lifetime of experience with falling objects, flying balls, and stopping cars helped us understand Newton’s laws.
It turns out that the correct quantum mechanical framework to describe electrons is that of a mathematical wave function \( \Psi \) (the Greek letter psi), whose (absolute) squared value \( \left | \Psi \right|^2 \), multiplied by a tiny volume element \( dV \), gives the probability of finding the electron in this tiny volume. This means we can never know exactly where an electron is. Instead, we can only describe the probability of finding an electron in a particular region of space.
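For readers who like to see numbers, here is a minimal 1D sketch using the textbook particle-in-a-box ground state as a stand-in for a real molecular wave function; the box length, grid, and chosen region are arbitrary illustrative choices. It checks that \( \left | \Psi \right|^2 \) adds up to 1 over all space and shows how \( \left | \Psi \right|^2 \) times a small interval gives a probability.

```python
import numpy as np

# A 1D toy example (not a real molecular wave function): the ground state of a
# particle in a box of length L = 1, psi(x) = sqrt(2/L) * sin(pi * x / L).
L = 1.0
x = np.linspace(0.0, L, 2001)
dx = x[1] - x[0]
psi = np.sqrt(2.0 / L) * np.sin(np.pi * x / L)

# |psi|^2 summed over all space must give 1: the electron is somewhere.
total_probability = np.sum(psi**2) * dx

# |psi|^2 times a tiny interval gives the probability of finding the electron
# there; here we add up those contributions for the region 0.45 <= x <= 0.55.
mask = (x >= 0.45) & (x <= 0.55)
probability_in_region = np.sum(psi[mask]**2) * dx

print(f"Total probability:         {total_probability:.4f}")   # close to 1
print(f"Probability in the region: {probability_in_region:.4f}")
```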
1.2 The Many-Body Problem
When dealing with just one electron, calculating its wave function is relatively straightforward (=take a quarter of CHEM 151 at SCU). However, real chemical systems contain many electrons, and this is where things get complicated. Each electron interacts with all other electrons, creating what is called the “many-body problem.” This problem already shows up in classical mechanics when you think how planets move with their gravity pulling on each other: if you want to predict where one planet will go, you need to know where all the other planets are, but those planets are also moving and affecting each other. Add the extra complexity of the quantum world and the problem becomes even more daunting than it already is in classical physics.
1.3 The Hamiltonian: A Mathematical Description of Energy
In quantum mechanics, we use a mathematical operator called the Hamiltonian to describe the total energy of a system. While this might sound abstract, we can break it down into understandable parts. The Hamiltonian includes:
- Kinetic Energy: This represents the energy of motion of the electrons and nuclei.
- Potential Energy: This includes several types of interactions:
  - The attraction between electrons and nuclei (negative and positive charges attract)
  - The repulsion between electrons (negative charges repel)
  - The repulsion between nuclei (positive charges repel)
When written mathematically, the Hamiltonian looks complicated because it needs to account for all these interactions. Each term in the equation represents one of these energy contributions:
H = Kinetic Energy of Electrons + Kinetic Energy of Nuclei + Electron-Nuclear Attraction + Electron-Electron Repulsion + Nuclear-Nuclear Repulsion
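As a concrete taste of what one of these terms looks like in practice, here is a minimal Python sketch of the simplest one, the nuclear-nuclear repulsion, computed in atomic units as the sum of Z_i·Z_j/r_ij over all pairs of nuclei. The coordinates are rough placeholders for a water-like arrangement, not an optimized geometry.

```python
import numpy as np

# Classical nuclear-nuclear repulsion: sum over pairs of Z_i * Z_j / r_ij
# (atomic units). Coordinates below are illustrative, not an optimized geometry.
charges = np.array([8.0, 1.0, 1.0])              # O, H, H nuclear charges
coords = np.array([[0.00, 0.00, 0.0],            # positions in bohr
                   [ 1.43, 1.11, 0.0],
                   [-1.43, 1.11, 0.0]])

v_nn = 0.0
for i in range(len(charges)):
    for j in range(i + 1, len(charges)):
        r_ij = np.linalg.norm(coords[i] - coords[j])
        v_nn += charges[i] * charges[j] / r_ij

print(f"Nuclear-nuclear repulsion: {v_nn:.4f} hartree")
```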
1.4 The Traditional Quantum Mechanical Approach
The traditional approach to solving this problem involves finding the wave function that describes all electrons in the system. This wave function must satisfy the Schrödinger equation, which is the fundamental equation of quantum mechanics connecting the Hamiltonian with the wave function and the total energy of the molecule.
However, this approach faces a severe limitation: it becomes impossibly complicated for systems with more than a few electrons. Even with modern supercomputers, exactly solving the Schrödinger equation for a molecule as simple as ethanol is beyond our capabilities.
2 Enter Electron Density
This is where Density Functional Theory offers an elegant alternative. Instead of trying to solve the complicated equation for the wave function that describes all electrons individually, DFT focuses on a simpler quantity: the electron density \( \rho \) (the Greek letter rho).
2.1 What is the Electron Density?
The electron density tells us how many electrons we expect to find at each point in space (the values can be non-integer and smaller than 1 because they are averages). It’s like a 3D map showing where electrons are likely to be found. For example:
- Near atomic nuclei, the density is very high (because positive and negative charges attract each other)
- Between bonded atoms, we see moderate density
- Far from atoms, the density approaches zero
[Figure: electron density of a diatomic molecule. The two atoms lie in the x/y plane, and the electron density is plotted on the z axis.]
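In case the figure is not shown here, the following minimal sketch produces a similar picture. It uses a crude model density, a simple exponential decay around each of the two nuclei, which is purely illustrative and not the result of any quantum mechanical calculation.

```python
import numpy as np
import matplotlib.pyplot as plt

# Crude model density: each atom contributes a simple exponential decay
# exp(-2*r) around its nucleus; add the two contributions for a diatomic
# lying in the x/y plane.
def atomic_density(x, y, center):
    r = np.sqrt((x - center[0])**2 + (y - center[1])**2)
    return np.exp(-2.0 * r)

x, y = np.meshgrid(np.linspace(-4, 4, 200), np.linspace(-3, 3, 200))
rho = atomic_density(x, y, (-1.0, 0.0)) + atomic_density(x, y, (1.0, 0.0))

# Plot the density on the z axis above the molecular (x/y) plane.
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x, y, rho, cmap="viridis")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_zlabel("electron density")
plt.show()
```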

To understand electron density, consider these everyday analogies:
- Population Density: Just as we can describe how many people live per square kilometer in different areas of a city, the electron density tells us how many electrons we expect to find in different regions of a molecule.
- Weather Maps: Like how weather maps show the distribution of temperature or pressure across a region, electron density maps show how electrons are distributed in space.
- Topographic Maps: Similar to how elevation maps show the height of terrain, electron density maps show regions of high and low electron concentration.
2.2 Advantages of the Electron Density
Conceptually and computationally, there are many advantages to working with just the density. Compared to the wave function:
- The density is a function of just three spatial coordinates (x, y, z) and not the positions of all individual electrons (see the quick comparison after this list)
- The density is a physically observable quantity
- The density has a clear physical interpretation (see analogies above)
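Here is that quick comparison. Assuming, purely for illustration, that we tabulate functions on a grid with 50 points per coordinate, the sketch below counts how many numbers are needed to store the density versus a many-electron wave function.

```python
# Back-of-the-envelope comparison (illustrative numbers only):
# tabulate a function on a grid with 50 points per coordinate.
grid_points = 50

# The density depends on 3 coordinates (x, y, z), regardless of electron count.
density_values = grid_points**3

# A many-electron wave function depends on 3 coordinates per electron.
for n_electrons in (1, 2, 5, 10):
    wavefunction_values = grid_points**(3 * n_electrons)
    print(f"{n_electrons:2d} electrons: density needs {density_values:.1e} values, "
          f"wave function needs {wavefunction_values:.1e} values")
```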
2.3 Measuring the Electron Density
In practice, one can actually measure the electron density through various experimental techniques. X-ray diffraction, for example, works because X-rays interact with electrons. When we shoot X-rays at a crystal, the resulting diffraction pattern gives us a map of the electron density. This is how crystallographers determine molecular structures. Similarly, when we measure properties like molecular polarizability (how easily electrons can be moved around by an electric field) or NMR chemical shifts (how nuclei experience their electronic environment), we are really measuring effects that depend on the distribution of electrons, i.e. the electron density.
Being experimentally measurable is a nice “bonus feature” of the electron density, but as we shall see below, there is no need to measure the density to run calculations based on it.
3 The Foundation of DFT: The Hohenberg-Kohn Theorems
Now that we have discussed what the electron density is, let’s explore how it relates to the energies of molecular systems. In 1964, Hohenberg and Kohn made a revolutionary discovery: they proved mathematically that the electron density contains all the information needed to determine any property of a molecule or material. This might seem surprising: how can simply knowing how many electrons are at each point tell us everything about a chemical system? To understand this, let’s think about what the electron density tells us:
- The locations of atomic nuclei appear as sharp peaks in the electron density.
- The height of these peaks tells us the nuclear charge, identifying which elements are present.
- The total number of electrons comes from integrating (adding up) the density over all space.
This is all the information we need to fully characterize a molecular system. But Hohenberg and Kohn proved something even more profound: they showed that there’s a unique relationship between the electron density and the energy of the system. This means if we know the exact electron density, we can - in principle - calculate the exact energy, and from the energy, we can determine any other property we might be interested in.
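The last point in the list above, that integrating the density gives the electron count, is easy to verify numerically. Below is a minimal sketch using a normalized exponential model density; the decay constant and the electron count are arbitrary illustrative choices.

```python
import numpy as np

# Model atomic density (illustrative only): a normalized "1s-like" exponential,
# rho(r) = N * (zeta^3 / pi) * exp(-2 * zeta * r), which integrates to N.
n_electrons = 10          # pretend this is a neon-like atom
zeta = 1.5                # arbitrary decay constant for the model

r = np.linspace(1e-6, 20.0, 20000)
dr = r[1] - r[0]
rho = n_electrons * (zeta**3 / np.pi) * np.exp(-2.0 * zeta * r)

# Integrate the density over all space using spherical shells of volume
# 4*pi*r^2 dr; this should give back the total number of electrons.
integral = np.sum(rho * 4.0 * np.pi * r**2) * dr
print(f"Integrated density: {integral:.3f} electrons")   # close to 10
```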
To understand how DFT works in practice, we need to break down the total energy into its components, similar to how we did for the Hamiltonian operator when discussing the wave function above. The total energy comes from several sources. The kinetic energy of the electrons represents their motion: just as a moving car has kinetic energy, electrons in motion have kinetic energy, though they follow quantum mechanical rules rather than classical physics. The potential energy comes from various attractions and repulsions:
- Electrons are attracted to the nuclei (opposites attract)
- Electrons repel other electrons (like charges repel)
- Nuclei repel other nuclei (like charges repel)
The challenging part is that all these components are interconnected. The way electrons move depends on how they repel each other and how they’re attracted to the nuclei. This interdependence is what makes the problem so difficult to solve, and why it has not been solved exactly to date.
Math Background: The mathematical language of DFT is that of functionals. This must not be confused with the word function. Both are mathematical concepts that map inputs to outputs, assigning a single value to each input. An example of a function is \( f(x)=x^2 \), which takes a number as input (e.g. 3) and produces another number as output (the number 9 for the given input). A functional, on the other hand, is a higher-level concept that takes an entire function as its input (for DFT, this input function is the electron density) and produces a single value as output (for DFT, this number is the energy). To illustrate this, imagine a curve representing some property, such as temperature over time. A functional might take that curve and calculate a single value from it, e.g. the average temperature over a period. While functions work with individual values, functionals operate on entire functions, capturing broader patterns or properties in a single number. One can loosely say: “A functional is a function of a function”.
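A tiny Python sketch makes the distinction concrete; the temperature curve and the 24-hour averaging window are arbitrary illustrative choices.

```python
import numpy as np

# A function maps a number to a number.
def f(x):
    return x**2          # f(3) -> 9

# A functional maps an entire function to a single number. Here: the average
# value of a function over the interval [0, 24] ("average temperature over a
# day"), computed by simple numerical integration.
def average_over_day(func):
    t = np.linspace(0.0, 24.0, 1001)
    dt = t[1] - t[0]
    return np.sum(func(t)) * dt / 24.0

def temperature(t):
    # An illustrative temperature curve (degrees Celsius) over 24 hours.
    return 15.0 + 8.0 * np.sin(np.pi * (t - 6.0) / 12.0)

print(f(3))                                       # the function: 9
print(round(average_over_day(temperature), 2))    # the functional: ~15.0
```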
4 The Kohn-Sham Approach: A Clever Solution
One year later, in 1965, Kohn and Sham developed an ingenious method to handle this complexity. Instead of trying to deal with the real system of interacting electrons directly, they invented a fictitious system that’s much easier to solve. The idea is to first “cheat” a little bit and later correct for the deviation from the actual system.
It works like this: imagine replacing all the electrons, which interact with each other in complicated ways, with a set of non-interacting electrons moving in a special potential (an effective electric field). This potential is designed so that the non-interacting electrons produce exactly the same electron density as the real system. This is helpful because the energies of non-interacting electrons are much easier to calculate.
The Kohn-Sham approach breaks the energy calculation into manageable pieces:
- The kinetic energy of the non-interacting electrons, which we can calculate exactly
- The classical electrostatic interactions of the resulting electron density (with the nuclei and with itself), which can also be calculated exactly
Even though this is based on the fictitious Kohn-Sham system, it works surprisingly well, giving us roughly 99% of the total energy of the real system. However, higher accuracy is needed for chemically useful predictions, so we need a third part:
- A correction term called the exchange-correlation functional that accounts for everything else that was neglected and simplified above.
The exchange-correlation functional includes all the complex quantum mechanical effects that we can’t calculate exactly. It is the price we pay for “cheating” in the beginning. This is where approximations come into play, and different types of DFT calculations mainly differ in how they approximate this term.
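To make the bookkeeping concrete, here is a schematic sketch of how these pieces add up to a total energy. The numbers are hypothetical placeholders with roughly water-sized magnitudes (in hartree), chosen only to illustrate that the exchange-correlation term is a comparatively small, yet chemically decisive, piece of the total.

```python
# Schematic Kohn-Sham energy bookkeeping (placeholder numbers, not a calculation).
def total_energy(e_kinetic_noninteracting, e_electron_nuclear,
                 e_classical_ee_repulsion, e_nuclear_nuclear, e_xc):
    # The first four pieces can be computed exactly for the fictitious
    # non-interacting system; e_xc is the approximate correction term.
    return (e_kinetic_noninteracting + e_electron_nuclear
            + e_classical_ee_repulsion + e_nuclear_nuclear + e_xc)

# Hypothetical magnitudes for a small molecule, for illustration only:
print(total_energy(75.0, -195.0, 45.0, 9.0, -9.3))
```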
5 The Exchange-Correlation Challenge
The exchange-correlation energy accounts for two main effects:
Exchange energy arises from the quantum mechanical requirement that electrons with the same spin must avoid each other. This is known as the Pauli exclusion principle, and it helps explain why electrons arrange themselves in pairs in the orbitals around atomic nuclei. Spin is an intrinsic quantum property of particles that is only called “spin” because it has similar mathematical properties to an angular momentum (nothing is really “spinning”). There is up-spin (also called \( \alpha \) spin) and down-spin (also called \( \beta \) spin). It’s best not to try to imagine what spin “looks like”. Instead, it’s better to accept it as a basic property of matter, just as the “plus or minus charge” of a particle is widely accepted as one of its basic properties even though it offers no clear picture of what it means for a particle to be “negatively charged”. We can only observe the effects of this phenomenon.
Correlation energy comes from the way electrons coordinate their motions to avoid each other. Think of people moving through a crowded room - they adjust their paths to avoid collisions. Electrons do something similar, but following quantum mechanical rules.
Next, let’s explore how we approximate these effects in actual calculations. Remember, if we knew the exact exchange-correlation energy, DFT would give exact results. Since we don’t, we need to develop practical approximations.
5.1 The Local Density Approximation: The Simplest Model
The first and simplest approach to approximating exchange-correlation effects is called the Local Density Approximation (LDA). The idea behind LDA is straightforward: we pretend that the electrons at each point in space behave like a uniform gas of electrons. To understand this, imagine dividing space into tiny cubes. In each cube, we look at how many electrons are present (the density) and calculate the exchange-correlation energy as if that density filled all of space uniformly. We then add up these contributions from all the cubes to get the total exchange-correlation energy.
For the exchange part, we have an exact expression for a uniform electron gas: The exchange energy per electron is proportional to the cube root of the density. This makes intuitive sense: as electrons get closer together (higher density), they interact more strongly. The correlation part is more complicated. Physicists have performed extremely accurate quantum calculations for the uniform electron gas at different densities. We use these results to create a fitted functional formula that tells us the correlation energy for any density value.
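For the curious, here is a minimal sketch of the exchange part of LDA: the standard uniform-electron-gas exchange expression (energy per electron proportional to the cube root of the density), evaluated for a simple exponential model density on a grid. The grid, decay constant, and electron count are arbitrary illustrative choices.

```python
import numpy as np

# LDA exchange: the exchange energy per electron in a uniform electron gas is
# epsilon_x(rho) = -(3/4) * (3/pi)^(1/3) * rho^(1/3)   (atomic units).
# Summing rho * epsilon_x over small volume elements gives the exchange energy.
C_X = -(3.0 / 4.0) * (3.0 / np.pi)**(1.0 / 3.0)

def lda_exchange_energy(rho, volume_element):
    """rho: density values on a grid; volume_element: size of each grid cell."""
    return np.sum(rho * C_X * np.cbrt(rho)) * volume_element

# Illustrative only: a normalized exponential model density around one atom.
pts = np.linspace(-8.0, 8.0, 81)
x, y, z = np.meshgrid(pts, pts, pts, indexing="ij")
r = np.sqrt(x**2 + y**2 + z**2)
zeta, n_elec = 1.0, 2.0
rho = n_elec * (zeta**3 / np.pi) * np.exp(-2.0 * zeta * r)

dv = (pts[1] - pts[0])**3
print(f"LDA exchange energy: {lda_exchange_energy(rho, dv):.3f} hartree")
```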
The LDA works surprisingly well for many systems, especially metals where the electron density varies slowly. It provides fairly good predictions for:
- Bond lengths in molecules
- Crystal structures of solids
- Vibrational frequencies
However, LDA has important limitations. It tends to:
- Overestimate bond strengths
- Make atoms too small
- Fail for systems with rapidly varying density
These limitations arise because real electron densities are not uniform: they have peaks, valleys, and rapid variations that LDA cannot fully capture, since it only uses information from the density at each individual point in space.
5.2 The Gradient Expansion: Including Density Changes
To improve upon LDA, quantum chemists realized they needed to account for how rapidly the density changes from point to point. This led to the development of the Generalized Gradient Approximation (GGA). GGA considers not just the density at each point, but also how steeply the density is changing, i.e. its gradient (a 3D analogue of the first derivative). This is like considering not just the height of terrain on a topographic map, but also how steep the slopes are. The mathematical expression for GGA builds upon LDA by including these gradient terms (a rough sketch of this construction follows the list below).
The exchange-correlation energy now depends on both the density and its gradient. This allows GGA to better describe:
- Chemical bonds
- Molecular geometries
- Energy differences in chemical reactions
- Surface properties of materials
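Here is the rough sketch promised above: the general structure of a GGA exchange enhancement factor F(s) that multiplies the LDA exchange energy density, where s is the dimensionless reduced gradient of the density. The functional form and constants follow the commonly quoted PBE exchange expression and are shown for illustration only; a complete GGA would also compute s from the density and its gradient and add a correlation part.

```python
# GGA exchange enhancement factor of the PBE form (constants as commonly quoted,
# shown for illustration): F(s) multiplies the LDA exchange energy density.
KAPPA = 0.804
MU = 0.2195

def enhancement_factor(s):
    return 1.0 + KAPPA - KAPPA / (1.0 + MU * s**2 / KAPPA)

# s = 0 corresponds to a perfectly uniform density (pure LDA, F = 1);
# larger s means the density changes more rapidly at that point.
for s in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(f"s = {s:3.1f}  ->  F(s) = {enhancement_factor(s):.3f}")
```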
5.3 Modern Functional Development
Quantum chemists and physicists continue to develop more sophisticated approximations. These include:
- Meta-GGA functionals: These consider additional information about the density, including how it curves (its second derivative) and the kinetic energy density. This helps capture more subtle electronic effects.
- Hybrid functionals: These mix some exact exchange calculated from quantum mechanics with GGA exchange and correlation. The most famous hybrid functional, B3LYP, was so popular during the early 2000s that it was considered the “workhorse of computational chemistry” because it provides good accuracy for many molecular properties. Today it is mostly considered outdated: according to recent literature, there are usually better functionals available, and blindly using B3LYP exposes you to the risk of being harshly criticized by experts in the field.
- Range-separated functionals: These treat electron interactions differently at short and long inter-electronic distances, helping to describe phenomena like charge transfer more accurately.
- Local hybrid functionals: Local Hybrids (LHs) are a class of hybrid functionals that replace the constant admixture of exact exchange in conventional hybrid functionals with a position-dependent local mixing function. This aims to fine-tune intricate electronic effects such as self-interaction correction and so-called “nondynamical” correlation. They are a central research topic in the lab of Dr. Grotjahn.
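The difference between a conventional (global) hybrid and a local hybrid can be sketched in a few lines. The exchange energy densities, the tiny "grid", and the mixing values below are invented placeholders; a real implementation would evaluate these quantities on a proper quadrature grid with weights.

```python
import numpy as np

# Global hybrid: mix a fixed fraction "a" of exact exchange everywhere
# (around 20% is a typical global admixture).
def global_hybrid_exchange(e_x_exact, e_x_gga, a=0.20):
    # e_x_exact, e_x_gga: exchange energy densities on a grid (arrays)
    return np.sum(a * e_x_exact + (1.0 - a) * e_x_gga)

# Local hybrid: replace "a" with a position-dependent mixing function a(r).
def local_hybrid_exchange(e_x_exact, e_x_gga, mixing_function):
    # mixing_function: array a(r) with values between 0 and 1, one per grid point
    return np.sum(mixing_function * e_x_exact
                  + (1.0 - mixing_function) * e_x_gga)

# Illustrative placeholder data on a tiny "grid" of 5 points:
e_exact = np.array([-0.90, -0.70, -0.50, -0.30, -0.10])
e_gga   = np.array([-0.80, -0.65, -0.45, -0.28, -0.09])
a_of_r  = np.array([0.6, 0.4, 0.3, 0.2, 0.1])   # hypothetical mixing function

print(global_hybrid_exchange(e_exact, e_gga))
print(local_hybrid_exchange(e_exact, e_gga, a_of_r))
```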
6 Practical Implementation: How DFT Calculations Actually Work
6.1 The SCF Method
When we perform a DFT calculation, we follow an iterative procedure known as the self-consistent field (SCF) method. Here’s how it works:
1. Start with an initial guess for the electron density. This might come from overlapping atomic densities or a simpler calculation. In a Kohn-Sham calculation, the density is expressed in terms of a set of orbitals: mathematical functions that describe where individual electrons are likely to be found.
2. Use this density to construct the effective potential that the electrons interact with. This includes:
   - The electron-nucleus attraction
   - The classical electron-electron repulsion
   - The exchange-correlation potential from our chosen functional
3. Solve the Kohn-Sham equations to find a new set of orbitals.
4. Calculate a new electron density from these orbitals by adding up the probability of finding electrons from each occupied orbital.
5. Compare the new density with the starting density. If they are different, use the new density to create an improved guess and return to step 2.
6. Repeat until the density or the associated energy stops changing (converges).
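The loop structure itself fits in a few lines of Python. The two-by-two "effective Hamiltonian" below is a made-up toy model rather than a real Kohn-Sham calculation, but the flow (guess, build potential, solve, rebuild density, check convergence) mirrors the steps above, including the common trick of mixing old and new densities to stabilize convergence.

```python
import numpy as np

# Toy self-consistent-field loop (illustrative only, not a real DFT code):
# two basis functions, two electrons in one doubly occupied orbital. The
# "effective Hamiltonian" is a fixed core part plus a made-up density-dependent
# term standing in for the Hartree and exchange-correlation potentials.
h_core = np.array([[-1.0, -0.2],
                   [-0.2, -0.5]])

def effective_hamiltonian(density_matrix):
    # Hypothetical density-dependent potential (placeholder for V_H + V_xc).
    return h_core + 0.1 * density_matrix

density = np.zeros((2, 2))          # step 1: initial guess (here: zero density)
mixing = 0.5                        # damp the update to help convergence

for iteration in range(1, 51):
    h_eff = effective_hamiltonian(density)          # step 2: build the potential
    energies, orbitals = np.linalg.eigh(h_eff)      # step 3: solve for orbitals
    c = orbitals[:, 0]                              # lowest (occupied) orbital
    new_density = 2.0 * np.outer(c, c)              # step 4: new density (2 e-)

    change = np.max(np.abs(new_density - density))  # step 5: compare densities
    density = mixing * new_density + (1 - mixing) * density
    print(f"iteration {iteration:2d}: orbital energy = {energies[0]:.6f}, "
          f"density change = {change:.2e}")

    if change < 1e-6:                               # step 6: converged?
        print("SCF converged.")
        break
```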
6.2 Basis Sets: The Mathematical Building Blocks of Complicated Functions
To solve the Kohn-Sham equations on a computer, we need to express the orbitals in terms of simpler mathematical functions called basis functions. The most common type of basis functions are Gaussian functions: their radial part looks like a bell curve, and their 3D shapes look a lot like the hydrogen atom orbitals you might have seen in a General Chemistry lecture (spherical s-orbitals, dumbbell-shaped p-orbitals, four-leaf-clover-shaped d-orbitals, etc.). In general, the more basis functions are used to describe the orbitals, and thus the density, the more accurately the intricate shapes of the true orbitals and their interactions can be captured.
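To see why several Gaussians are typically combined into a single basis function, the sketch below fits sums of Gaussians to a hydrogen-like exponential shape; the exponents are hand-picked illustrative values, not an actual published basis set.

```python
import numpy as np

# A single Gaussian exp(-a*r^2) is a poor match for the sharp cusp and slow
# decay of a hydrogen-like 1s shape exp(-r), but a short sum of Gaussians does
# much better. Exponents below are hand-picked for illustration only.
r = np.linspace(0.0, 6.0, 601)
target = np.exp(-r)                      # Slater-type ("true") radial shape

def best_fit_error(exponents):
    # Linear least-squares fit of coefficients for fixed Gaussian exponents.
    basis = np.array([np.exp(-a * r**2) for a in exponents]).T
    coeffs, *_ = np.linalg.lstsq(basis, target, rcond=None)
    fit = basis @ coeffs
    return np.max(np.abs(fit - target))  # worst-case error over the grid

for exponents in ([0.3], [0.15, 1.0], [0.11, 0.41, 2.2]):
    err = best_fit_error(exponents)
    print(f"{len(exponents)} Gaussian(s): max error = {err:.3f}")
```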
7 Applications
7.1 Scope
The true power of DFT emerges when we apply it to real chemical problems. Chemists regularly use DFT calculations to understand reaction mechanisms, predict molecular properties, and design new materials. These calculations can reveal insights that would be difficult or impossible to obtain through experiments alone.
When studying chemical reactions, DFT helps identify transition states and intermediate structures along reaction pathways. The calculations can predict activation energies and reaction energetics, helping chemists understand why some reactions occur readily while others are slow or forbidden. This understanding guides the development of new catalysts and the optimization of reaction conditions.
In materials science, DFT predicts properties such as band gaps, magnetic behavior, and surface chemistry. These predictions help guide the development of new materials for applications ranging from solar cells to battery electrodes. The calculations can screen thousands of potential materials before synthesis, saving considerable time and resources in the laboratory. Another important use case is to interpret features of complicated spectra such as UV-vis for electronic states, IR for molecular vibration, or NMR for nuclear environments and molecular structure.
7.2 Selecting Computational Methods
The choice of exchange-correlation functional and basis set profoundly impacts both the accuracy of results and the computational cost. This selection requires careful consideration of the chemical system and the properties of interest. For molecular systems, hybrid functionals often provide the best balance of accuracy and computational cost. They perform particularly well for properties such as bond energies, molecular geometries, and reaction barriers. However, these functionals become computationally demanding for large systems, especially those with periodic boundary conditions (e.g. crystals with an infinite lattice).
The basis set selection depends on the desired accuracy and the nature of the chemical system. Larger basis sets generally provide more accurate results but at increased computational cost. The key is to choose a basis set that adequately describes the chemical properties of interest without wasting computational resources on unnecessary precision.
7.3 Understanding Computational Results
Interpreting DFT results requires careful consideration of the method’s approximations and limitations. The calculated properties always contain some level of uncertainty due to the approximate nature of the exchange-correlation functional and finite basis set.
Energy differences typically provide more reliable information than absolute energies. When comparing different molecular structures or reaction pathways, systematic errors often cancel out, leading to more accurate relative energies. This makes DFT particularly valuable for predicting reaction energetics and conformational preferences.
Geometric parameters like bond lengths and angles generally show good agreement with experimental values. However, weak interactions such as van der Waals forces may require special treatment through additional corrections to the functional or the use of specialized functionals designed for these interactions.
8 Current Challenges and Future Directions
Despite its successes, DFT continues to face important challenges. Strong correlation effects, found in transition metal compounds and bond-breaking processes, remain difficult to describe accurately with current functionals. Charge transfer excitations and long-range interactions also present ongoing challenges. Research continues to develop improved functionals and computational approaches. Machine learning techniques are beginning to play a role but are still very limited in scope due to the steeply increasing training data requirements for extrapolative ML models. The future of DFT could also involve integration with other theoretical methods: Hybrid approaches combining DFT with wave function methods show promise for challenging systems.
DFT has fundamentally changed how chemists approach molecular and materials research. It provides a theoretical framework that allows chemists to predict reaction energies and chemical properties with an excellent cost-performance ratio. This capability enables rational design of new materials and chemical processes based on fundamental understanding rather than trial and error. The method’s impact extends beyond traditional chemistry into fields such as biology, materials science, and catalysis. As computational power increases and methods improve, DFT will continue to evolve, providing deeper insights into chemical processes and enabling new discoveries in molecular science.