Hardware-based AI with 3-dimensional ferroelectric memories (3DFerroKI)
Project duration: 2024 - 2025
Artificial intelligence (AI) is considered a cross-industry driver of innovation and growth, but powerful models lead to a sharp increase in energy consumption and, due to cooling in data centers, water consumption. Training a single advanced AI model, such as those behind ChatGPT (OpenAI) or Gemini (Google), currently requires up to 10 GWh of energy and up to 1 million liters of water. This is equivalent to the annual energy consumption of around 1,000 private households, or to the output of a nuclear power plant over 10 hours.
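The equivalences above can be sanity-checked with a quick back-of-envelope calculation. The per-household consumption and plant output figures below are illustrative assumptions chosen to match the stated comparisons, not values taken from the project description:

```python
# Back-of-envelope check of the energy equivalences in the text.
# Assumed (illustrative) reference values, not from the source:
HOUSEHOLD_MWH_PER_YEAR = 10   # assumed annual electricity use per household
PLANT_OUTPUT_GW = 1           # assumed electrical output of a nuclear plant

TRAINING_ENERGY_GWH = 10      # energy to train one advanced AI model (source)

# 10 GWh = 10,000 MWh; divide by per-household annual use
households = TRAINING_ENERGY_GWH * 1000 / HOUSEHOLD_MWH_PER_YEAR

# Hours a 1 GW plant needs to produce 10 GWh
plant_hours = TRAINING_ENERGY_GWH / PLANT_OUTPUT_GW

print(f"Equivalent households for one year: {households:.0f}")
print(f"Equivalent nuclear plant runtime: {plant_hours:.0f} h")
```

Under these assumptions the numbers reproduce the comparison in the text: roughly 1,000 households for a year, or 10 hours of plant output.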
The increasingly widespread use of AI, that is, the inference of trained AI models, also consumes substantial resources in the data center and telecommunications sector: a single AI query is estimated to require up to 20 times the energy of a conventional Google search. This is incompatible with CO2 emission targets.
Existing flash memories have reached their physical scaling limits. In addition, they require operating voltages of 12 V to 20 V; the resulting energy consumption puts them beyond the typical requirements of edge computing and also runs counter to the vision of decarbonization. There is currently no memory technology available that can run powerful AI models both efficiently and reliably. Ferroelectric memories offer the fundamental potential to change this.
The project is therefore investigating how ferroelectric memories can be realized in 3D capacitors, which are already used today in scalable, high-density, commercially available, but volatile DRAM. Integrating ferroelectric HfO2- and ZrO2-based films into the established process flow of a high-volume memory product would be a breakthrough for numerous AI applications.