This podcast examines the substantial costs of AI infrastructure and the factors that drive them. Experts break down what it costs to train AI models, emphasizing the roles of parameter count, floating-point operations, and AI accelerator capabilities. They explore the economics of AI, highlighting the gap between the high cost of training and the comparatively low cost of inference, and they consider how model size and training data shape cost, as well as the potential for cost reductions in the future.
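To make the relationship between parameters, floating-point operations, and accelerator capabilities concrete, here is a back-of-envelope sketch. It uses the commonly cited approximation that training a dense transformer takes roughly 6 × parameters × training tokens FLOPs; the approximation, the function name, and all the numeric inputs below are illustrative assumptions, not figures from the episode.

```python
def training_cost_usd(params, tokens, flops_per_sec,
                      utilization, usd_per_accel_hour, n_accels):
    """Rough training-cost estimate (illustrative, not a quoted figure).

    params            -- model parameter count
    tokens            -- number of training tokens
    flops_per_sec     -- peak FLOP/s of one accelerator
    utilization       -- fraction of peak actually achieved (0-1)
    usd_per_accel_hour-- hourly rental price of one accelerator
    n_accels          -- number of accelerators used
    """
    total_flops = 6 * params * tokens          # ~6*N*D approximation
    effective_rate = flops_per_sec * utilization * n_accels
    hours = total_flops / effective_rate / 3600
    return hours * n_accels * usd_per_accel_hour


# Hypothetical example: a 70B-parameter model on 1.4T tokens,
# 1024 accelerators at ~1 PFLOP/s peak, 40% utilization, $2/hour each.
cost = training_cost_usd(
    params=70e9, tokens=1.4e12,
    flops_per_sec=1e15, utilization=0.4,
    usd_per_accel_hour=2.0, n_accels=1024,
)
print(f"estimated training cost: ${cost:,.0f}")
```

With these assumed inputs the estimate lands in the high hundreds of thousands of dollars, which illustrates why training dominates the cost side while serving a single inference request is comparatively cheap.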