Modern AI data centers consume enormous amounts of power, and they look set to become even more power-hungry in the coming years as companies like Google, Microsoft, Meta, and OpenAI push toward artificial general intelligence (AGI). Oracle has already outlined plans to use nuclear power plants for its 1-gigawatt data centers, and Microsoft appears to be following suit: it just inked a deal to restart a nuclear power plant to feed its data centers, reports Bloomberg.

  • @[email protected]
    68 hours ago

    I think when you look at how expensive other forms of green energy (like wind) are long term, nuclear looks really good. Short term, yeah, it's expensive, but we need long-term solutions.

    • @[email protected]
      13 hours ago

      I don’t think that math works out, even over the entire 70+ year life cycle of a nuclear reactor. When it costs $35 billion to build two ~1 GW reactors, then even if the plant lasts 70 years, the construction cost amortized over every year, or every megawatt-hour generated, is still really expensive, especially once interest is accounted for (see the rough sketch below).

      And it bakes in that huge cost irreversibly up front, so any future improvements will only make the existing plant less competitive. Wind and solar and geothermal and maybe even fusion will get cheaper over time, but a nuclear plant with most of its costs up front can’t. 70 years is a long time to commit to something.
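      For a rough sense of the numbers, here’s a back-of-the-envelope sketch in Python. Only the $35 billion price tag, the two ~1 GW reactors, and the 70-year lifetime come from the comment above; the 90% capacity factor and 5% interest rate are illustrative assumptions, not figures from this thread.

      ```python
      # Rough amortization sketch: spread the construction cost over the
      # plant's lifetime electricity output. Capacity factor and interest
      # rate below are assumptions for illustration.

      def capital_cost_per_mwh(capex_usd, capacity_gw, lifetime_years,
                               capacity_factor=0.90, interest_rate=0.05):
          """Approximate capital cost per MWh of lifetime output."""
          mwh_per_year = capacity_gw * 1_000 * 8_760 * capacity_factor
          if interest_rate > 0:
              # Capital recovery factor: the fixed annual payment needed to
              # pay off the capex over the lifetime at the given rate.
              crf = (interest_rate * (1 + interest_rate) ** lifetime_years
                     / ((1 + interest_rate) ** lifetime_years - 1))
              annual_capital_cost = capex_usd * crf
          else:
              annual_capital_cost = capex_usd / lifetime_years  # straight-line
          return annual_capital_cost / mwh_per_year

      # Straight-line: roughly $32/MWh. With 5% financing: roughly $115/MWh,
      # and that's before fuel, staffing, and maintenance.
      print(capital_cost_per_mwh(35e9, 2.0, 70, interest_rate=0.0))
      print(capital_cost_per_mwh(35e9, 2.0, 70, interest_rate=0.05))
      ```

      Even in this crude model, financing roughly triples the capital cost per megawatt-hour, which is why baking that cost in up front matters so much.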

      • @[email protected]
        23 hours ago

        Can you explain how wind and solar get cheaper over time? Especially wind: those blades have to be replaced fairly often, and they’re expensive.