are there any studies or data on the actual electricity consumption of AI? I know that training consumes a lot, but if the usage of it doesn't consume much, it doesn't really matter since it's a one-time cost.
I don't know of any studies unfortunately, but I did want to point out that training is not quite a one-time cost in practice, because training has already been done loads of times and is still being done! In theory, if we stopped training all AI and just kept the models we have, then indeed the training cost would be bounded just like you say, I'm just afraid we're quite far from that.
It would be on the order of an intensive video game, maybe. Depends on the size of the model, etc.
Training is definitely expensive, but you're right that it's a one-time cost.
Overall, the challenge is that it's very inefficient. Using a machine learning algorithm to do something that could be implemented deductively is not ideal (on the other hand, if it saves human effort...).
To a degree, trained models can also be retrained on newer data (e.g. freezing layers, LoRA, GaLore, hypernetworks, etc.). Newer data can also be injected into a prompt to make sure that the responses are aligned with newer versions of software, for example.
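As a rough illustration of the "freezing layers" option, here's a minimal PyTorch sketch: the tiny model and random batch are just stand-ins, but the pattern (freeze most parameters, fine-tune only the last layer on newer data) is the general idea.

```python
import torch
import torch.nn as nn

# Toy model standing in for a pretrained network.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# Freeze everything except the final layer, then fine-tune on newer data.
for param in model.parameters():
    param.requires_grad = False
for param in model[-1].parameters():
    param.requires_grad = True

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

# Fake "newer data" batch; in practice this would come from a real dataset.
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```

Because only a small fraction of the weights get gradients, this kind of update is far cheaper than the original training run.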
The electricity consumption is a concern, but it's probably not going to be the end of the world.
They can also be really good for quickly writing code if you line up a whole bunch of tests and all the type annotations, then copy and paste that scaffold a few times, maybe with a macro in Vim.
The LLM will fill in the middle correctly, like 90% of the time. Review the diff in git, make sure the tests pass, and that's an extra 20 minutes I get to spend with my wife and kids.
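For what it's worth, that workflow can look something like this in Python (the `moving_average` function is just an invented example): you write the typed signature, docstring, and tests yourself, let the model fill in the body, and the tests decide whether to keep it.

```python
from typing import Iterable


def moving_average(values: Iterable[float], window: int) -> list[float]:
    """Return the moving average over `values` with the given window size."""
    # The signature and tests are hand-written; a body like this is what
    # the model fills in and the tests then verify.
    vals = list(values)
    return [
        sum(vals[i : i + window]) / window
        for i in range(len(vals) - window + 1)
    ]


def test_moving_average_basic():
    assert moving_average([1.0, 2.0, 3.0, 4.0], 2) == [1.5, 2.5, 3.5]


def test_moving_average_window_equals_length():
    assert moving_average([2.0, 4.0], 2) == [3.0]
```

If the generated body fails the tests, you throw it away and retry or write it yourself, so the checks bound how wrong it can be.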
Yeah, that's always a risk, but as you said, humans make mistakes too. And if you change your approach to software development by writing more tests and using strict interfaces or type annotations, etc., it is pretty reliable and definitely saves time.