ECON OF AI PRODUCTION
You might think the AI wars are absurd now, but Amazon is selling compute time to anyone willing to pay. This means people like me (us) and thousands of others will build custom AIs for different purposes. And as we’ve seen over the past few months, these AIs are better at training the next generation of AIs than people are. So AIs will get dramatically better as compute gets cheaper.
Why will this happen? Because, as we said in the 80s, the software isn’t the problem; it’s the hardware. A decent engineer can learn the software model for LLMs in about two weeks.
The only ‘moat’ the big companies have is the cost of the hardware, from Nvidia in particular. And Tesla is building its own, and more will come online over time. So the shortage and cost of compute will keep decreasing until the power consumption of the training algorithms becomes the cost barrier. (And it’s not cheap.)
Then, when neuromorphic chips become available, they’ll destroy both the energy cost and the competitors producing the current technology. Because, as Turing stated, many tiny, simple processors with optical or hardwired connections, operating at low power, will effortlessly defeat any set of cores in existing processors.
Source date (UTC): 2023-10-05 16:35:22 UTC
Original post: https://twitter.com/i/web/status/1709970546270547974