Welcome to the #1 Online Finance & Investment Banking Community for
the UK and EMEA!


Elliott says Nvidia is in a ‘bubble’ and AI is ‘overhyped’

Canary Wharfian
Administrator · Staff member · Jul
Hedge fund tells clients many supposed applications of the technology are ‘never going to actually work’


OK, so I work in tech and wanted to share this one as it's quite close to home. I largely agree with the sentiment of this article: I don't think general AI will ever materialise, no matter how hard interested parties push the idea. There is a limit to how far one can push statistical and probabilistic models, even with an infinite amount of computing power.
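To make the "statistical and probabilistic models" point concrete, here's a toy sketch of next-token prediction from raw counts. This is purely illustrative (the corpus and function names are made up for this post, and real LLMs use neural networks over billions of tokens), but the core idea is the same: predict the next word from observed statistics.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus for illustration only.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    """Predict the most probable next word, purely from counts."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))  # "cat" (it follows "the" twice in the corpus)
```

However much compute you throw at a model like this, it is still interpolating over statistics of what it has seen, which is the limit I'm pointing at.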

I think this story will play out similarly to how bitcoin did, although we haven't seen the end of it yet. Lots of people compare it to other historical bubbles, like the dotcom crash at the turn of the century or the Dutch tulip mania in the 17th century, and yet bitcoin is still here.

Another interesting parallel story: https://finance.yahoo.com/news/intel-ceo-fires-back-nvidia-040943532.html

AI really consumes a ton of energy, and I think this might put a limit on the growth of the industry, especially now that ESG is receiving a lot more attention (https://www.cnbc.com/2024/05/15/mic...-since-2020-due-to-data-center-expansion.html). Data centers (the buildings powering LLMs) are a significant and growing contributor to global electricity demand, estimated at a low single-digit percentage of worldwide electricity use.

Now, I have seen some great applications of generative AI (one of the main flavors of AI in general) in many different domains, such as patient consultation at healthtech startups, contract review at legal tech startups, and assistance with writing tenders. The reason AI is really useful is that it saves time, not that it's super smart. It can also summarise long texts really well. That's about it.

AI (or machine learning, or natural language processing, in technical terms) is also not something new. In fact, it has been around for multiple decades in some shape or form. The main thing stopping it from being commercialised was the lack of computing capacity powerful enough for deployment at scale. It was recent advancements in hardware that enabled this big data revolution, not some groundbreaking academic research. So betting on AI is really a bet on tech: on the ability to generate a lot more data and, at the same time, to process it in a way that delivers real value to businesses.
 