Semiconductor designer Qualcomm (QCOM) showcased its latest generation of chips and AI models for mobile devices at Mobile World Congress 2024 in Barcelona, Spain. Qualcomm CFO and COO Akash Palkhiwala joined Yahoo Finance Live to discuss what he thinks the future looks like for large language models and how it affects Qualcomm's business. Segment is from February 26, 2024.
#yahoofinance #finance #news #youtubeshorts #youtube #shorts #shortsvideo #investingforbeginners #investing #stocks #ai #artificialintelligence #aistocks #qualcomm #tech #technology #technologynews
I don’t think large language models will be “deployed” on the edge in smartphones and PCs. I think they will continue to run mainly in data centers and will be accessed over the net on smartphones and PCs. The reason is that LLMs are way too compute-intensive to run on phones; even the inference computations alone are massive, so I think both training and inference will be done primarily at large data centers.
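To put rough numbers on the "too compute-intensive" claim, here is a back-of-envelope sketch of how much memory just the weights of an LLM occupy at common precisions. The parameter counts (7B, 70B) are generic illustrative sizes, not any specific product's specs:

```python
def weights_gib(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GiB: parameters x bytes per parameter."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# fp16 = 2 bytes/param, int8 = 1, int4 = 0.5 (common quantization levels)
for name, params in [("7B", 7.0), ("70B", 70.0)]:
    for label, b in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
        print(f"{name} @ {label}: {weights_gib(params, b):.1f} GiB")
```

A 7B model at fp16 needs roughly 13 GiB for weights alone, which already exceeds the total RAM of most phones; a 70B model at fp16 needs on the order of 130 GiB. This is why edge deployment tends to focus on small, aggressively quantized models while the largest models stay in data centers.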