
Luke James is a freelance writer and journalist. Although his background is in legal, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.
jp7189: ...and here I thought most of OpenAI's first-mover advantages had been eroded by everyone else catching up. I really look for any excuse not to use Google products, but most times when I run the same task against multiple models, Google has come out ahead since 3.0, and now even more so with 3.1. That said, for local models, I'll take OpenAI's OSS all day, every day over Google's Gemma. Heck, for specific tasks, there's probably a Llama 3 for that. That model is so versatile and trainable. I don't think any of that is attracting investment dollars, though.
rluker5 (replying to jp7189): I've heard you can run Google's translator locally on their phones, which matters if you're out of reach of their servers, say in a subway or out in the country without Starlink. Maybe some of their camera enhancements can also run locally? If cloud-based AI starts using a subscription model, this could throw a monkey wrench into that, with Google sneaking in on some exclusivity. I'm guessing the consumer market for AI services will be mostly in mobile, and I don't see OpenAI doing well there.
Key considerations
- Investor positioning can change fast
- Volatility remains possible near catalysts
- Macro rates and liquidity can dominate flows
Reference reading
- https://www.tomshardware.com/tech-industry/artificial-intelligence/openai-raises-110-billion-in-largest-ever-private-tech-funding-round
Informational only. No financial advice. Do your own research.