Go big or go small? Choosing the right AI model
Should enterprises deploy large AI models or do smaller ones work better? Here's what the experts say.
I had the privilege of moderating the panel "Is your AI too big (or too small)?" at the 7th CDDO Asia Summit earlier this month.
Here's what Carlos Queiroz, the global head of data science engineering at Standard Chartered Bank, and Sudhanshu Duggal, a longtime CIO of a global FMCG firm, had to say.
- Niche tasks: In some cases the choice is obvious, notes Carlos, such as using smaller models purpose-built for specific tasks.
- Beware of model 'sprawl': Deploying too many separate models at once can substantially increase maintenance overhead.
- Generic models work too: It can sometimes make more sense to use an off-the-shelf model than to reinvent the wheel, according to Sudhanshu.
- Not scalable: While smaller models are generally more efficient, they can sometimes take too much time and effort to fine-tune properly.
For more of their tips, including how enterprises should get started and the importance of maintaining control over one's data, read my article here.