Fastino Secures $17.5M to Train AI on Gaming GPUs
Fastino, a Palo Alto-based startup, is challenging the prevailing trend of massive, expensive AI models. Instead of relying on large GPU clusters, Fastino trains its smaller, task-specific AI models on low-cost gaming GPUs, keeping total hardware costs under $100,000.
This innovative approach has attracted significant investment. Fastino recently closed a $17.5 million seed funding round led by Khosla Ventures, bringing the company's total funding to nearly $25 million. This follows a $7 million pre-seed round last November led by Microsoft's M12 and Insight Partners.
Smaller, Faster, and More Affordable AI
Fastino CEO and co-founder Ash Lewis emphasizes the advantages of their approach: "Our models are faster, more accurate, and cost a fraction to train while outperforming flagship models on specific tasks."
The company offers a suite of these small models to enterprise clients, each built for a particular business need, such as redacting sensitive information or summarizing documents. Fastino has not yet published benchmark figures, but it claims strong per-task performance and rapid response times.
Fastino is not alone in betting on specialized models: competitors such as Cohere and Databricks offer similarly focused AI solutions. That broader shift toward smaller, task-specific language models for enterprise applications suggests Fastino's approach may be well-timed.
Khosla Ventures' Investment Signals Confidence
The investment from Khosla Ventures, known for its early backing of OpenAI, signals strong confidence in Fastino's potential. The startup is now focused on expanding its team of AI researchers who share its vision for efficient, practical AI development.
“Our hiring strategy is very much focused on researchers that maybe have a contrarian thought process to how language models are being built right now,” says Lewis.
While the enterprise AI market is competitive, Fastino's focus on affordability and efficiency could position it for success.