In an interview with TechTalks, Andrew Feldman, CEO of Cerebras Systems, discussed the importance of the ecosystem of open-source language models and the new applications they will unlock.
Cerebras recently released Cerebras-GPT, a family of fully open-source LLMs available for commercial use.
Key discussion points of the interview:
Flexibility: Open-source LLMs provide a wide range of options and tradeoffs, offering more flexibility than closed models
Data over parameters: Open-source models have shown that, given a fixed compute budget, training a small model on a large dataset yields better results than training a large model on a small dataset
Scaling laws: Cerebras-GPT uses a scaling law that can predict how your compute and data budget will affect your model’s performance
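The "data over parameters" tradeoff can be made concrete with a scaling-law calculation. The sketch below uses the published Chinchilla fit (Hoffmann et al., 2022) rather than Cerebras' own coefficients, and the standard C ≈ 6·N·D approximation for training FLOPs; the specific parameter counts and budget are illustrative assumptions, not figures from the interview.

```python
def predicted_loss(n_params: float, n_tokens: float) -> float:
    """Chinchilla-style scaling law: loss from model size N and tokens D.

    Coefficients are the published Chinchilla fit, used here purely
    for illustration.
    """
    E, A, B = 1.69, 406.4, 410.7      # irreducible loss + fit constants
    alpha, beta = 0.34, 0.28          # parameter / data exponents
    return E + A / n_params**alpha + B / n_tokens**beta

# Fix a compute budget C (FLOPs); training cost is roughly C = 6 * N * D,
# so choosing N determines how many tokens D you can afford.
C = 6e21  # hypothetical budget, chosen only for this comparison

# Big model, small dataset vs. small model, big dataset:
big_model = predicted_loss(13e9, C / (6 * 13e9))     # 13B params, ~77B tokens
small_model = predicted_loss(2.7e9, C / (6 * 2.7e9)) # 2.7B params, ~370B tokens

print(f"13B on fewer tokens:  loss ≈ {big_model:.3f}")
print(f"2.7B on more tokens:  loss ≈ {small_model:.3f}")
```

Under this fit, the smaller model trained on more tokens comes out ahead at the same budget, which is exactly the tradeoff the scaling law lets you predict before training.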
Read the full interview on TechTalks.