
Free Wortley · 6 min read

On March 28th, Cerebras released a new Open Source model on HuggingFace called "Cerebras-GPT", trained on The Pile dataset, with GPT-3-like performance. (Link to press release)

What makes Cerebras interesting?

While Cerebras-GPT isn't as capable a model as LLaMA, ChatGPT, or GPT-4 when compared directly, it has one important quality that sets it apart: it's been released under the Apache 2.0 license, a fully permissive Open Source license, and the weights are available for anybody to download and try out.
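Because the weights are on HuggingFace under a permissive license, trying the model out is a few lines of code. A minimal sketch, assuming the `transformers` library is installed; `cerebras/Cerebras-GPT-111M` is the smallest model in the family, chosen here so the download stays manageable:

```python
# Sketch: load a Cerebras-GPT checkpoint from HuggingFace and generate text.
# Assumes `transformers` (and a backing framework like PyTorch) is installed.
from transformers import AutoTokenizer, AutoModelForCausalLM

# Smallest variant of the family; larger ones go up to 13B parameters.
model_id = "cerebras/Cerebras-GPT-111M"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Feed the model a prompt and decode the continuation.
prompt = "Generative AI is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

No sign-up, license agreement, or access request is needed, which is exactly the contrast with LLaMA described below.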

This is different from other models like LLaMA: while LLaMA's weights are freely available, its license restricts usage to "Non-Commercial" use cases like academic research or personal tinkering.

That means if you'd like to check out LLaMA, you'll have to get access to a powerful GPU to run it yourself or use a volunteer-run service like KoboldAI. You can't just go to a website like you can with ChatGPT and start feeding it prompts. (At least not without running the risk of Meta sending you a DMCA takedown request.)

Proof-of-Concept to demonstrate Cerebras Training Hardware

The real reason this model is being released is to showcase the crazy silicon that Cerebras has spent years building.

[Image: Cerebras vs. NVIDIA silicon die shot]

A comparison of "one" Cerebras chip to an NVIDIA V100 chip.