There is a model called SOLAR. This model follows the same architecture as LLaMA2 but has more layers, which makes it an outstanding performer: it beats Mistral and even Mixtral on some benchmarks (Open LLM Leaderboard).
In this case, what could be the contribution points?
@tirthasheshpatel is working on finishing up our llama2 implementation. Once it is ready, we could probably just extend our conversion script and add this as a variant for llama2?
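To illustrate why SOLAR fits as a llama2 variant: it was built by "depth up-scaling" the LLaMA2 architecture to 48 transformer layers (versus 32 in LLaMA2-7B), keeping the rest of the block structure the same. The sketch below is hypothetical (the dict names and helper are not from any real conversion script) and just shows how a variant table could differ only in layer count:

```python
# Sketch: SOLAR reuses the LLaMA2 block architecture but is depth
# up-scaled to 48 layers, so a conversion script could treat it as a
# llama2 variant that overrides num_layers.

# Published LLaMA2-7B shape (layers, hidden size, attention heads).
LLAMA2_7B = {
    "num_layers": 32,
    "hidden_dim": 4096,
    "num_heads": 32,
}

# Hypothetical variant entry: same config, more layers.
SOLAR_10_7B = {**LLAMA2_7B, "num_layers": 48}

def describe(name, cfg):
    """Return a one-line summary of a variant config."""
    return f"{name}: {cfg['num_layers']} layers, hidden size {cfg['hidden_dim']}"

print(describe("SOLAR-10.7B", SOLAR_10_7B))
```

A conversion script keyed on such a table would only need the new preset name and layer count, since the weight names and shapes per layer match LLaMA2.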