Will Meta release a Llama 3 405B multi-modal open-source model before the end of 2024?
13 · Ṁ2431 · Jan 1 · 37% chance

Multi-modal with any input other than plain UTF-8 text, regardless of output modes. The model must also be 405B in size, not 400B.

sold Ṁ28 YES

Llama 3.2 dropped. "The two largest models of the Llama 3.2 collection, 11B and 90B,". It seems likely that if they release a larger multimodal model, it'll be in line with these and larger than the text-only model.

Description requires "Model must also be 405B in size not 400B".

@Kearm20 How would you resolve a >405B model? NO, because it's not exactly 405B?

@Lun Well, what is said is said, I suppose. I didn't anticipate a 100-layer multi-modal Llama 3 class model with 90B parameters when the question was created. The 400B provision was added because of early leaks, but in the spirit of the original question it would have to be a ~405B Llama 3 class model (weights are rarely exact whole numbers) to resolve YES before the end of 2024.

bought Ṁ150 YES

The paper explains that they are working on multi-modal models and even goes into technical detail on the exact training method they're using.

I also saw a very short video generated by a very early version of Llama 3.1 405B at the official Meta Llama 3 Hackathon in San Francisco that I hadn't seen any reporting on. I only have my personal conjecture as to why they haven't released any multi-modality yet.

sold Ṁ277 YES

Just sold all shares, both to be more objective about resolution and because, with this being an election year, I personally see a minefield for AI-at-Meta, especially given how they practically stealth-released Chameleon 30B.

@Kearm20 resolves YES

Disagree. Have you used Llama 405B? It is not multi-modal as specified in the paper, and the model weights are not multimodal. I have it set to the end of 2024, so there is still a chance for it to resolve YES, but as of now, no resolution is my analysis.

Full transparency, I do have a YES position on this, but it simply isn't multimodal... yet. Hence the wait. @Bayesian

bought Ṁ25 NO

Good point. Do you count it as open source though? It's arguably just open weights, which I'm realising is something someone on another market brought up.

I do, as the weights are on Hugging Face in .safetensors format, and the paper is out with all the technical details about how this model came into being. Inference code was also given as an example. Sure, it was "gated", but reuploads are not being taken down, and we consider MIT or Apache 2.0 software to be "open source", so this is about as open as a model can be, period, with an even less restrictive license this time around.
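For reference, a minimal sketch (mine, not anything the thread author posted) of what "weights on Hugging Face" means in practice, assuming the standard transformers flow and the meta-llama/Llama-3.1-405B-Instruct repo id; the gated download only works after logging in with an account that has accepted Meta's license. Note the API only takes text, which is the point about it not being multimodal.

```python
# Sketch of pulling the gated Llama 3.1 405B weights via transformers.
# Assumptions: repo id below, bf16 .safetensors shards, enough GPU memory
# (in reality this model needs a multi-GPU node or heavy quantization).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-405B-Instruct"  # assumed Hugging Face repo id

# Gated repo: run `huggingface-cli login` first with an account that
# has accepted Meta's license on the model page.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights ship as bf16 safetensors shards
    device_map="auto",           # shard layers across available GPUs
)

# Text in, text out — there is no image/audio input path in these weights.
prompt = "Summarize the Llama 3 license in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```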

Would you consider the smaller versions of Llama 3 to be open source?

@Daniel_MC Considering they had no issue with me jailbreaking their model at the Meta Hackathon this weekend, exceedingly so. We also got information about the model, so I think this is a really strong bet.

bought Ṁ40 YES
