OLMo: Open Language Model (Feb 11)
A State-Of-The-Art, Truly Open LLM and Framework
OLMoE: An open, small, and state-of-the-art mixture-of-experts model (Sep 4)
We’re introducing OLMoE, jointly developed with Contextual AI, which is the first mixture-of-experts model to join the OLMo family. OLMoE…
Digital Socrates: Evaluating LLMs through Explanation Critiques (Aug 12)
Blog written by Yuling Gu
Open research is the key to unlocking safer AI (Aug 8)
The last few years of AI development have shown the power and potential of generative AI. Naturally, these leaps in machine intelligence…
Latest and greatest: Ai2’s release notes (Aug 5)
Along with our rebrand, we’re excited to debut a new release note process. Because we’re making regular updates and new asset roll-outs in…
PolygloToxicityPrompts: Multilingual Evaluation of Neural Toxic Degeneration in Large Language… (Jun 24)
The presence of low-quality data on the internet leads to undesirable, unsafe, or toxic knowledge being instilled in large language models…
Data-driven Discovery with Large Generative Models (May 16)
How do you boil the ocean? That impossible task is what researchers in every field try to accomplish when they sort through the existing…
OLMo 1.7–7B: A 24 point improvement on MMLU (Apr 17)
Today, we’ve released an updated version of our 7 billion parameter Open Language Model, OLMo 1.7–7B. This model scores 52 on MMLU, sitting…
Making a switch — Dolma moves to ODC-BY (Apr 15)
We’re moving the Dolma dataset to the ODC-BY license. Here’s why.