AMD-OLMo Collection AMD-OLMo is a series of 1-billion-parameter language models trained by AMD on AMD Instinct™ MI250 GPUs, based on OLMo. • 4 items • Updated Dec 5, 2025
Universal Conditional Logic: A Formal Language for Prompt Engineering Paper • 2601.00880 • Published Dec 31, 2025
OASIS: Order-Augmented Strategy for Improved Code Search Paper • 2503.08161 • Published Mar 11, 2025
HiPO: Hybrid Policy Optimization for Dynamic Reasoning in LLMs Paper • 2509.23967 • Published Sep 28, 2025
SAND-Math: Using LLMs to Generate Novel, Difficult and Useful Mathematics Questions and Answers Paper • 2507.20527 • Published Jul 28, 2025
Article Tiny Agents in Python: an MCP-powered agent in ~70 lines of code • May 23, 2025
OGA_DML_8_6_2025 Collection Models are quantized using quark-0.9, transformers-4.50.0, OGA-0.7.1, and ORT-1.21.1, then exported via OGA-DML. • 10 items • Updated Dec 5, 2025