GPT OpenSource DEMO

Website

A demo of OpenAI's open-weight models, gpt‑oss‑120b and gpt‑oss‑20b, for developers.

About GPT OpenSource DEMO

A few quick facts about these two models:

  • Architecture: Mixture-of-Experts (MoE) Transformer models
  • Knowledge cutoff: Training data includes information up to June 2024
  • Usage scenarios: They can help with a wide range of tasks, such as answering factual questions, brainstorming ideas, explaining concepts, tutoring in subjects like math or programming, drafting emails or stories, translating text, summarizing long documents, writing code snippets, and more.

Highlights

  • Permissive Apache 2.0 license: Build freely without copyleft restrictions or patent risk—ideal for experimentation, customization, and commercial deployment.
  • Configurable reasoning effort: Easily adjust the reasoning effort (low, medium, high) based on your specific use case and latency needs (see the sketch after this list).
  • Full chain-of-thought: Provides complete access to the model's reasoning process, facilitating easier debugging and greater trust in outputs. This information is not intended to be shown to end users.
  • Fine-tunable: Fully customize the models to your specific use case through parameter fine-tuning.
  • Agentic capabilities: Use the models' native capabilities for function calling, web browsing, Python code execution, and Structured Outputs (also illustrated in the sketch below).
  • Native MXFP4 quantization: The models are trained with native MXFP4 precision for the MoE layer, allowing gpt-oss-120b to run on a single H100 GPU and gpt-oss-20b to run within 16 GB of memory.
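
To make the reasoning-effort and function-calling highlights concrete, here is a minimal sketch using the official openai Python SDK. It assumes gpt-oss-20b is already being served locally behind an OpenAI-compatible endpoint (for example via vLLM or Ollama); the base URL, API key, model identifier, and the get_weather tool are illustrative placeholders, not part of this demo.

    from openai import OpenAI

    # Minimal sketch: gpt-oss-20b served locally behind an OpenAI-compatible
    # endpoint (e.g., vLLM or Ollama). base_url, api_key, and the model name
    # below are assumptions, not part of this repository.
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

    # A made-up tool definition in the standard chat-completions "tools"
    # schema, illustrating the models' native function-calling capability.
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Return the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ]

    response = client.chat.completions.create(
        model="gpt-oss-20b",
        messages=[
            # Reasoning effort is typically steered through the system prompt
            # ("Reasoning: low | medium | high"); the exact mechanism can
            # differ between serving stacks.
            {"role": "system", "content": "Reasoning: medium"},
            {"role": "user", "content": "What is the weather in Paris right now?"},
        ],
        tools=tools,
    )

    message = response.choices[0].message
    # If the model chose to call the tool, the request appears in tool_calls;
    # otherwise the plain text answer is in message.content.
    print(message.tool_calls or message.content)

The same request shape applies to gpt-oss-120b; only the model identifier changes. Check your serving stack's documentation for how it expects reasoning effort to be set.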

More links

Screenshots