

nference

Making the world’s biomedical knowledge computable

With Cerebras’s powerful CS-2 system, we can train transformer models with much longer sequence lengths than we could before, enabling us to iterate more rapidly and build better, more insightful models.

Ajit Rajasekharan
CTO @ nference
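
The quote above centers on training transformers with much longer sequence lengths. As background (not nference's or Cerebras's code), standard self-attention cost grows quadratically with sequence length, which is why longer contexts are demanding on conventional hardware. A minimal illustrative sketch:

```python
def attention_flops(seq_len: int, d_model: int) -> int:
    """Rough multiply-add count for one self-attention layer's
    score computation (Q @ K^T) plus the attention-weighted sum
    over values -- both scale as seq_len^2 * d_model."""
    return 2 * seq_len * seq_len * d_model

# Quadrupling sequence length multiplies attention cost by ~16x.
short = attention_flops(1024, 768)
long_ = attention_flops(4096, 768)
print(long_ / short)  # 16.0
```

This quadratic scaling is the standard back-of-envelope argument for why long-sequence training benefits from hardware with large on-chip memory and bandwidth.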


Performance comparisons are based on third-party benchmarking or internal testing. Observed inference speed improvements over GPU-based systems may vary with workload, configuration, test date, and the models being tested.

1237 E. Arques Ave
 Sunnyvale, CA 94085

© 2026 Cerebras.
All rights reserved.