Verkor's AI agent creates a RISC‑V CPU core from a 219‑word spec in 12 hours
At a glance:
- Design Conductor turned a 219‑word requirements sheet into a verified RISC‑V core in 12 hours
- The CPU, named VerCore, was validated in Spike simulation using the academic ASAP7 PDK, not fabricated in silicon
- Verkor estimates 5–10 human experts are still needed and compute costs rise non‑linearly with design complexity
What Verkor demonstrated
Verkor.io published a research paper in March describing Design Conductor, an autonomous AI system that turned a concise 219‑word specification into a complete RISC‑V CPU core, named VerCore, within a 12‑hour window. Traditional commercial chip projects typically span 18 to 36 months, making the reported speed a dramatic outlier. The team verified the design in simulation rather than fabricating silicon: they booted a uClinux variant on the core using Spike, the reference RISC‑V ISA simulator, and targeted the ASAP7 academic process design kit, which mimics a 7 nm node but is not a production‑grade technology.
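For readers unfamiliar with this kind of simulation-only validation, a run like the one described would typically be driven by a Spike command line. The sketch below builds such a command in Python; Spike's `--isa` and `-m` flags are real, but the file paths, ISA string, and memory size are illustrative assumptions, not details from the paper.

```python
# Sketch: driving a Spike-based boot test like the one Verkor describes.
# The bbl path, ISA string, and memory size below are assumptions.
import shlex

def spike_boot_cmd(kernel_elf, isa="rv64imac", mem_mib=256):
    """Build a Spike command line that boots a kernel payload (e.g. a
    uClinux image wrapped in a boot loader ELF). --isa selects the ISA
    string; -m sets the simulated memory size in MiB."""
    return ["spike", f"--isa={isa}", f"-m{mem_mib}", kernel_elf]

cmd = spike_boot_cmd("build/bbl")
print(shlex.join(cmd))
```

In a real flow this list would be handed to `subprocess.run` and the simulator's console output checked for a successful kernel boot message.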
How the AI system works and where it stumbles
Design Conductor orchestrates the full flow from high‑level spec to GDSII layout. It leverages large language models (LLMs) to generate RTL code, perform synthesis, and iterate on timing closure. The paper is candid about shortcomings: the agent sometimes underestimates task complexity, such as opting to deepen the pipeline when a simpler timing fix would suffice. In another instance, the model treated Verilog—a fundamentally event‑driven language—as if it were sequential, complicating debugging without breaking functional correctness. These quirks highlight that the underlying LLMs still lack deep hardware‑design intuition.
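The Verilog misstep is easier to see with a concrete case. In Verilog, nonblocking assignments at a clock edge sample every right-hand side before committing any update, so `a <= b; b <= a;` swaps two registers; read sequentially, as ordinary software, the second statement sees the already-updated value and the swap is lost. The toy Python model below (not from the paper) contrasts the two readings:

```python
# Why sequential intuition fails for Verilog: a toy model of one clock
# edge executing the statements "a <= b; b <= a;".
def clock_edge_nonblocking(state):
    # Event-driven semantics: sample all right-hand sides first...
    new_a, new_b = state["b"], state["a"]
    # ...then commit the updates together.
    return {"a": new_a, "b": new_b}

def clock_edge_sequential(state):
    # Naive sequential reading of the same two statements.
    state = dict(state)
    state["a"] = state["b"]
    state["b"] = state["a"]  # sees the already-updated a
    return state

s = {"a": 1, "b": 2}
print(clock_edge_nonblocking(s))  # {'a': 2, 'b': 1} — values swap
print(clock_edge_sequential(s))   # {'a': 2, 'b': 2} — swap lost
```

An agent reasoning with the second model can generate RTL that simulates correctly by accident yet is hard to debug, which matches the paper's observation.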
Human oversight remains essential
Despite the automation, Verkor projects that five to ten seasoned engineers will be required to shepherd the AI output toward a production‑ready chip. Human experts must interpret the AI’s suggestions, resolve timing violations, and adapt the design for real‑world constraints like power, area, and manufacturability. Moreover, the compute demand grows non‑linearly as designs become more complex, meaning the current workflow may become impractical for large‑scale commercial projects without significant hardware investment.
Context within the AI‑chip race
Verkor’s achievement joins a short list of AI‑driven chip designs. In 2023, Chinese researchers reported a RISC‑V CPU generated in under five hours, and the QiMeng project later showcased a similar approach. Unlike those efforts, Design Conductor claims end‑to‑end autonomy from specification to GDSII, though all share the limitation of lacking physical silicon. The broader industry is watching to see whether such AI pipelines can compress design cycles enough to offset the high cost of mask sets and fabrication.
Next steps and upcoming demonstrations
Verkor plans to release VerCore's RTL source code and build scripts by the end of April, inviting the community to reproduce the results. The startup also intends to demonstrate an FPGA implementation of the core at DAC, the annual Design Automation Conference. If the FPGA run validates performance and power targets, it could pave the way for a future silicon tape‑out, turning the AI‑generated blueprint into a marketable product.
What this means for chip designers
For traditional design houses, the paper suggests a potential shift toward AI‑augmented flows that handle routine RTL generation and early‑stage verification, freeing engineers to focus on system‑level optimization and validation. However, the need for expert oversight and the current simulation‑only status mean the technology is still a complement—not a replacement—for established design methodologies. Companies that can integrate such agents into their existing EDA toolchains may gain a competitive edge in speed‑to‑market.
FAQ
How long did Design Conductor take to generate the VerCore CPU from the specification?
Twelve hours, starting from a 219‑word requirements sheet.
Has the VerCore CPU been fabricated in silicon?
No. The design has only been validated in Spike simulation with the academic ASAP7 PDK; an FPGA demonstration is planned, and any silicon tape‑out would come later.
What level of human involvement does Verkor anticipate for future AI‑generated chips?
Verkor estimates that five to ten experienced engineers are still needed to carry the AI's output to a production‑ready chip.
Prepared by the editorial stack from public data and external sources.