Hardware Copilots in 2025
Patrick Paul
May 5, 2025
AI Codegen

AI-assisted code has exploded since the launch of ChatGPT in late 2022, growing from a conversational chatbot partner and autocomplete engine into, by 2025, "vibe coding," where you can one-shot entire SaaS web applications or browser video games from a single prompt.

How far are we from one-shotting printed circuit boards (PCBs) for hardware prototypes from a single prompt?

Prototyping Hardware Products

Ten years ago, when we built Freewrite, the conventional wisdom was to build separate "looks like" and "works like" prototypes for any hardware product idea. Building a minimum viable product (MVP) was supposed to be two different undertakings for two different prototypes: a looks-like prototype focused solely on aesthetics, form factor, and user interface (e.g. a solid mahogany stereo housing; knobs, switches, buttons) with no working electrical components; and a second works-like prototype that could be a mess of development boards and wires hot-glued into a cardboard shoebox, with working firmware and maybe a basic interface like LED lights and switches.

The parallel development of a "looks like" and a "works like" prototype removes the constraints of fitting electronics into tight spaces, speeds time to market, and helps you quickly validate the user experience of the product while evaluating its technical feasibility. You can de-risk some of your product's challenges through rapid iteration, and future spins of the prototype (version two, version three, version seventeen?) eventually converge into your "works-like-looks-like" prototype that unifies everything.

Gluttons for punishment that we were then, and presumably still are, Adam and I of course went direct to Kickstarter with our version-1 works-like-and-looks-like prototype ... but, yeah, anyway, this article will use the two-prototypes approach as a springboard for discussing some copilot options available for text-to-CAD and text-to-PCB.

Airband as Guinea Pig

First, a prelude. The last serious hardware prototype I started out on was Airband. This was to be a standalone podcast player. There was a simple monochrome display for playback information; a slide potentiometer for time-scrubbing forwards and backwards; dials for volume and power; and an 8-position rotary switch for subscribing to eight different podcasts. The eight LEDs encircling the rotary switch would go from blue to indicate "new podcast episode available to stream" to green for "new podcast downloaded" to finally orange for "no new episodes for you to listen to right now".

This project was from 2019 into early 2020. I have a really hard time remembering what happened to halt development.

My desk rental at the time was attached to a makerspace, and when someone at the makerspace finally fell ill, Governor Cuomo sent us all home. It's kind of eerie looking at my firmware git log for that period and seeing my last commit messages for the project on March 25th.

One part of why I never picked the project back up after the pandemic lockdowns is a lingering ennui that the Golden Age of podcasts had slipped away during the pandemic. Before 2020, we had Serial and a vibrant wave of independents and small studios experimenting with the medium. After the pandemic, the landscape felt different: Joe Rogan went exclusive with Spotify, and the scene became saturated with monetization and clout-chasing personalities who didn't usually have much to add. The rare magic had passed for me. I'm not sure if all the lock-in and isolation contributed to inflated listenership and thereby recruited capital and enshittification to the medium, but I'm sure there are some think-pieces on this to be found elsewhere.

The other bigger part of why I never picked it back up is called Life.

Airband-2

My new foray into AI copilots for web development in 2025 has sort of renewed my interest in hardware products, however, if in a roundabout way.

I have been plugging away at autonomous software development through my contributions to ra-aid (recently added as a core maintainer; 1.9k stars on GitHub), an AI assistant for coding, and to django-mcp (1 star—just me as creator; stealth mode and not yet announced), a library for hosting MCP tools that chatbots and IDEs like Cursor and Claude Desktop can invoke. As evidenced by my contributions to these two projects, I'm super enthusiastic about generative AI source code, and this enthusiasm is turning me back to hardware to try out AI codegen there.

So now I've been revisiting Airband in a slightly different form factor. Ten years ago we accomplished the smart typewriter project with a lot of hard human work, but maybe the state of the art in generative AI is ready to take over. While I have confidence that I could get this product to the finish line with mine own two hands, maybe it can now be done hands-free?

Looks-like prototype: Text-to-CAD

I'll be brief here, as my enthusiasm is primarily for the electrical engineering side of things, but I and others have already been using parametric CAD modeling libraries like SolidPython and CadQuery to produce CAD models using just Python. And LLMs including ChatGPT, Claude, and Google Gemini are fantastic at generating Python code.

As a super simple example, here is how you might model a solid for a 3.7V battery:

# pkcell_battery.py
from solid import polygon, linear_extrude, scad_render_to_file

# 3.7V LiPo pouch cell: 60mm x 36mm x 7mm
length = 60
width = 36
height = 7

def battery():
    # Extrude a 2D rectangular outline up to the cell's height
    return linear_extrude(height=height)(
        polygon(points=[
            (0, 0),
            (0, width),
            (length, width),
            (length, 0)
        ])
    )

if __name__ == "__main__":
    # Write an OpenSCAD file that can be opened in the OpenSCAD GUI
    scad_render_to_file(battery(), "pkcell_battery.scad")

This battery solid can be imported into other design files and rendered in the OpenSCAD GUI application. CadQuery provides its own GUI application that can render solids and export files suitable for 3D printing.
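For comparison, CadQuery can model the same solid even more tersely. A minimal sketch, assuming the cadquery package (the output filename is arbitrary):

# battery_cq.py
import cadquery as cq

# Same 60mm x 36mm x 7mm battery volume as above
battery = cq.Workplane("XY").box(60, 36, 7)

# Export an STL suitable for slicing and 3D printing
cq.exporters.export(battery, "battery.stl")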

Zoo.dev is also definitely worthy of a shout-out here, though I am personally happy to use Python directly for modeling for my next prototype.

Zoo Modeling App is an open-source hardware design interface that allows you to generate CAD models, both by editing code and with point-and-click actions.

Works-like prototype: Text-to-PCB

For circuit boards, we have a few different steps to tackle:

  1. Product requirements document (PRD)
    • What does the device need to do? How is it going to do it?
    • Questions like WiFi or Bluetooth LE
  2. Component selection (Block diagram & bill of materials)
    • High-level visual of system architecture
    • Shows functional units: CPU, sensors, power management, memory, etc.
    • Indicates how blocks interact (e.g., SPI lines, USB connection, power rails)
    • No pin numbers yet
  3. Schematic design (datasheets to pin-outs to schematics)
    • Retrieve datasheets for actual components
    • Specify pin numbers, voltages, signal interfaces
    • Generates files for exporting to PCB software
  4. PCB layout
    • Design stencil for circuit board
    • Drag and drop component footprints onto board
    • Route air-wires from the schematic components to actual traces on the circuit board

Product requirements. I've mostly figured this out already, either in the shower or on long walks around the city. When AI agents take over PCB design they will still need to be directed by us humans -- until Skynet surfaces, anyway.

Component selection. I've developed a lot of familiarity over the years with the ESP32 and STM32 microcontroller families. I've mostly ironed out component selection already too, though a lot of it was handled conversationally with ChatGPT and Claude. It is important to note, however, that you still need to do a tremendous amount of due diligence: even within the ESP32 family, the LLMs kept providing dubious or flat-out wrong answers about particular SKUs' support for Bluetooth Classic versus Bluetooth LE and the like. You also need to know your own concrete requirements for things like SRAM, bus speed, and power budget, which may be informed by your own experience, although I think chatbots can be helpful here.
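One cheap guardrail here: keep a small table of vendor-verified capabilities and mechanically check any LLM suggestion against it. A minimal sketch in Python (the radio table reflects Espressif's published specs as I understand them; the check function is my own):

# mcu_checklist.py
# Radio support by ESP32 family, per Espressif datasheets (verify before use)
RADIO_SUPPORT = {
    "ESP32":    {"wifi": True, "ble": True,  "bt_classic": True},
    "ESP32-S2": {"wifi": True, "ble": False, "bt_classic": False},
    "ESP32-S3": {"wifi": True, "ble": True,  "bt_classic": False},
    "ESP32-C3": {"wifi": True, "ble": True,  "bt_classic": False},
}

def unmet_requirements(family, needs):
    """Return any required capabilities the part lacks."""
    caps = RADIO_SUPPORT[family]
    return [n for n in needs if not caps.get(n, False)]

# An LLM suggests an ESP32-S3 for a Bluetooth Classic audio sink:
print(unmet_requirements("ESP32-S3", {"wifi", "bt_classic"}))  # ['bt_classic']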

This brings us to the schematic and PCB layout.

Schematic design

After you have selected your microcontroller, your buttons and switches, and any other integrated circuit (IC) parts, you need to define them as text so that they can be manipulated by AI agents. There are two opportunities for AI that I see here, namely: 1) using LLMs to parse and extract pin numbers and other metadata, and 2) defining these ICs and their relationships in a quasi-programming language that LLMs understand. I'll start with number two.

In this post I'm going to highlight one schematic-to-code tool I've played around with and link out to a few other tools in the space. The basic gist is that an LLM chatbot would be able to use the syntax and structure of a code library to generate the schematic from its own recall of which components and pin numbers you are using.

Atopile

Last summer I first came across atopile and had fun trying out schematic design as source code, analogous to how I had already developed CAD models in SolidPython and CadQuery.

At the time, Atopile provided its own .ato file syntax, which resulted in something like this:

# Pin header for ESP-PROG debugger

component EspProgUart:
    footprint = "PinHeader_2x03_Vertical"

    power = new Power
    uart = new UART
    boot = new GPIO
    reset = new GPIO

    signal esp_en       ~ pin 1  # reset pin
    signal power_3v3    ~ pin 2
    signal esp_txd      ~ pin 3
    signal gnd          ~ pin 4
    signal esp_rxd      ~ pin 5
    signal esp_io0      ~ pin 6  # boot mode selection

    power.vcc           ~ power_3v3
    power.gnd           ~ gnd
    uart.tx             ~ esp_txd
    uart.rx             ~ esp_rxd
    uart.gnd            ~ gnd
    boot.io             ~ esp_io0
    boot.gnd            ~ gnd
    reset.io            ~ esp_en
    reset.gnd           ~ gnd

This would add a symbol for a 2x3 pin header to your schematic to wire out to an external ESP-Prog board for debugging.

Also last summer, on vacation, I managed to rewrite my old Airband-1 prototype as Atopile modules and was then able to export these to KiCAD. KiCAD is a GUI application for schematics and circuit board design that ultimately produces the files you need to hire a circuit board fab shop to manufacture your PCBs. At press time, however, it appears Atopile is in the process of moving away from .ato files to a newer Python language framework.

Why Python may be better for AI

Moving to Python should be a positive development for generative AI copilot usage. The corpus of training data that LLMs like ChatGPT and Claude are trained on is predominantly Python and JavaScript, and for this reason AI excels at Python. Conversely, LLMs can struggle with less popular languages.

Anecdotally, I have had poor experiences using obscure third-party libraries (in one case, a Rust project) that may only have a few hundred downloads. These libraries are too obscure and certainly aren't included in any LLM's corpus of training data. That said, recent advancements like web research and tool calling have enabled AI agents to fetch documentation autonomously to improve generated code. In particular, I've used Tavily and context7 for retrieving documentation. And for a recent Rust project, I had great results vendoring in (i.e. copy/pasting) the entire third-party library's source code into a project folder and specifically instructing my AI copilot to reference those files for function signatures and other implementation details.

Other hardware-as-code products I've found:

  • JITX: Next‑generation software to design better electronics
  • tscircuit: Build electronics with code, AI, and drag'n'drop tools
  • Flux.ai: a copilot that designs, reads datasheets, routes your board

Before the schematic: Datasheets

Before any AI agent can describe your circuits and ICs in atopile or any other syntax, it needs to understand the meaning of the words and pin tables in datasheets (hopefully in English, but sometimes Chinese) and incorporate this knowledge into its context.

Last summer, I had to manually search for and download the PDF datasheets for every part number in my project and translate them into Atopile modules. From there, I was able to export the schematic to KiCAD and begin placing and routing the circuit board.

The role for an AI agent or copilot, however, is to 1) autonomously retrieve PDF datasheets from the web, 2) parse them into meaningful pin descriptions, and 3) hopefully understand any other details and constraints and create sub-tasks for anything analog.
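Here's roughly what step 1 could look like today. A minimal sketch, assuming the tavily-python package (the part number, query, and filename are placeholders):

# fetch_datasheet.py
import requests
from tavily import TavilyClient

client = TavilyClient(api_key="tvly-YOUR-KEY")

# Search the web for a PDF datasheet for a placeholder part number
response = client.search(query="ESP32-S3-WROOM-1 datasheet filetype:pdf")

for hit in response["results"]:
    url = hit["url"]
    if url.lower().endswith(".pdf"):
        # Save the first PDF hit for downstream parsing
        with open("datasheet.pdf", "wb") as f:
            f.write(requests.get(url, timeout=30).content)
        break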

Based on some Googling as I write this, parsing datasheets for circuits doesn't seem to be a solved problem yet.

For step 1, I do feel an LLM web search tool like Tavily or various other documentation retrieval tools will be satisfactory for fetching PDF datasheets, as sketched above. But AI is currently going to struggle with steps 2 and 3 around datasheet parsing. I can think of a couple further strategies to explore that might help:

  • Multiple agents ("LLM-voting"): have multiple LLMs parse and report pins, and consensus wins (see the sketch after this list)
  • Parsing datasheet into sub-task prompts (more below)
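A minimal sketch of the voting idea. The parse_pins helper is hypothetical (wire it up to your LLM clients of choice); only the consensus logic is shown:

# pin_voting.py
from collections import Counter

def parse_pins(model, datasheet_text):
    """Hypothetical helper: ask `model` to extract a {pin_number: name} map."""
    raise NotImplementedError  # wire up to your LLM client of choice

def consensus_pinout(models, datasheet_text):
    # Collect one candidate pin-out per model
    candidates = [parse_pins(m, datasheet_text) for m in models]
    agreed = {}
    for pin in {p for c in candidates for p in c}:
        votes = Counter(c[pin] for c in candidates if pin in c)
        name, count = votes.most_common(1)[0]
        if count >= 2:  # majority across three models wins
            agreed[pin] = name
        # disagreements get flagged for human review instead
    return agreed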

Whereas a normal prompt might be something like "extract this motion sensor IC datasheet PDF into a pin-out in atopile syntax" that you expect the LLM to one-shot, you might instead consider a workflow that splices the entire datasheet into sub-tasks, each of which gets evaluated.

The MCP tool claude-task-master will already expand complex tasks into sub-tasks that are tackled one at a time, in turn. And if you pass the --tasks 100 argument it will try to create 100 sub-tasks even if your primary task is "print out hello world". There may similarly be a way to add a stage to agentic AI workflows where each paragraph in a datasheet must produce three new sub-task prompts in a project folder called datasheet-prompts/, each evaluated in turn after the initial candidate pin-out is drafted. This is just an idea, completely speculative and untested, but it does roughly reflect how we humans analyze and review designs.
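Completely speculative, but here is a sketch of that stage. The paragraph splitting and the three review angles below are my own invention:

# datasheet_prompts.py
from pathlib import Path

REVIEW_ANGLES = [
    "Does the drafted pin-out contradict anything in this paragraph?",
    "List any electrical constraints (voltages, currents, timing) stated here.",
    "Does this paragraph call for external components (pull-ups, decoupling caps)?",
]

def write_subtask_prompts(datasheet_text, out_dir="datasheet-prompts"):
    """Write three review prompts per datasheet paragraph to disk."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    paragraphs = [p.strip() for p in datasheet_text.split("\n\n") if p.strip()]
    for i, para in enumerate(paragraphs):
        for j, angle in enumerate(REVIEW_ANGLES):
            # One prompt file per (paragraph, review angle) pair
            (out / f"para{i:03d}_task{j}.md").write_text(f"{angle}\n\n---\n{para}\n")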

PDFs have always been a garbage way to describe ICs. We're still in AI's infancy, so hopefully first-party players like manufacturers, distributors, or Octopart can simply adopt Atopile or some other digital format that plays nice with LLMs and schematics, and build a library of verified part schematic symbols.
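If that ever happens, a verified part record wouldn't need to be complicated. A hypothetical format (not any existing standard; the pin assignments here are illustrative):

# verified_part.py -- hypothetical machine-readable part record
PART = {
    "mpn": "ESP32-S3-WROOM-1",
    "manufacturer": "Espressif",
    "verified_by": "distributor",
    "pins": {
        1: "GND",
        2: "3V3",
        3: "EN",
        # ...and so on for the remaining pins
    },
}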

After the schematic: PCB placement and routing

This blog post is super long, so I've gotta stop writing here. For my next project, I do plan to try out whatever state-of-the-art autorouting tools are on offer, and will hopefully survey them in a list for this blog. One promising app is Quilter.ai. More soon.

Closed Beta

As a post-script to this blog post: there are two more products I have come across, both in closed beta, that look super promising:

  • SnapMagic
  • AutoPCB