We Open-Sourced Our AEO Audit Engine
We wanted a way to explain technical AEO work without relying on vague frameworks or proprietary mystery scores. Publishing the core audit engine as a public GitHub repo and npm package gave teams something concrete to inspect and use.
Why we built it in the open
AEO conversations are full of loose language. Teams hear terms like AI SEO, GEO, LLM optimization, and answer engine visibility, but they rarely get a clear model for what should be fixed first.
Publishing the engine meant turning our assumptions into explicit factors, weights, and outputs. That makes the work easier to inspect, test, and improve.
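To make "explicit factors, weights, and outputs" concrete, here is a minimal sketch of what a weighted factor model looks like. The factor names and weights below are illustrative assumptions, not the actual 13 factors or weights used by @ainyc/aeo-audit; see the repository for the real model.

```javascript
// Illustrative sketch of an explicit factor/weight scoring model.
// NOTE: these factor ids and weights are assumptions for the example,
// not the real @ainyc/aeo-audit factors.
const FACTORS = [
  { id: "structured-data", weight: 0.2 },
  { id: "llms-txt", weight: 0.1 },
  { id: "crawlability", weight: 0.3 },
  { id: "content-clarity", weight: 0.4 },
];

// checkResults maps factor id -> pass ratio between 0 and 1.
function scoreSite(checkResults) {
  let total = 0;
  for (const factor of FACTORS) {
    const result = checkResults[factor.id] ?? 0; // missing check scores 0
    total += factor.weight * result;
  }
  return Math.round(total * 100); // overall score on a 0-100 scale
}

const score = scoreSite({
  "structured-data": 1,
  "llms-txt": 1,
  "crawlability": 0.5,
  "content-clarity": 0.75,
});
console.log(score); // 75
```

Because the weights are plain data rather than a hidden heuristic, anyone can inspect them, dispute them, or re-run the math by hand.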
What the package actually does
@ainyc/aeo-audit is a public CLI and JavaScript library that audits 13 technical and content factors we believe correlate with AI citation readiness. It is designed for websites that want to understand whether answer engines can parse, trust, and recommend them. The source is on GitHub under the MIT license.
The package supports terminal use, JSON output for machine-readable workflows, markdown output for reporting, and programmatic usage through the exported runAeoAudit API.
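As a sketch of the machine-readable workflow, the snippet below consumes an audit result object and renders it as a short markdown report. The result shape here is an assumption for illustration; consult the package README for the actual schema emitted by the CLI and by runAeoAudit.

```javascript
// Hypothetical audit result, shaped like JSON output from an audit run.
// The field names (url, score, factors) are assumptions for this example.
const exampleResult = {
  url: "https://example.com",
  score: 72,
  factors: [
    { name: "Structured data", passed: true },
    { name: "llms.txt present", passed: false },
  ],
};

// Render a result object as a short markdown report.
function toMarkdown(result) {
  const lines = [
    `# AEO Audit: ${result.url}`,
    `Overall score: ${result.score}/100`,
    "",
  ];
  for (const f of result.factors) {
    lines.push(`- [${f.passed ? "x" : " "}] ${f.name}`); // task-list style
  }
  return lines.join("\n");
}

console.log(toMarkdown(exampleResult));
```

The same JSON-in, report-out pattern is what makes the output usable in CI pipelines or dashboards, not just in a terminal.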
How the skill layer fits in
The package also ships five skills for recurring AEO workflows. We refer to them publicly as OpenClaw / Claude Code skills because they are designed to turn the raw audit engine into repeatable operational flows. The skill suite is also available on ClawHub.
That matters for client work. A score alone does not fix a site; teams need an audit workflow, a fix workflow, validation steps, llms.txt generation, and a monitoring loop.
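One step in that loop, llms.txt generation, can be sketched in a few lines. This follows the proposed llms.txt convention (an H1 title, a blockquote summary, then sections of links); the input shape is a simplified assumption, and the real skill in the package may work differently.

```javascript
// Hedged sketch of llms.txt generation under the proposed llms.txt
// convention. The `site` input shape is an assumption for this example.
function generateLlmsTxt(site) {
  const lines = [`# ${site.name}`, "", `> ${site.summary}`, ""];
  for (const section of site.sections) {
    lines.push(`## ${section.title}`, "");
    for (const page of section.pages) {
      lines.push(`- [${page.title}](${page.url}): ${page.note}`);
    }
    lines.push("");
  }
  return lines.join("\n").trimEnd() + "\n";
}

const txt = generateLlmsTxt({
  name: "Example Co",
  summary: "What the company does, in one sentence.",
  sections: [
    {
      title: "Docs",
      pages: [
        {
          title: "Getting started",
          url: "https://example.com/start",
          note: "setup guide",
        },
      ],
    },
  ],
});
console.log(txt);
```

Generating the file from structured site data, rather than writing it by hand, keeps it in sync as pages change.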
Why this matters for agency work
The open-source package is not separate from the service. It reflects how AI NYC thinks about technical AEO: clear scoring, documented signals, and practical workflows.
Clients can review the same model that guides our audits instead of relying on vague claims about proprietary methodology.
FAQ
Is @ainyc/aeo-audit a real public package?
Yes. The package is publicly documented as @ainyc/aeo-audit, with a public GitHub repository at AINYC/aeo-audit, and it can be run directly with npx or installed from npm.
Does the public tool replace agency work?
No. The public tool gives a transparent technical baseline. Agency work adds prompt strategy, content architecture, execution, and monitoring across live buyer queries.
What is the relationship between the package and the skills suite?
The public package includes five documented skills that wrap the engine into practical workflows for auditing, fixing, monitoring, schema validation, and llms.txt generation.
Where should a technical team start?
Start with the public package if you want to inspect the scoring model, and start with the free audit at /audit if you want the fastest path into AI NYC's service workflow.
Try it yourself.
Run a free AEO audit to see how your site scores, or explore the tools and pages referenced in this article.