How to Build a Developer Tool with AI in a Weekend
I built and shipped a developer tool in a weekend using AI coding assistants. Not a toy — a real tool that other developers use. Here is exactly how I did it, step by step.
The Idea
The tool: a CLI that analyzes your project's dependencies and tells you which ones have AI-powered alternatives. You run ai-deps check and it outputs a list of dependencies that could be replaced or augmented with AI tools.
Simple, useful, shippable in a weekend.
Friday Evening: Scaffolding (2 hours)
Step 1: Project Setup
I started by telling Claude what I wanted to build:
I want to build a Node.js CLI tool called "ai-deps" that:
1. Reads package.json from the current directory
2. Checks each dependency against a database of AI alternatives
3. Outputs suggestions in a formatted table
4. Publishes to npm
Give me the project structure and initial setup.
Claude generated:
- package.json with proper bin configuration
- tsconfig.json for TypeScript
- src/index.ts with argument parsing (using commander)
- src/analyzer.ts for the core logic
- .github/workflows/publish.yml for automated npm publishing
Total time: 10 minutes for the initial scaffold.
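For reference, the bin wiring in package.json looks roughly like this (a sketch — names and paths are from my setup, adjust to yours):

```json
{
  "name": "ai-deps",
  "version": "1.0.0",
  "bin": {
    "ai-deps": "./dist/index.js"
  },
  "files": ["dist"],
  "scripts": {
    "build": "tsc",
    "test": "vitest run"
  }
}
```

The bin entry is what makes `ai-deps` available as a command after `npm install -g`; the target file also needs a `#!/usr/bin/env node` shebang on its first line.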
Step 2: Core Logic
I used Cursor to implement the analyzer:
// In Cursor Composer:
// "Implement the dependency analyzer. It should:
// 1. Read package.json dependencies and devDependencies
// 2. Check each against the AI_ALTERNATIVES map
// 3. Return a list of suggestions with: current dep, AI alternative, reason
// 4. Handle missing package.json gracefully"
Cursor generated the complete implementation with error handling, TypeScript types, and even a reasonable default mapping of common packages to AI alternatives.
I spent 30 minutes manually curating the AI alternatives database — this is the kind of domain knowledge AI cannot generate.
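The core of what Cursor produced looks roughly like this. A sketch, not the exact generated code — and the AI_ALTERNATIVES entries here are illustrative placeholders, not the real curated database:

```typescript
// Sketch of the analyzer core. The function is pure over a parsed
// package.json object, which keeps it easy to test.
type Suggestion = {
  dependency: string;
  alternative: string;
  reason: string;
};

// Hypothetical entries for illustration; the real database is hand-curated.
const AI_ALTERNATIVES: Record<string, { alternative: string; reason: string }> = {
  "i18n-lib": { alternative: "an LLM translation API", reason: "dynamic translations" },
  "rule-based-linter": { alternative: "an AI code reviewer", reason: "context-aware checks" },
};

function analyze(pkg: {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
}): Suggestion[] {
  // Merge runtime and dev dependencies into one lookup set.
  const deps = { ...(pkg.dependencies ?? {}), ...(pkg.devDependencies ?? {}) };
  const suggestions: Suggestion[] = [];
  for (const name of Object.keys(deps)) {
    const match = AI_ALTERNATIVES[name];
    if (match) {
      suggestions.push({ dependency: name, alternative: match.alternative, reason: match.reason });
    }
  }
  return suggestions;
}
```

Keeping the analyzer pure (no file I/O inside it) was what made the later testing step fast.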
Step 3: CLI Interface
Back to Cursor:
// "Add a formatted table output using the chalk and cli-table3 packages.
// Show: Dependency, AI Alternative, Category, Reason.
// Add color coding: green for direct replacements, yellow for augmentations.
// Add a --json flag for machine-readable output."
Cursor generated the CLI output formatting. The result looked professional immediately.
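The shape of that output logic, sketched without the chalk and cli-table3 dependencies (the real tool uses those packages; this dependency-free version just shows the idea):

```typescript
// Minimal table renderer: pad each column to its widest cell.
type Row = { dependency: string; alternative: string; category: string; reason: string };

function renderTable(rows: Row[]): string {
  const headers = ["Dependency", "AI Alternative", "Category", "Reason"];
  const cells = rows.map(r => [r.dependency, r.alternative, r.category, r.reason]);
  const all = [headers, ...cells];
  // Column width = widest cell in that column.
  const widths = headers.map((_, i) => Math.max(...all.map(row => row[i].length)));
  return all
    .map(row => row.map((cell, i) => cell.padEnd(widths[i])).join("  "))
    .join("\n");
}
```

The --json flag then becomes trivial: skip renderTable and print `JSON.stringify(rows, null, 2)` instead.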
Friday total: 2 hours. I had a working CLI that reads dependencies and outputs suggestions.
Saturday: Features and Polish (6 hours)
Step 4: Testing
I asked Cursor to generate tests:
// "Generate Vitest tests for the analyzer module.
// Test: valid package.json, missing file, empty dependencies,
// dependencies with known alternatives, dependencies without alternatives.
// Use temp directories for file system tests."
Cursor generated 12 test cases. I adjusted 3 of them for edge cases and added 2 more for specific scenarios I cared about. Total testing time: 45 minutes.
Step 5: README
// "Generate a README.md with: project description, installation (npm),
// usage examples, output screenshot placeholder, configuration options,
// contributing guidelines, and license (MIT)."
Claude generated a comprehensive README. I edited the description to be more specific and added a real usage example. Time: 20 minutes.
Step 6: Error Handling and Edge Cases
I used Claude to think through edge cases:
Here is my CLI tool code. [paste]
What edge cases am I missing? What errors could users encounter?
Claude identified:
- Monorepo support (multiple package.json files)
- Workspace dependencies
- Lock file parsing for accurate version detection
- Network errors if I add an online database later
- Permission errors reading files
I implemented the top 3 with Cursor. Time: 2 hours.
Step 7: npm Publishing Setup
Claude generated the GitHub Actions workflow for automated publishing:
# On push to main with version tag
on:
  push:
    tags: ['v*']
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          registry-url: https://registry.npmjs.org
      - run: npm ci
      - run: npm test
      - run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
Time: 15 minutes.
Saturday total: 6 hours. The tool is tested, documented, and ready to publish.
Sunday: Ship It (3 hours)
Step 8: Final Polish
- Added a --verbose flag for detailed output
- Added a --ignore flag to skip specific dependencies
- Cleaned up TypeScript types
- Ran the linter and fixed issues
Most of this was done with Cursor's inline editing. Time: 1.5 hours.
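The --ignore flag reduces to a filter over the suggestions before rendering. A sketch (parseIgnore and applyIgnore are my names for illustration):

```typescript
// --ignore accepts a comma-separated list, e.g. --ignore=lodash,axios
function parseIgnore(flag: string | undefined): Set<string> {
  return new Set((flag ?? "").split(",").map(s => s.trim()).filter(Boolean));
}

// Drop any suggestion whose dependency the user asked to skip.
function applyIgnore<T extends { dependency: string }>(
  suggestions: T[],
  ignored: Set<string>
): T[] {
  return suggestions.filter(s => !ignored.has(s.dependency));
}
```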
Step 9: Publish
npm login
git tag v1.0.0
git push --tags
GitHub Actions ran the tests and published to npm automatically. Time: 15 minutes.
Step 10: Launch
- Posted on Reddit (r/webdev, r/node)
- Tweeted about it with a screenshot
- Submitted to a few developer tool directories
Time: 1 hour.
What AI Did Well
- Scaffolding. Project setup that would take an hour took 10 minutes.
- Boilerplate. CLI argument parsing, table formatting, error handling — all boilerplate that AI generates perfectly.
- Tests. 80% of test cases were generated correctly.
- Documentation. README, contributing guide, and inline comments.
- CI/CD. GitHub Actions workflow was correct on the first try.
What I Did Manually
- Domain knowledge. The mapping of packages to AI alternatives required my expertise.
- UX decisions. How the output should look, what flags to support, what information is most useful.
- Edge case prioritization. AI identified edge cases; I decided which ones to handle.
- Marketing. AI cannot post on Reddit for you (yet).
The Numbers
- Total development time: 11 hours over a weekend
- Lines of code: ~800 (TypeScript)
- Estimated time without AI: 25-30 hours (2-3 weekends)
- Time saved by AI: ~60%
- npm downloads (first week): 340
AI did not build the tool for me. It accelerated the parts that are mechanical — scaffolding, boilerplate, tests, docs — so I could spend my time on the parts that matter: the idea, the UX, and the domain knowledge.
That is the real power of AI coding assistants. Not replacement, but acceleration.