Building dbRosetta Part 5: We Need an API
Because I don’t want to have to fight with our support team (they’re awesome, but busy), I decided that, initially, I’m going to host dbRosetta at ScaryDBA.com. I have full control of the website, and I won’t be breaking Redgate Software entirely if I accidentally do something silly. Before starting the process of developing our next prompt, or set of prompts, I discussed the project with CoPilot. We agreed to break the next part into two pieces. In this one, we’ll build an API interface. Why an API? Well, I’ll let CoPilot explain it:
An API provides a secure, consistent way to expose dbRosetta’s database so applications and websites can interact with it without direct access to the underlying system. By adding this layer, we separate the database from the front‑end, making it easier to build interactive features while keeping the data protected and portable.
And there you have it. If it’s not clear, this is Part 5 of a multi-part series wherein I’m building a database and application using LLMs. I’m writing everything myself, except the parts, like above, where I call out the AI.
Let’s get going.
Generating the API Prompt
As I’ve done throughout this series, I’m writing prompts to the CoPilot LLM that are then used to generate a prompt for GitHub Copilot. Here’s how I defined my own prompt for this step and the output I received:
My prompt:

Let’s get started on the API for dbRosetta. I’d like you to generate a prompt for GitHub Copilot within VS Code using the agreed-upon structure. This prompt will define the steps necessary to create an API within Azure that accesses our PostgreSQL database for dbRosetta. It will be a secure API to protect access to the database. The API will provide mechanisms for querying the information stored in the PostgreSQL database. The API will be designed to support a web page running within WordPress, which we’ll build later. We’ll need to incorporate the API and its code into our existing GitHub repository. Deployment of the API and any updates will be done using our GitHub Actions pipeline already created. We’ll also need a full test suite to validate that the API is working appropriately.

CoPilot’s response:

Project: dbRosetta Secure API — Implementation Plan and Tasks
Context

You are an expert backend engineer and DevOps automation assistant. Implement a secure, production-ready REST API for dbRosetta that runs in Azure, accesses our PostgreSQL database, and is consumable by a WordPress front end. All code must be TypeScript, follow best practices, include a full test suite, and integrate into the existing GitHub repository and GitHub Actions pipeline.
High-level Goals

1. Create a secure, well-documented REST API that exposes read and query endpoints for dbRosetta PostgreSQL data.
2. Use Azure-managed services and security best practices (Managed Identity, Key Vault, private endpoints, firewall rules).
3. Integrate code into the existing GitHub repo with clear branch/PR workflow and CI/CD via existing GitHub Actions.
4. Provide unit, integration, and end-to-end tests plus a Postman collection and contract tests.
5. Produce OpenAPI (Swagger) documentation and example client usage for WordPress (AJAX/REST calls).
Constraints and Nonfunctional Requirements

- Language: **TypeScript** (Node.js 18+).
- Framework: **Fastify** (or Express if preferred) with dependency injection and typed request/response models.
- ORM/DB Layer: **TypeORM** or **Prisma** (prefer Prisma for type-safety and migrations).
- Secrets: **Azure Key Vault** for DB connection strings and JWT signing keys.
- Authentication: **OAuth2 / OpenID Connect** with Azure AD or JWT bearer tokens issued by a trusted identity provider; support role-based access control (RBAC).
- Network: Use **Azure Private Endpoint** for PostgreSQL and restrict API access via Azure App Service VNet integration or Azure API Management.
- CI/CD: Use existing GitHub Actions pipeline; add jobs for build, lint, test, migration, and deploy.
- Observability: Structured logging (Winston/Pino), distributed tracing (OpenTelemetry), and Azure Monitor integration.
- Security: Parameterized queries, input validation, rate limiting, CORS restricted to WordPress domain(s), and automated dependency scanning.
Deliverables

- New API service in
- GitHub Actions workflow updates or new workflow file that plugs into existing pipeline steps
- Postman collection and example WordPress client snippet
- Test coverage report and contract tests
- Security checklist and runbook for rotating secrets and emergency rollback
API Surface and Example Endpoints

Design endpoints to be RESTful, paginated, and filterable. Example endpoints:
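As an aside on what “paginated and filterable” implies in practice, a minimal pagination helper might look like this. This is my sketch, not part of the generated plan; the default and maximum page sizes are assumptions:

```typescript
// Hypothetical pagination helper: clamps untrusted query-string input and
// derives LIMIT/OFFSET values for the database layer.
// The defaults (page size 25, max 100) are assumptions, not the plan's values.
function paginate(pageRaw: unknown, sizeRaw: unknown, maxSize = 100) {
  const page = Math.max(1, Math.floor(Number(pageRaw)) || 1);
  const pageSize = Math.min(maxSize, Math.max(1, Math.floor(Number(sizeRaw)) || 25));
  return { page, pageSize, limit: pageSize, offset: (page - 1) * pageSize };
}
```

The point is that client input never flows directly into a query; it is coerced, clamped, and only then used.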
Security and Access Control

- **Authentication**: Validate JWTs issued by Azure AD or a configured OIDC provider. Reject unsigned or expired tokens.
- **Authorization**: Implement RBAC with scopes/roles (e.g.,
- **Secrets**: Do not store secrets in repo or environment variables in plaintext. Use Azure Key Vault and assign the API a Managed Identity to fetch secrets at runtime.
- **Network Security**: Use private endpoints for PostgreSQL; restrict API inbound traffic via Azure API Management or App Service access restrictions; enable TLS everywhere.
- **SQL Safety**: Use Prisma/TypeORM parameterized queries; never interpolate user input into raw SQL. Validate and sanitize all inputs.
- **Rate Limiting and Throttling**: Implement per-IP and per-client rate limits; return standard 429 responses.
- **CORS**: Restrict to the WordPress domain(s) and admin origins only.
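To make the “reject unsigned or expired tokens” plus RBAC bullets concrete, here is a dependency-free sketch of the moving parts behind an HS256 check. A real service would use a vetted library (the plan names Azure AD/OIDC validation); the role names and claims shape here are my assumptions:

```typescript
// Hypothetical sketch only: signature verification, expiry check, and a role
// check. Production code should use a maintained JWT/OIDC library instead.
import { createHmac, timingSafeEqual } from "crypto";

function b64url(s: string): string {
  return Buffer.from(s).toString("base64url");
}

function verifyHs256(token: string, secret: string, requiredRole: string): boolean {
  const [header, payload, signature] = token.split(".");
  if (!header || !payload || !signature) return false; // unsigned/malformed

  // Recompute the signature over header.payload and compare in constant time.
  const expected = createHmac("sha256", secret)
    .update(`${header}.${payload}`)
    .digest("base64url");
  const a = Buffer.from(signature);
  const b = Buffer.from(expected);
  if (a.length !== b.length || !timingSafeEqual(a, b)) return false;

  const claims = JSON.parse(Buffer.from(payload, "base64url").toString());
  if (typeof claims.exp !== "number" || claims.exp * 1000 < Date.now()) return false; // expired
  return Array.isArray(claims.roles) && claims.roles.includes(requiredRole);          // RBAC
}

// Helper to mint a token locally so the check can be exercised in tests.
function signHs256(claims: object, secret: string): string {
  const header = b64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const payload = b64url(JSON.stringify(claims));
  const sig = createHmac("sha256", secret).update(`${header}.${payload}`).digest("base64url");
  return `${header}.${payload}.${sig}`;
}
```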
Database Access and Migrations

- Use Prisma (recommended) with a schema that maps to existing PostgreSQL tables; include migration scripts.
- Provide a safe read-only query DSL for complex queries that maps to parameterized Prisma queries.
- Add a migration job in CI that runs
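The “safe read-only query DSL” bullet is the part I’d watch most closely. Conceptually it means something like this sketch: filters map to placeholders, values travel separately, and column names are whitelisted. The table and column names here are my guesses, and a real implementation would sit on Prisma rather than raw SQL:

```typescript
// Hypothetical read-only query DSL: user input never gets interpolated into
// SQL text. Column names are whitelisted; values become $1, $2, ... parameters.
type Filter = { column: string; op: "eq" | "like"; value: string };

// Assumed schema columns; the real whitelist would come from the Prisma schema.
const ALLOWED_COLUMNS = new Set(["dialect", "term", "category"]);

function buildQuery(table: string, filters: Filter[]): { text: string; values: string[] } {
  // The table name must come from a server-side constant, never from the client.
  const clauses: string[] = [];
  const values: string[] = [];
  for (const f of filters) {
    if (!ALLOWED_COLUMNS.has(f.column)) throw new Error(`column not allowed: ${f.column}`);
    values.push(f.op === "like" ? `%${f.value}%` : f.value);
    clauses.push(`${f.column} ${f.op === "like" ? "ILIKE" : "="} $${values.length}`);
  }
  const where = clauses.length ? ` WHERE ${clauses.join(" AND ")}` : "";
  return { text: `SELECT * FROM ${table}${where}`, values };
}
```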
Local Development and Secrets

- Provide
- Local dev should support a Docker Compose setup with a local Postgres instance and a local Key Vault emulator or secrets file (for dev only).
- Document how to authenticate locally with Azure AD (dev flow) or use a test JWT.
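For the Docker Compose piece, the dev database definition is usually only a few lines. This is a sketch with placeholder names and throwaway credentials, not the actual file Copilot generated:

```yaml
# Dev-only Postgres for local work; service name, port, and credentials
# here are placeholders and must never be reused outside local dev.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: dbrosetta
      POSTGRES_PASSWORD: localdev
      POSTGRES_DB: dbrosetta
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```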
Testing Strategy

- **Unit tests**: Jest with ts-jest for services, controllers, and utilities.
- **Integration tests**: Spin up a test Postgres (Docker) and run tests against it; use test fixtures and transactions to isolate tests.
- **Contract tests**: Use Pact or similar to validate the API contract with the WordPress client expectations.
- **E2E tests**: Use Supertest or Playwright for API flows.
- **CI**: Run lint, typecheck, unit tests, integration tests (with a test DB container), and generate coverage. Fail the pipeline on coverage drop below threshold (e.g., 80%).
- Provide a Postman collection and automated Newman run in CI for smoke tests post-deploy.
CI/CD and GitHub Actions Integration

- Add or update workflow steps:
- Use environment-specific secrets stored in GitHub Secrets and Azure Key Vault.
- Ensure the pipeline uses a least-privilege service principal and supports rollback.
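For flavor, the kind of job additions being described would look roughly like this in a workflow file. The job name is illustrative; only the folder path matches what the agent actually generated later:

```yaml
# Illustrative CI job shape only; not the actual dbRosetta pipeline.
jobs:
  api-ci:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: services/dbrosetta-api
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 18
      - run: npm ci
      - run: npm run lint
      - run: npm test -- --coverage
```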
Repository Integration and Developer Workflow

- Create a feature branch
- Include a PR template that requires: architecture diagram, security checklist, migration plan, and test results.
- Add CODEOWNERS for the API folder.
- Add pre-commit hooks (husky) for linting and formatting.
Observability and Monitoring

- Structured JSON logs with request IDs.
- Expose Prometheus metrics and integrate with Azure Monitor.
- Add alerts for error rate, latency, and failed deployments.
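“Structured JSON logs with request IDs” reduces to something like this at its core. A hand-rolled sketch for illustration; the real service would use Pino or Winston as the constraints above specify:

```typescript
// Hypothetical structured-log helper: every line is one JSON object
// carrying the request ID, so logs can be filtered per request.
import { randomUUID } from "crypto";

function makeRequestLogger(requestId: string = randomUUID()) {
  return {
    requestId,
    info(message: string, fields: Record<string, unknown> = {}): string {
      const line = JSON.stringify({
        level: "info",
        requestId,
        time: new Date().toISOString(),
        message,
        ...fields,
      });
      console.log(line);
      return line;
    },
  };
}
```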
Acceptance Criteria

- API endpoints implemented and documented in OpenAPI.
- All tests pass in CI and coverage meets threshold.
- Secrets are stored in Key Vault and accessed via Managed Identity.
- Deployment via GitHub Actions to staging succeeds and smoke tests pass.
- WordPress-compatible example client snippet provided.
- Security checklist completed and reviewed.
Developer Tasks and Step-by-Step Implementation Plan

1. Initialize service folder
2. Add Prisma schema and generate client; create initial migration from existing DB schema.
3. Implement DB layer with typed models and safe query helpers.
4. Implement authentication middleware for JWT/OIDC and RBAC enforcement.
5. Implement endpoints and input validation (zod or class-validator).
6. Add OpenAPI generation and serve Swagger UI on a secured route.
7. Add logging, metrics, and health endpoints.
8. Write unit tests for each module and integration tests for endpoints using a test DB container.
9. Add Dockerfile and Azure deployment manifests.
10. Update GitHub Actions workflows to include build/test/migrate/deploy steps and Newman/Postman smoke tests.
11. Document local dev, secrets, and deployment steps in README.
12. Create Postman collection and example WordPress AJAX client snippet.
13. Run security review and fix any issues; finalize PR.
PR Checklist

- Code compiles and lints clean.
- Tests pass and coverage threshold met.
- OpenAPI spec included and validated.
- Migration scripts included and tested.
- Secrets not in repo; Key Vault integration documented.
- CI pipeline updated and tested on staging.
- README and runbook updated.
Notes for Copilot

- Generate code in small, reviewable commits with clear messages.
- Prefer explicit typing and small, testable functions.
- When scaffolding, include TODOs for environment-specific values and manual approval gates for production migration/deploy steps.
- Provide example curl and WordPress fetch snippets for each endpoint.
End of prompt.
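Since the prompt asks Copilot for a WordPress fetch snippet per endpoint, here is the shape I’d expect such a snippet to take. Hedged heavily: the `/terms` path, paging parameters, and header names are my guesses, not the real API surface:

```typescript
// Hypothetical WordPress-side client: the endpoint, paging parameters, and
// header names are assumptions about what the generated API will expose.
type TermsRequest = { url: string; headers: Record<string, string> };

function buildTermsRequest(baseUrl: string, token: string, page = 1, pageSize = 25): TermsRequest {
  const url = new URL("/terms", baseUrl);
  url.searchParams.set("page", String(page));
  url.searchParams.set("pageSize", String(pageSize));
  return {
    url: url.toString(),
    headers: { Authorization: `Bearer ${token}`, Accept: "application/json" },
  };
}

async function fetchTerms(baseUrl: string, token: string, page = 1): Promise<unknown> {
  const { url, headers } = buildTermsRequest(baseUrl, token, page);
  const res = await fetch(url, { headers }); // global fetch, Node 18+ / browsers
  if (res.status === 429) throw new Error("rate limited; retry later"); // matches the 429 contract
  if (!res.ok) throw new Error(`API error ${res.status}`);
  return res.json();
}
```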
I’m not going to try to hide anything from you. I’m now outside my comfort zone. I’ve no idea if the approach we’re following now is correct or not. I’m going to be pretty dependent on the two AIs to get this stuff right. I’m now thinking through my own prompts and trying to be more thorough and more prescriptive. I read through the output and it sure seems right to me. Over to VSCode.
Building the API
I’m not entirely sure what I expected, but after I created the markdown file for the prompt (you can read through all the code, including the prompts I’m using, here on GitHub), I pasted it into the chat window in VS Code. GitHub Copilot didn’t go into a discussion. It started generating files, folders, and code. Lots of code. And it asked me a question:
What would you like to do next?
- Test what we have so far?
- Continue with implementing the API routes?
- Focus on a specific component (tests, CI/CD, documentation)?
- Something else?
It’s claiming to be 70% complete, after about two minutes. Honestly, I’d like to immediately finish up, just to see if it really is done that quickly, but I’m going to test what we have so I can understand where we are. Nice thing is, it told me what to do for that as well:
```
cd services/dbrosetta-api
npm install
npm run docker:up
npm run prisma:generate
npm run dev
# Visit http://localhost:3000/docs
```
```
npm warn deprecated inflight@1.0.6: This module is not supported, and leaks memory. Do not use it. Check out lru-cache if you want a good and tested way to coalesce async requests by a key value, which is much more comprehensive and powerful.
npm warn deprecated glob@7.2.3: Glob versions prior to v9 are no longer supported
npm warn deprecated glob@7.2.3: Glob versions prior to v9 are no longer supported
npm warn deprecated glob@7.2.3: Glob versions prior to v9 are no longer supported
npm warn deprecated glob@7.2.3: Glob versions prior to v9 are no longer supported

> dbrosetta-api@1.0.0 prepare
> husky install

husky - install command is DEPRECATED

added 928 packages, and audited 929 packages in 50s

169 packages are looking for funding
  run `npm fund` for details

2 moderate severity vulnerabilities

To address all issues (including breaking changes), run:
  npm audit fix --force

Run `npm audit` for details.
```
Wow! I mean, wow! It used all sorts of deprecated versions and functions. This is one of the kinds of problems I honestly expected. I took this output to CoPilot (I’m also going to have GitHub Copilot fix it, but I wanted to see what the AI thought). It thinks I’m dangerous:
VS Code didn’t intentionally pick deprecated code — the warnings come from transitive dependencies in the toolchain (packages you installed depend on older libraries like `glob@7.x`, which in turn pull in `inflight@1.0.6`). Those upstream packages haven’t been updated to newer versions yet, so `npm` prints deprecation warnings; you can either upgrade/override the offending dependency, wait for upstream fixes, or suppress/mitigate the risk with fixes and CI checks.
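If you want to act on the “upgrade/override” suggestion rather than wait, npm (8.3+) supports an `overrides` block in package.json that forces a transitive dependency to a newer version. A sketch, and it carries risk: pinning `glob` to a newer major can break packages that still expect the v7 API, so test before committing it:

```json
{
  "overrides": {
    "glob": "^9.3.5"
  }
}
```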
I’m not going to argue with it when the install I did was exactly what VS Code told me to do. However, this is all interesting stuff. This is why, straight up, I don’t think we have to worry about our jobs. We’re still very much necessary to this process. It’s just that this process is WAY faster than it would be otherwise.
Lots more back and forth between me and the agent. As we ran the tests, we discovered coding errors, configuration problems, and other issues. Most of them were generated by the LLM, but they were also fixed by the LLM.
As a side note: for the first four parts of this series, I merged the branches into main and deleted them. Now I’m realizing it would be helpful, if you’re interested, to be able to see each of the steps taken within VS Code. So, from here on, you’ll see the branches and each of the commits I made on them, as a way to see just how interactive this process is (as well as how problematic code is introduced and then eliminated through the use of the agent). I’ll be merging the code into main as before, but I won’t delete the previous branches going forward, so you can see the sausage-making process in its entirety.
In the middle of ALL this, I had to switch Azure Subscriptions (don’t ask).
However, with the testing scaffolding in place, the migration from one Subscription to another was shockingly simple, if a bit long-winded.
Conclusion
I keep getting reminded, over and over, that the AI is a particularly brilliant, but extremely junior, developer on my team. It generates code like lightning. Most of the time, it’s fully functional. However, sometimes silly mistakes are made. A constant, steady guide must be in place to arrive at a good destination. Although, to be fair, at one point I forgot to update the IP address in the Azure PostgreSQL firewall. When the tests failed, the AI was hellbent on massive troubleshooting to find out why. I had to force it to forget what it saw after I got the security situation worked out again. Otherwise, it was a real pleasure to work with. I mean, it did keep trying to bypass Flyway and use some internal tool instead, but I got my way in the end.
Here’s CoPilot’s assessment:
By completing this stage, we’ve established a secure and reliable bridge between dbRosetta’s PostgreSQL foundation and the broader ecosystem it will support. The API not only enforces best practices around authentication, input validation, and deployment hygiene, but also sets the stage for reproducible integrations with other platforms. With this layer in place, contributors and downstream applications can query the database confidently, knowing that access is protected and performance is monitored. This milestone transforms dbRosetta from a static repository of translations into a living service, ready to be consumed by the WordPress interface we’ll build next.
OK. A bit grandiose, but otherwise, yeah. Pretty much.
If you’re just landing here for the first time, the code, and the prompts I used to generate it, are all here on GitHub.
The rest of the series is here:
Part 1: Introducing the Concept of dbRosetta
Part 2: Defining the Project & Prompt Templates
Part 3: Creating a Database
Part 4: Automating a CI/CD Database Build