🚀 Meet Data Porter: My AI-Built Service for Heavy Data Exports
If you’re building for fintech or any system that requires users to download heavy data (transactions, reports, logs), you already know the struggle:
Queries time out when datasets grow into millions of rows.
Users keep downloading the same reports, wasting compute.
Teams lack audit trails of what was exported, when, and by whom.
I wanted a solution that was scalable, auditable, and secure. So I turned to AI.
In less than 2 hours, I built Data Porter — an open-source service that solves heavy data exports once and for all.
👉 Repo: noibilism/data-export-service
💡 Why Data Porter
The name says it all: Data Porter is the service that “carries” your data safely from your database to a CSV file in S3, ready for download.
It’s built with the realities of fintech in mind:
Async exports (no API timeouts).
Deduplication (don’t regenerate the same date-range report twice; see the sketch after this list).
Audit-ready metadata in its own database.
Secure S3 presigned URLs so files are accessible only when and where they should be.
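To make the deduplication point concrete, here is a minimal sketch of how a reuse key could be derived from the export parameters. The function and field names are illustrative assumptions, not the project’s actual implementation:

```python
# Hypothetical sketch of the deduplication idea: reuse an existing export
# when the same filters have already been processed.
import hashlib
import json

def dedup_key(account_id: str, start_date: str, end_date: str) -> str:
    """Build a stable key from the export parameters."""
    payload = json.dumps(
        {"account_id": account_id, "start": start_date, "end": end_date},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Before creating a new job, look up dedup_key(...) in the export DB;
# if a completed export with the same key exists, return its reference_id.
```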
⚙️ How It Works
POST /export → Initiate an export or reuse an existing one if available.
GET /export/{reference_id} → Check status and get a presigned S3 download link.
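For illustration, here is what a client flow against these two endpoints could look like. The port, request payload, and response fields (reference_id, status, download_url) are assumptions rather than the documented schema:

```python
# Illustrative client flow: start an export, poll for completion, get the link.
import time
import requests

BASE_URL = "http://localhost:8000"          # assumed local address
HEADERS = {"Authorization": "Bearer <jwt-token>"}

# 1. Kick off (or reuse) an export for a date range.
resp = requests.post(
    f"{BASE_URL}/export",
    json={"start_date": "2024-01-01", "end_date": "2024-01-31"},
    headers=HEADERS,
)
reference_id = resp.json()["reference_id"]

# 2. Poll until the job is done, then grab the presigned download link.
while True:
    status = requests.get(f"{BASE_URL}/export/{reference_id}", headers=HEADERS).json()
    if status["status"] == "completed":
        print("Download:", status["download_url"])
        break
    time.sleep(5)
```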
Under the hood:
Streams millions of rows from your transactions database.
Writes incrementally to CSV (no memory blowups).
Uploads to S3 with multipart handling for huge files.
Tracks every job in an Export Service DB.
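A rough sketch of that worker pipeline, assuming SQLAlchemy server-side streaming and boto3 (the table, column, and bucket names are placeholders, not the repo’s actual code):

```python
# Minimal sketch: stream rows from Postgres to CSV in chunks, then upload to S3.
import csv
import boto3
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@transactions-db/transactions")
s3 = boto3.client("s3")

def export_to_csv(path: str, start_date: str, end_date: str) -> None:
    """Stream rows server-side and append them to the CSV in 10k-row chunks."""
    query = text(
        "SELECT id, account_id, amount, created_at FROM transactions "
        "WHERE created_at BETWEEN :start AND :end"
    )
    with engine.connect().execution_options(stream_results=True) as conn, \
            open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "account_id", "amount", "created_at"])
        result = conn.execute(query, {"start": start_date, "end": end_date})
        for chunk in result.partitions(10_000):  # never holds the full dataset in memory
            writer.writerows(chunk)

def upload(path: str, bucket: str, key: str) -> None:
    """boto3's upload_file switches to multipart uploads automatically for large files."""
    s3.upload_file(path, bucket, key)
```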
🛡️ Built for Scale & Security
Handles 10M+ rows without breaking.
5M rows exported in under 15 minutes.
JWT-authenticated API endpoints.
Presigned S3 URLs that expire (default: 24h).
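For context, this is roughly how an expiring presigned link is produced with boto3; the bucket and key are placeholders, and the 86,400-second expiry mirrors the 24h default mentioned above:

```python
# Sketch: generate a time-limited download URL for an exported CSV in S3.
import boto3

s3 = boto3.client("s3")

def presigned_download_url(bucket: str, key: str, expires_in: int = 86_400) -> str:
    """Return a URL valid for `expires_in` seconds (default: 24 hours)."""
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires_in,
    )
```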
🛠️ Built With AI, Fast
This is the part I’m most excited about:
I didn’t spend weeks writing boilerplate or re-inventing patterns. I leveraged AI to design, scaffold, and document the entire service in under 2 hours.
That’s a powerful message for anyone building in fintech: the tooling to deliver production-grade infrastructure quickly is already here.
🚀 Getting Started
The repo includes a Docker Compose setup with API, Worker, Redis, and Postgres.
git clone https://github.com/noibilism/data-export-service.git
cd data-export-service
cp .env.example .env
docker-compose up --build
alembic upgrade head
And you’re ready to start exporting 🚀
🔮 What’s Next
I’m already thinking of where to take Data Porter next:
Supporting Excel, JSON, Parquet formats.
Advanced filters (account IDs, transaction types).
Automatic archival of old exports.
Role-based access controls.
📌 Final Thoughts
Fintech and data-heavy platforms need exports that don’t break under scale. Data Porter is my attempt to solve this once, cleanly — and the fact that it was built with AI in a couple of hours proves how fast we can move today.
Check it out here: GitHub – noibilism/data-export-service
I’d love to hear how you’d use it in your stack. 🚀

