End-to-End Cloud-Native Full-Stack Platform
Category: Cloud Architecture · Serverless · CI/CD
Stack: AWS Lambda · API Gateway · DynamoDB · S3 · CloudFront · GitHub Actions · Vue.js · iOS · Flutter
Background
The core challenge of this system was its data source diversity: multiple iOS apps, Android apps, and various Edge Devices (e.g. audio capture boxes) deployed in the field, all uploading audio files and sensor data to the cloud at high frequency.
This isn't a simple "API writes to database" problem. The real challenges were:
- High-concurrency ingestion: many devices, concurrent connections, the backend cannot go down
- Cross-platform data alignment: iOS, Android, and Edge Devices each have different data formats
- Sustainable operations: the system must update itself automatically — each deployment can't be a high-risk surgery
System Architecture
Device Side
My iOS development background meant I could define the data upload protocol with a realistic understanding of device constraints — battery budgets, reconnect behavior, background execution limits — which kept the protocol design grounded in real-world conditions.
- iOS App: Swift native development, audio recording and upload
- Android App: Flutter cross-platform implementation
- Edge Devices: lightweight custom communication protocol
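Cross-platform alignment comes down to mapping each platform's payload onto one canonical record before anything touches storage. A minimal sketch of that normalization step — all field names here are hypothetical, chosen only to illustrate the idea:

```python
from dataclasses import dataclass

@dataclass
class UploadRecord:
    """Canonical record shared by all platforms (field names assumed)."""
    device_id: str
    platform: str        # "ios" | "android" | "edge"
    captured_at_ms: int  # epoch milliseconds
    audio_key: str       # object key of the uploaded audio file

def normalize(raw: dict, platform: str) -> UploadRecord:
    """Map a platform-specific payload onto the canonical schema."""
    if platform == "ios":
        # Assume the iOS client sends epoch seconds as a float.
        ts_ms = int(raw["timestamp"] * 1000)
        return UploadRecord(raw["deviceID"], "ios", ts_ms, raw["audioKey"])
    if platform == "android":
        ts_ms = int(raw["ts_millis"])
        return UploadRecord(raw["device_id"], "android", ts_ms, raw["audio_key"])
    if platform == "edge":
        # Assume edge boxes use compact keys and whole-second timestamps.
        ts_ms = int(raw["t"]) * 1000
        return UploadRecord(raw["id"], "edge", ts_ms, raw["k"])
    raise ValueError(f"unknown platform: {platform}")
```

Doing this conversion at the edge of the backend means everything downstream — validation, storage, queries — only ever sees one schema.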
Cloud Layer: Pure Serverless Architecture
Why Serverless? Device connections are bursty and unpredictable. A traditional EC2 setup requires pre-configured capacity — under-provisioned at peak, wasteful at off-peak. Serverless scales automatically with actual request volume.
Architecture:
Device
↓
API Gateway (unified entry point)
↓
Lambda (validation + routing)
↓
DynamoDB (structured metadata) + S3 (audio files)
API Gateway is the single entry point for all devices — handles request validation and routing.
Lambda handles core logic:
- Validate incoming data format
- Write metadata to DynamoDB
- Upload audio files to S3
Lambda scales automatically with concurrent requests — bursts from many devices are absorbed smoothly.
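The three responsibilities above fit in one small handler. A sketch of what such a Lambda could look like behind an API Gateway proxy integration — the bucket name, table name, and payload fields are assumptions, not the production values:

```python
import base64
import json
import uuid

REQUIRED = {"device_id", "platform", "captured_at_ms", "audio_b64"}

def validate(payload: dict) -> list[str]:
    """Return a list of validation errors (empty means valid)."""
    errors = [f"missing field: {f}" for f in REQUIRED - payload.keys()]
    if not errors:
        try:
            base64.b64decode(payload["audio_b64"], validate=True)
        except Exception:
            errors.append("audio_b64 is not valid base64")
    return errors

def handler(event, context):
    """API Gateway proxy handler: validate, store metadata, store audio."""
    payload = json.loads(event.get("body") or "{}")
    errors = validate(payload)
    if errors:
        return {"statusCode": 400, "body": json.dumps({"errors": errors})}

    import boto3  # imported lazily so validate() stays testable offline
    audio_key = f"audio/{payload['device_id']}/{uuid.uuid4()}.wav"
    boto3.client("s3").put_object(
        Bucket="ingest-audio-bucket",  # assumed bucket name
        Key=audio_key,
        Body=base64.b64decode(payload["audio_b64"]),
    )
    boto3.resource("dynamodb").Table("ingest-metadata").put_item(  # assumed table
        Item={
            "device_id": payload["device_id"],
            "captured_at_ms": payload["captured_at_ms"],
            "platform": payload["platform"],
            "audio_key": audio_key,
        }
    )
    return {"statusCode": 200, "body": json.dumps({"audio_key": audio_key})}
```

In practice, large audio files would more likely go to S3 via presigned URLs (API Gateway caps request bodies at 10 MB); base64-in-body is shown here only to keep the validate-then-store flow in one place.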
CI/CD Full Automation
CI (Continuous Integration) — GitHub Actions
Every Git push automatically triggers:
- Unit tests + integration tests
- Docker image build and push to AWS ECR
- Test reports and lint checks
CD (Continuous Deployment) — Automated
- Backend Lambda: automatically deployed after tests pass, zero-downtime update
- Frontend Vue.js: automatically built and synced to S3; CloudFront cache invalidated
Once code is merged, the system handles testing, building, and deployment automatically — no manual intervention from operations.
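The frontend half of that pipeline is essentially "upload the build, then bust the CDN cache." A hedged sketch of those two steps with boto3 — the bucket and distribution are parameters here, not the real identifiers:

```python
import mimetypes
import time
from pathlib import Path

def guess_content_type(path: str) -> str:
    """Pick a Content-Type for an uploaded asset (defaults to binary)."""
    ctype, _ = mimetypes.guess_type(path)
    return ctype or "application/octet-stream"

def deploy_frontend(build_dir: str, bucket: str, distribution_id: str) -> None:
    """Upload a Vue build to S3, then invalidate the CloudFront cache."""
    import boto3  # lazy import so the helper above stays testable offline
    s3 = boto3.client("s3")
    for path in Path(build_dir).rglob("*"):
        if path.is_file():
            key = str(path.relative_to(build_dir))
            s3.upload_file(
                str(path), bucket, key,
                ExtraArgs={"ContentType": guess_content_type(key)},
            )
    boto3.client("cloudfront").create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/*"]},
            "CallerReference": str(time.time()),  # must be unique per request
        },
    )
```

Setting the Content-Type explicitly matters: without it, S3 serves every object as binary and the browser downloads `index.html` instead of rendering it. The blanket `/*` invalidation is the simple choice; hashed asset filenames would let a real pipeline invalidate only `index.html`.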
Key Decision: Pure Serverless
The problem with traditional approaches: EC2 + Load Balancer requires upfront capacity planning. Handling traffic bursts requires manual scaling or complex Auto Scaling configuration.
Serverless advantages:
- Pay-per-request billing — no cost when there's no traffic
- Automatic elastic scaling — no capacity planning required
- No server management — reduced operational complexity
Results
- System stably handles high-frequency uploads from multiple device types without service interruptions
- CI/CD automation significantly reduces deployment risk and operational burden
- Serverless cost model scales proportionally with traffic — no wasted fixed capacity
Takeaway
Architecture decisions should answer "which solution fits the shape of this problem," not "which solution is most advanced." Serverless isn't universal, but for bursty, multi-source device data ingestion, it's genuinely the right tool. Matching the architecture to the workload is how you control both cost and complexity.
