
Energy

A Scalable Serverless Data Platform for Maritime Electrification

Millions of raw sensor readings from shore-power charging stations sat locked in a time-series database. The client needed a cloud-native platform to transform that data into something their teams could actually use.

[Image: cargo ship at port connected to shore-power infrastructure, with data streams flowing to cloud services]

The challenge

The client operates a network of high-voltage shore-power stations that charge ships at ports. Each Station Power Unit (SPU) generates continuous sensor data — energy usage, session durations, equipment health — all stored as raw pings in a TimescaleDB instance. Millions of readings, but none of them in a shape that could drive dashboards, billing, or operational monitoring.

The initial infrastructure plans relied on rigid cross-account communication and complex VPC peering, which would have slowed deployment and inflated operational overhead. Meanwhile, the physical charging hardware was still being commissioned at ports around the world. That meant the data schemas and synchronisation flows were a moving target — changing as new stations came online and firmware evolved.

The client's front-end team needed stable API contracts to build monitoring dashboards, but the underlying data pipelines were still in flux. Every week of delay compounded: blocked front-end developers, deferred integration testing, and growing uncertainty about whether the architecture could handle production-scale data volumes across a global network of ports.

Our approach

We built an automated, incremental synchronisation workflow to bridge raw time-series data and business intelligence. Using AWS Lambda and SQS, we created a transformation layer that periodically retrieves data from TimescaleDB, processes it, and persists aggregated results into a refined PostgreSQL database on RDS. We shifted the focus from raw sensor pings to high-value domain entities: active charging sessions, SPU health status, historical peak usage, and individual vessel metrics.
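The core of that transformation step can be sketched as a pure aggregation function: collapse a batch of raw SPU pings into per-station session summaries before persisting them. This is a minimal illustration only — the field names and session logic here are assumptions, not the client's actual schema:

```python
# Sketch: aggregate raw SPU pings into charging-session summaries.
# Field names (spu_id, kwh) are illustrative, not the real schema.
from dataclasses import dataclass
from itertools import groupby


@dataclass
class Ping:
    spu_id: str
    timestamp: float  # epoch seconds
    kwh: float        # cumulative energy reading at this ping


def summarise_sessions(pings):
    """Collapse raw pings into one summary per SPU batch.

    Sorts by (spu_id, timestamp) so groupby sees each SPU as one
    contiguous run, then derives session bounds and energy delta.
    """
    ordered = sorted(pings, key=lambda p: (p.spu_id, p.timestamp))
    summaries = []
    for spu_id, group in groupby(ordered, key=lambda p: p.spu_id):
        readings = list(group)
        summaries.append({
            "spu_id": spu_id,
            "start": readings[0].timestamp,
            "end": readings[-1].timestamp,
            # Cumulative meter readings, so energy is last minus first.
            "energy_kwh": readings[-1].kwh - readings[0].kwh,
        })
    return summaries
```

In the real pipeline a Lambda function would run this kind of aggregation over each incremental batch pulled from TimescaleDB (triggered via SQS) and upsert the results into the refined RDS database.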

To unblock the front-end team without waiting for the data pipelines to stabilise, we implemented a contract-first API strategy. We formalised the API surface using OpenAPI specifications with comprehensive documentation. Front-end developers could build against mocked endpoints immediately, cutting the delivery timeline and eliminating dependency bottlenecks while the data layer matured behind the contract.
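The value of a contract-first approach is that both sides can verify payloads against the same agreed shape. A minimal sketch of that idea — the endpoint payload and field names below are hypothetical, not the client's actual OpenAPI contract:

```python
# Illustrative contract check. The mocked payload and required fields
# are assumptions for demonstration, not the real API contract.
MOCK_ACTIVE_SESSION = {
    "spu_id": "SPU-042",
    "vessel_id": "VSL-9001",
    "started_at": "2024-01-01T08:00:00Z",
    "energy_kwh": 512.5,
}

# Minimal contract: required field name -> expected type.
CONTRACT = {
    "spu_id": str,
    "vessel_id": str,
    "started_at": str,
    "energy_kwh": float,
}


def conforms(payload: dict, contract: dict = CONTRACT) -> bool:
    """True if the payload carries every contracted field with the right type."""
    return all(isinstance(payload.get(field), typ) for field, typ in contract.items())
```

Front-end developers can build against mocked payloads like `MOCK_ACTIVE_SESSION` from day one, while the back-end team's integration tests assert that real responses satisfy the same contract.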

We treated the entire infrastructure as code. We used Terraform to provision API Gateway, Lambda functions, RDS clusters, and S3-backed artefacts — making the platform reproducible across regions as new ports came online. For security, we implemented a production-ready authentication model using AWS Cognito with OAuth2 client credentials, ensuring secure machine-to-machine integration across all API endpoints.
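The shape of that Terraform layer can be sketched as follows — resource names, runtime, and file paths here are illustrative assumptions, not the client's actual configuration:

```hcl
# Illustrative sketch only; assumes an aws_iam_role "sync" defined elsewhere.
resource "aws_sqs_queue" "sync_jobs" {
  name                       = "spu-sync-jobs"
  visibility_timeout_seconds = 300
}

resource "aws_lambda_function" "sync" {
  function_name = "spu-sync"
  runtime       = "python3.12"
  handler       = "sync.handler"
  role          = aws_iam_role.sync.arn
  filename      = "build/sync.zip"
}

# Drain the queue into the transformation Lambda.
resource "aws_lambda_event_source_mapping" "sync_trigger" {
  event_source_arn = aws_sqs_queue.sync_jobs.arn
  function_name    = aws_lambda_function.sync.arn
}
```

Because every resource lives in configuration like this, standing up a new port region is a matter of re-applying the same modules with region-specific variables rather than manual provisioning.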

The result is a serverless platform that scales automatically with data volume, costs nothing when idle, and gives the client a clear path from scattered sensor readings to operational dashboards. The contract-first approach meant the front-end team shipped weeks earlier than they would have otherwise, and the Terraform-managed infrastructure is ready to deploy to new port regions without manual provisioning.

Millions

Sensor readings transformed

100%

Infrastructure as code

0

Idle server costs

Weeks

Front-end delivery accelerated

Tech stack

AWS API Gateway AWS Cognito AWS Lambda AWS S3 AWS SQS Contract-First API OpenAPI PostgreSQL (RDS) Pytest Python Terraform TimescaleDB

Need a data platform that scales with your hardware?

We build serverless backends that turn raw sensor data into operational intelligence.

Start a conversation