Data Engineering & Modern Data Stack

A modern data stack for delivering reliable, scalable analytics platforms to enterprise teams.

Core Competencies

Six pillars of delivery

A structured approach aligned with enterprise expectations for governance, reliability, and business impact.

Data Transformation

  • dbt & Dataform modeling
  • Medallion architecture
  • Documentation and lineage
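
As a quick illustration of the medallion approach, the sketch below builds a dbt project layer by layer; the bronze/silver/gold tags are an assumed project convention, not something fixed by dbt itself.

```python
import subprocess

# Build a dbt project layer by layer, assuming models are tagged
# "bronze", "silver", and "gold" (an illustrative project convention).
for layer in ("bronze", "silver", "gold"):
    subprocess.run(
        ["dbt", "build", "--select", f"tag:{layer}"],  # dbt build runs models and their tests for the selection
        check=True,  # stop the run if a layer fails
    )
```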

Data Ingestion

  • Airbyte, dlt, Fivetran
  • Incremental loading
  • API and SaaS integration

Python Automation

  • Custom connectors
  • Operational tooling
  • Data validation scripts
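
A small example of the kind of validation script this covers, checking daily row volume and a null rate on a BigQuery table; the table, column, and thresholds are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table, column, and thresholds; adapt per source.
TABLE = "my-project.raw_orders.orders"
MIN_ROWS = 1_000
MAX_NULL_RATE = 0.01

query = f"""
    SELECT
      COUNT(*) AS row_count,
      SAFE_DIVIDE(COUNTIF(customer_id IS NULL), COUNT(*)) AS null_rate
    FROM `{TABLE}`
    WHERE DATE(loaded_at) = CURRENT_DATE()
"""
row = next(iter(client.query(query).result()))
null_rate = row.null_rate or 0.0

failures = []
if row.row_count < MIN_ROWS:
    failures.append(f"row_count={row.row_count} below minimum {MIN_ROWS}")
if null_rate > MAX_NULL_RATE:
    failures.append(f"null_rate={null_rate:.2%} above {MAX_NULL_RATE:.0%}")

if failures:
    # Fail loudly so the orchestrator can retry or alert.
    raise RuntimeError("; ".join(failures))
print("validation passed")
```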

Orchestration

  • Airflow and Kestra
  • Retries and SLAs
  • Workflow observability
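
Below is a minimal Airflow 2.x DAG sketch showing the retry and SLA settings referred to above; the DAG name, schedule, and command are illustrative.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-platform",
    "retries": 3,                           # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "retry_exponential_backoff": True,
    "sla": timedelta(hours=2),              # SLA misses surface in the UI / sla_miss_callback
}

with DAG(
    dag_id="daily_dbt_build",               # illustrative DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",                   # "schedule" parameter, Airflow 2.4+
    catchup=False,
    default_args=default_args,
) as dag:
    BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --target prod",
    )
```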

Data Quality

  • dbt tests and monitors
  • Freshness and volume guards
  • Incident management
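
As a sketch of how freshness guards can feed incident management, the snippet below wraps dbt quality commands and posts failures to a webhook; the webhook URL and the `tag:critical` selection are assumptions.

```python
import subprocess

import requests

# Placeholder alerting endpoint (Slack, Teams, incident tool, ...).
ALERT_WEBHOOK = "https://hooks.example.com/data-quality"

def run_guard(cmd: list[str]) -> None:
    """Run a dbt quality check and raise an alert if it fails."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        requests.post(
            ALERT_WEBHOOK,
            json={"check": " ".join(cmd), "log": result.stdout[-2000:]},
            timeout=10,
        )
        raise SystemExit(f"quality guard failed: {' '.join(cmd)}")

run_guard(["dbt", "source", "freshness"])               # freshness guard on declared sources
run_guard(["dbt", "test", "--select", "tag:critical"])  # critical tests only (assumed tag)
```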

Cloud Warehousing

  • BigQuery and Snowflake
  • Cost optimization (FinOps)
  • Performance tuning
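
On the FinOps side, a common starting point is ranking the most expensive recent queries from BigQuery's INFORMATION_SCHEMA; the region below is an assumption.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Rank the last day's most expensive queries by bytes billed
# (assumes the project runs in the EU multi-region).
query = """
    SELECT
      user_email,
      ROUND(IFNULL(total_bytes_billed, 0) / POW(1024, 4), 3) AS tib_billed,
      LEFT(query, 120) AS query_snippet
    FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
      AND job_type = 'QUERY'
    ORDER BY tib_billed DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(f"{row.tib_billed:>8} TiB  {row.user_email}  {row.query_snippet}")
```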

Tools & Technologies

Modern data stack

A scalable toolkit for enterprise-grade data delivery.

dbt
BigQuery
Airbyte
Kestra
Python
dlt
Looker Studio
Airflow
Google Cloud Platform
Fivetran
Dataform
Snowflake
SQL
Power BI
Tableau
Streamlit
PostgreSQL
Docker
Git
Elementary
MongoDB
Redis
Metabase
Funnel.io
Cloud Functions
Cloud Run

Ready to Build Modern Data Platforms

For Data and Analytics Engineering engagements in Toulouse or remote.

Get in touch
Available for new engagements
Contact me