Camille Lebrun

Data Consultant | Data & Analytics Engineer | Bridging business strategy and technical implementation | Expertise in the Modern Data Stack and BI.

About me

I am Camille Lebrun. I combine a scientific and a data background, with a Master's degree in Biochemistry and a Master's in Data & AI, and I bring 3+ years of experience as a Data Analytics Engineer.

On client engagements, I combine technical versatility (Python, SQL, dbt, Airflow, Kestra, data visualization) with strong interpersonal skills and a real passion for the craft to understand each client's needs and challenges. Specialized in the Retail, Marketing, FMCG, and E-commerce sectors, I support clients effectively in projects that turn their data into value.

Curious and adaptable, I excel at selecting the right data tools and architecture while remaining a hands-on contributor. I focus on translating complex business needs into robust technical implementations that teams can then run autonomously.

Core strengths

dbt · Airbyte · SQL · Python · Data Modeling · Kestra · Airflow · CI/CD · BigQuery · Snowflake · Stitch · SAP · Observability · Stakeholder Enablement

Functional Domains

Retail · Marketing · FMCG · E-commerce · E-retail Media · Skin Care · Fashion & Apparel

Impact snapshot

Measurable outcomes

Concrete results delivered across complex data infrastructures, combining advanced automation with operational excellence.

Operational impact

+5h / week saved

Automated complex reporting via dbt, Looker Studio and Python APIs, replacing tedious spreadsheets with real-time strategic visibility.

System reliability

90% faster alerts

Proactive monitoring (Kestra, Slack) to detect and resolve data anomalies instantly, ensuring high-integrity pipelines for production.

End-to-end delivery

5+ custom data stacks

Architected and deployed full-stack platforms from the ground up, integrating Shopify, SAP, and marketing DSPs into unified, production-ready environments.


Data consultant blending technical depth with business insight across the Modern Data Stack. Specialized in architecting production-ready platforms for high-stakes business outcomes.

Data Engineering · Analytics Engineering · dbt · BigQuery · Data Quality · Pipeline Automation

Last updated: December 22, 2025


Work experience

Data Analytics Engineering Consultant
Hanalytics, Paris, France
Data consulting firm specializing in the Modern Data Stack

Jan 2025 - Dec 2025

Leading data architecture initiatives across retail, e-commerce, and media industries, delivering scalable solutions that drive strategic business decisions.

Key Responsibilities

  • Build end-to-end data platforms with multi-source data extraction (Funnel, Google Sheets, SFTP, Stitch, Airbyte, Catchr, Couchdrop)
  • Model raw data using medallion architecture and create specialized marts for business teams
  • Develop dashboards for C-level stakeholders, implement row-level security (RLS) for multi-store access, and build Slack alerting systems

Key Achievements

  • Architected scalable data solutions across 3+ industries, establishing Modern Data Stack best practices
  • Built Slack-based monitoring system reducing data quality issue detection time by 90%+
  • Migrated legacy infrastructure to Modern Data Stack, improving scalability and maintainability

Technical Stack:

dbt Core/Cloud · BigQuery · Kestra · Python · SQL · Looker Studio · Airbyte · GA4

Analytics Engineer Apprentice
Lucky Cart, Paris, France
French MarTech combining transactional data with e-retail media

Nov 2023 - Dec 2024

Data platform audit, automation, and refactoring to enhance data quality, streamline reporting processes, and establish best practices.

Key Responsibilities

  • Audit the quality of data received from retailers and implement data quality monitoring systems
  • Develop a Streamlit application for automated retailer reporting (campaign selection, report generation, and writing to Google Sheets via the Python API); see the sketch after this list
  • Refactor the dbt codebase end to end and establish best practices for the data team
  • Run root-cause analysis on incidents, build a diagnostic methodology, and resolve bugs with documented solutions
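
For illustration, a minimal sketch of such a Streamlit reporting flow, assuming hypothetical file and sheet names and a gspread service account; it is not the production application:

    import gspread
    import pandas as pd
    import streamlit as st

    st.title("Retailer campaign reporting")

    campaigns = pd.read_csv("campaigns.csv")  # hypothetical campaign export
    choice = st.selectbox("Campaign", sorted(campaigns["campaign_name"].unique()))

    if st.button("Generate report"):
        report = campaigns[campaigns["campaign_name"] == choice]
        gc = gspread.service_account()  # credentials from the default service-account file
        ws = gc.open("Retailer reports").worksheet("raw")  # hypothetical spreadsheet and tab
        ws.update(values=[report.columns.tolist()] + report.values.tolist())
        st.success(f"Report for {choice} written to Google Sheets")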

Key Achievements

  • Refactored dbt codebase improving pipeline performance and reducing query costs
  • Built Streamlit application automating manual reporting, saving 5+ hours/week

Technical Stack:

dbt · BigQuery · Python · Streamlit · Google Sheets API

Analytics Engineer Intern
My Job Glasses, Paris, France
Career discovery platform connecting job seekers with mentors

Jun 2023 - Nov 2023

Refactoring and creation of multi-sector data marts tailored to diverse client needs across public, private, and educational sectors.

Key Responsibilities

  • Gather and analyze requirements from client teams (public sector including military, private sector, and educational institutions)
  • Refactor and create new data marts adapted to the specificities of each sector
  • Develop and deploy Metabase dashboards for business teams
  • Train business teams for autonomous dashboard creation on Metabase

Key Achievements

  • Developed dynamic Metabase dashboards enabling data-driven decisions across business teams
  • Trained business teams to build their own Metabase dashboards, strengthening self-service analytics

Technical Stack:

dbt · BigQuery · Metabase · SQL

Data Analyst Intern
Click & Boat, Paris, France
Peer-to-peer yacht charter platform, leader in boat rentals

May 2022 - Oct 2022

Implementation of specialized datasets for marketing and press teams, delivering in-depth analyses to drive marketing and commercial activities.

Key Responsibilities

  • Gather and analyze business requirements from marketing and sales teams
  • Create specialized datasets for marketing and press teams
  • Implement in-depth analyses for marketing and commercial activity management
  • Apply modeling and visualization best practices, develop and improve Tableau dashboards

Key Achievements

  • Implemented user segmentation models enabling targeted CRM engagement for 100K+ users
  • Developed Tableau dashboards streamlining marketing analytics and sales performance tracking

Technical Stack:

dbt · Snowflake · Tableau · SQL

Skills & Expertise

Specialized in data transformation, integration, and pipeline automation with the Modern Data Stack.

Data Transformation

Expertise in dbt for modular data modeling, incremental pipelines, custom macros, and integrated testing. Focus on scalable, analytics-ready datasets and automated documentation within cloud data warehouses.

Data Ingestion & Connectors

Hands-on experience building ELT pipelines with Airbyte and dlt, including incremental loading and seamless API/database integration for efficient data ingestion workflows.
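
As a flavor of what incremental loading with dlt looks like, a minimal sketch against a hypothetical orders API; the endpoint, cursor column, and keys are assumptions, not a real client setup:

    import dlt
    import requests

    @dlt.resource(table_name="orders", write_disposition="merge", primary_key="id")
    def orders(updated_at=dlt.sources.incremental("updated_at", initial_value="2024-01-01T00:00:00Z")):
        # Hypothetical endpoint: fetch only rows newer than the stored cursor.
        resp = requests.get(
            "https://api.example-shop.test/orders",
            params={"updated_since": updated_at.last_value},
            timeout=30,
        )
        resp.raise_for_status()
        yield resp.json()  # each record carries an "updated_at" field for the cursor

    pipeline = dlt.pipeline(pipeline_name="shop_elt", destination="bigquery", dataset_name="raw_shop")
    print(pipeline.run(orders))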

Python Automation & Data Processing

Development of custom data connectors and robust data pipelines in Python. Expertise in building scalable solutions for data extraction, transformation, and processing using pandas, Polars, dlt, and Streamlit for rapid prototyping of data applications.
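
As a small example of the kind of processing this covers, a Polars aggregation sketch; the file name and columns are hypothetical:

    import polars as pl

    # Hypothetical export with columns created_at (ISO timestamp) and net_sales.
    orders = pl.read_csv("orders.csv")

    daily_sales = (
        orders
        .with_columns(pl.col("created_at").str.to_datetime())
        .group_by(pl.col("created_at").dt.date().alias("day"))
        .agg(pl.col("net_sales").sum().alias("net_sales"))
        .sort("day")
    )
    print(daily_sales)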

Pipeline Orchestration

Automation of data workflows using orchestration tools such as Airflow and Kestra, with advanced error handling, retry logic, and scheduling for reliable production pipelines.
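
For illustration, a minimal Airflow sketch of the scheduling and retry pattern described above, assuming Airflow 2.x; the DAG id, schedule, and dbt commands are placeholders:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {
        "retries": 3,                          # retry transient failures
        "retry_delay": timedelta(minutes=5),
        "retry_exponential_backoff": True,     # back off between attempts
    }

    with DAG(
        dag_id="daily_dbt_run",                # illustrative DAG id
        start_date=datetime(2025, 1, 1),
        schedule="0 6 * * *",                  # every day at 06:00
        catchup=False,
        default_args=default_args,
    ) as dag:
        dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run")
        dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test")
        dbt_run >> dbt_test                    # only test once the run has succeeded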

Data Quality & Monitoring

Implementation of dbt tests, validation frameworks, and automated monitoring to ensure data freshness, accuracy, and reliability throughout the pipeline.
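
Outside of dbt's own freshness checks, a guard can be a few lines of Python against the warehouse. A minimal sketch, assuming a hypothetical project, dataset, and loaded_at column:

    from google.cloud import bigquery

    client = bigquery.Client()
    query = """
        SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(loaded_at), HOUR) AS hours_stale
        FROM `my-project.raw_shop.orders`  -- hypothetical table
    """
    row = next(iter(client.query(query).result()))
    if row.hours_stale is None or row.hours_stale > 24:
        # Escalate instead of letting downstream marts build on stale data.
        raise RuntimeError(f"orders is stale ({row.hours_stale}h since last load)")
    print(f"orders is fresh ({row.hours_stale}h old)")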

Cloud Data Warehousing

Practical experience with BigQuery and Snowflake, including cost optimization, performance tuning, and best practices for scalable cloud data storage.

Licenses & Certifications

Partner Certification Program (Level 1)
Funnel
Issued Apr 2025

Validated expertise in Funnel's marketing analytics platform and data integration capabilities.

Education

Master's degree, Data/AI
HETIC, Montreuil
Leading web and digital technology school

Oct 2022 - Nov 2024

Comprehensive program covering modern data engineering, analytics, and AI technologies, providing the foundation for building scalable data solutions.

Core Competencies:

  • Data Engineering & Analytics: Advanced Python programming for data pipelines, ETL/ELT workflows, and automation. Expertise in SQL and database optimization for data modeling and querying.
  • Data Visualization: Proficient in creating interactive dashboards and visualizations using Power BI, Tableau, and Looker Studio to enable data-driven decision-making.
  • Machine Learning & AI: Hands-on experience with neural networks, NLP, and AI applications for predictive analytics and intelligent data solutions.
  • Business Analytics: Applying data-driven insights to enhance marketing strategies and support strategic business decisions.

Master's degree, Structural and Functional Biochemistry
Paul Sabatier University Toulouse III, Toulouse
Leading French university in science, health, and engineering

Sep 2018 - 2020

Advanced training in biomolecular analysis and computational modeling. This scientific background provided a strong analytical foundation that facilitated the transition to data engineering and analytics.

FAQs

Frequently Asked Questions

Expert insights on Data Analytics Engineering and the Modern Data Stack.

How do you orchestrate a data platform ready for business teams?

Source audit, medallion modeling, dbt and BigQuery pipelines, then fast delivery to Looker Studio or Streamlit for day-to-day consumption.

How do you guarantee the reliability of pipelines and dashboards?

Automated dbt tests, Slack/Cloud Monitoring alerts, anomaly detection via custom SQL, and volume/freshness guards on every critical table.

Can you optimize an existing stack without starting from scratch?

Yes. I start with a technical audit, identify quick wins (dbt refactoring, partitioning, cost controls), and document an actionable roadmap; a sketch of one such quick win follows.
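
As one concrete example of such a quick win, a cost audit can start from BigQuery's own job metadata. A minimal sketch, assuming the project runs in the EU region:

    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
        SELECT user_email,
               ROUND(SUM(total_bytes_billed) / POW(1024, 4), 2) AS tib_billed
        FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
        WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
        GROUP BY user_email
        ORDER BY tib_billed DESC
        LIMIT 10
    """
    # Surface the heaviest spenders over the last week as optimization targets.
    for row in client.query(sql).result():
        print(f"{row.user_email}: {row.tib_billed} TiB billed over the last 7 days")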

Which KPIs prove the impact of your data missions?

Incident detection time (-90%), hours saved in reporting (+5h/week), data reliability (green tests) and dashboard adoption metrics.

How do you collaborate with product, marketing, or ops teams?

Scoping workshops, accessible documentation, governance rituals, and ongoing support keep metrics and alerts transparent.

Which tools do you prioritize for orchestration, governance, and reliability?

Kestra/Airflow for orchestration, dbt for modeling, Airbyte/dlt for ingestion, Looker Studio for KPIs, plus Python scripts for validation and monitoring.

Featured Projects

Real-world data solutions delivering measurable business impact across multiple industries.

Modern Data Foundation & Orchestration

Technical Stack:

Kestra · Airbyte · dbt · BigQuery · Funnel

Challenge

Consolidate multiple disparate data sources into a single source of truth. The problems to solve: time-consuming manual reporting and a lack of confidence in the numbers caused by fragmented sources.

Solution

Medallion architecture (bronze, silver, gold) implemented with Kestra for orchestration. Data ingestion with Airbyte for seamless source integration. Modular dbt models with a marts layer for analytics exposure. Kestra workflows with automated retry mechanisms and error handling. A data quality framework with automated testing ensuring data accuracy. Integration with Funnel for marketing analytics data consolidation.

Results

More than 5 hours saved per week on reporting through automation. Full confidence restored in the numbers thanks to a reliable data architecture. Comprehensive monitoring of key metrics: net sales, bundle performance, upsell rates, and customer acquisition costs.

Data Monitoring
Slack-Based Analytics Alert System

Technical Stack:

Slack API · Kestra · BigQuery · Python

Challenge

Data quality issues and dbt pipeline failures were detected too late, impacting data reliability. Business teams lacked real-time visibility into key business KPIs. Traditional monitoring required manual dashboard checks and context switching, delaying response times.

Solution

Developed a comprehensive Slack monitoring system integrating Slack API with dbt and BigQuery. Implemented automated Slack alerts for dbt test failures and pipeline errors (technical monitoring). Built scheduled business KPI reporting with configurable thresholds and automated daily metrics delivery to Slack channels. Created user-friendly Slack commands for immediate data access and interactive dashboards within Slack.
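
The technical-monitoring half of such a system can be as small as parsing dbt's run_results.json artifact and posting to an incoming webhook. A minimal sketch; the webhook URL is a placeholder:

    import json
    import requests

    SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/..."  # placeholder

    # dbt writes this artifact after every invocation (run, test, build).
    with open("target/run_results.json") as f:
        results = json.load(f)["results"]

    failures = [r for r in results if r["status"] in ("error", "fail")]
    if failures:
        detail = "\n".join(f"• {r['unique_id']}: {r['status']}" for r in failures)
        requests.post(
            SLACK_WEBHOOK_URL,
            json={"text": f":rotating_light: {len(failures)} dbt failure(s)\n{detail}"},
            timeout=10,
        )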

Results

Significantly reduced the time to detect dbt failures and data quality issues through instant Slack notifications. Enabled scheduled business KPI delivery directly in team communication channels, ensuring consistent visibility. Improved cross-team collaboration through centralized data communication. Increased data quality awareness and sped up incident response across the organization.

Automated Tech Watch System

Technical Stack:

Python · dbt · dlt · BigQuery · Cloud Functions · Cloud Run · Terraform

Challenge

The manual tech watch process was time-consuming and inefficient. It needed to aggregate content from multiple sources (Medium emails, RSS feeds) and turn it into actionable insights, which called for an automated solution leveraging Modern Data Stack technologies.

Solution

Built an end-to-end automated system:

  • Extract Medium articles from email with Python and store them as JSON
  • Process RSS feeds (XML) through automated Python scripts
  • Orchestrate transformations with dbt
  • Ingest data into BigQuery using dlt
  • Deploy with Cloud Functions, Cloud Build, and Cloud Run
  • Manage infrastructure with Terraform
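
The RSS half of the pipeline is compact enough to sketch with feedparser and dlt; the feed URL is an example, not one of the actual sources:

    import dlt
    import feedparser

    FEEDS = ["https://blog.example.test/rss.xml"]  # example feed list

    @dlt.resource(table_name="articles", write_disposition="merge", primary_key="link")
    def rss_articles():
        for url in FEEDS:
            feed = feedparser.parse(url)
            for entry in feed.entries:
                yield {
                    "title": entry.get("title"),
                    "link": entry.get("link"),
                    "published": entry.get("published"),
                    "source": url,
                }

    pipeline = dlt.pipeline(pipeline_name="tech_watch", destination="bigquery", dataset_name="raw_watch")
    pipeline.run(rss_articles)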

Impact

Fully automated tech watch process eliminating manual work. Centralized data collection enabling comprehensive content analysis. Foundation for future AI-powered features: LLM summarization and AI agent for intelligent content discovery. Personal project demonstrating end-to-end data engineering capabilities.
