Data Consultant | Data & Analytics Engineer | Bridging business strategy and technical implementation | Expertise in Modern Data Stack and BI.
I am Camille Lebrun. I bring a dual scientific and data background: a Master's degree in Biochemistry, a Master's in Data & AI, and 3+ years of experience as a Data Analytics Engineer.
On my consulting engagements, I combine technical versatility (Python, SQL, dbt, Airflow, Kestra, data visualization) and a genuine passion for the craft with the interpersonal skills needed to understand each client's needs and challenges. Specialized in the Retail, Marketing, FMCG, and E-commerce sectors, I help clients get real, lasting value from their data.
Curious and adaptable, I excel at selecting the right data tools and architecture while remaining a hands-on contributor. I focus on translating complex business needs into robust technical implementations that teams can run autonomously.
Core strengths
Functional Domains
Impact snapshot
Concrete results delivered across complex data infrastructures, combining advanced automation with operational excellence.
Operational impact
+5h / week saved
Automated complex reporting via dbt, Looker Studio and Python APIs, replacing tedious spreadsheets with real-time strategic visibility.
System reliability
90% faster alerts
Proactive monitoring (Kestra, Slack) to detect and resolve data anomalies instantly, ensuring high-integrity pipelines for production.
End-to-end delivery
5+ custom data stacks
Architected and deployed full-stack platforms from the ground up, integrating Shopify, SAP, and marketing DSPs into unified, production-ready environments.
Camille Lebrun
Data Consultant | Data & Analytics Engineer
Data consultant blending technical depth with business insight across the Modern Data Stack. Specialized in architecting production-ready platforms for high-stakes business outcomes.
Last updated: December 22, 2025
Stay in touch
Contact me
Data Analytics Engineering Consultant
Hanalytics, Paris, France
Data consulting firm specializing in Modern Data Stack
Jan 2025 - Dec 2025
Leading data architecture initiatives across retail, e-commerce, and media industries, delivering scalable solutions that drive strategic business decisions.
Analytics Engineer Apprentice
Lucky Cart, Paris, France
French MarTech combining transactional data with e-retail media
Nov 2023 - Dec 2024
Data platform audit, automation, and refactoring to enhance data quality, streamline reporting processes, and establish best practices.
Analytics Engineer Intern
My Job Glasses, Paris, France
Career discovery platform connecting job seekers with mentors
Jun 2023 - Nov 2023
Refactoring and creation of multi-sector data marts tailored to diverse client needs across public, private, and educational sectors.
Data Analyst Intern
Click & Boat, Paris, France
Peer-to-peer yacht charter platform, leader in boat rentals
May 2022 - Oct 2022
Implementation of specialized datasets for marketing and press teams, delivering in-depth analyses to drive marketing and commercial activities.
Specialized in data transformation, integration, and pipeline automation with the Modern Data Stack.
Expertise in dbt for modular data modeling, incremental pipelines, custom macros, and integrated testing. Focus on scalable, analytics-ready datasets and automated documentation within cloud data warehouses.
Hands-on experience building ELT pipelines with Airbyte and dlt, including incremental loading and seamless API/database integration for efficient data ingestion workflows.
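A minimal sketch of that incremental-loading pattern with dlt, assuming a hypothetical REST endpoint, cursor field, and dataset names (none of these refer to a specific client setup):

    # Incremental dlt pipeline sketch: REST API -> BigQuery.
    # The endpoint URL, "updated_at" cursor field, and dataset name are hypothetical.
    import dlt
    import requests

    @dlt.resource(write_disposition="merge", primary_key="id")
    def orders(updated_at=dlt.sources.incremental("updated_at", initial_value="2024-01-01")):
        # Only fetch records newer than the last stored cursor value.
        resp = requests.get(
            "https://api.example.com/orders",
            params={"since": updated_at.last_value},
            timeout=30,
        )
        resp.raise_for_status()
        yield resp.json()["results"]

    pipeline = dlt.pipeline(
        pipeline_name="shop_ingestion",
        destination="bigquery",
        dataset_name="raw_shop",
    )

    if __name__ == "__main__":
        print(pipeline.run(orders))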
Development of custom data connectors and robust data pipelines in Python. Expertise in building scalable solutions for data extraction, transformation, and processing with pandas, Polars, and dlt, plus Streamlit for rapid prototyping of data applications.
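For illustration, a small Polars transformation in that spirit; the file path, columns, and metrics are invented for the example:

    # Illustrative Polars aggregation; the CSV path and column names are made up.
    import polars as pl

    orders = pl.scan_csv("exports/orders.csv")  # lazy scan: nothing is read yet

    daily_sales = (
        orders
        .filter(pl.col("status") == "completed")
        .with_columns(pl.col("ordered_at").str.to_datetime().dt.date().alias("order_date"))
        .group_by("order_date")
        .agg(
            pl.col("amount").sum().alias("net_sales"),
            pl.len().alias("order_count"),
        )
        .sort("order_date")
        .collect()  # executes the optimized lazy query plan
    )
    print(daily_sales.head())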
Automation of data workflows using orchestration tools such as Airflow and Kestra, with advanced error handling, retry logic, and scheduling for reliable production pipelines.
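A minimal Airflow sketch of that scheduling-plus-retries pattern; the DAG name, schedule, and task bodies are placeholders rather than a real pipeline:

    # Minimal Airflow DAG showing scheduling and retry logic.
    from datetime import datetime, timedelta

    from airflow.decorators import dag, task

    @dag(
        schedule="0 6 * * *",  # run daily at 06:00
        start_date=datetime(2025, 1, 1),
        catchup=False,
        default_args={
            "retries": 3,                          # retry transient failures...
            "retry_delay": timedelta(minutes=5),   # ...after a short delay
        },
    )
    def daily_ingestion():
        @task
        def extract() -> list[dict]:
            # Placeholder: pull rows from a source API or database.
            return [{"id": 1, "amount": 42.0}]

        @task
        def load(rows: list[dict]) -> None:
            # Placeholder: write rows to the warehouse.
            print(f"loaded {len(rows)} rows")

        load(extract())

    daily_ingestion()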
Implementation of dbt tests, validation frameworks, and automated monitoring to ensure data freshness, accuracy, and reliability throughout the pipeline.
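One way such a freshness and volume guard can look in Python; the project, table, and thresholds below are hypothetical:

    # Freshness and volume guard on a critical table (all names are hypothetical).
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")

    query = """
        SELECT
          TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(loaded_at), HOUR) AS hours_stale,
          COUNT(*) AS row_count
        FROM `my-project.analytics.fct_orders`
    """
    row = next(iter(client.query(query).result()))

    if row.hours_stale > 24:
        raise RuntimeError(f"fct_orders is stale: last load {row.hours_stale}h ago")
    if row.row_count == 0:
        raise RuntimeError("fct_orders is empty: upstream load likely failed")
    print("freshness and volume checks passed")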
Practical experience with BigQuery and Snowflake, including cost optimization, performance tuning, and best practices for scalable cloud data storage.
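As one concrete cost-control habit, a BigQuery dry run reports how many bytes a query would scan before it is actually executed; the table name here is hypothetical:

    # BigQuery dry run: estimate scanned bytes without running the query.
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

    sql = """
        SELECT order_id, net_amount
        FROM `my-project.analytics.fct_orders`
        WHERE order_date >= '2025-01-01'  -- prunes partitions when the table
                                          -- is partitioned on order_date
    """
    job = client.query(sql, job_config=job_config)
    print(f"query would scan {job.total_bytes_processed / 1024**3:.2f} GiB")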
Partner Certification Program (Level 1)
Funnel
Issued Apr 2025
Validated expertise in Funnel's marketing analytics platform and data integration capabilities.
Master's degree, Data/AI
HETIC, Montreuil
Leading web and digital technology school
Oct 2022 - Nov 2024
Comprehensive program covering modern data engineering, analytics, and AI technologies, providing the foundation for building scalable data solutions.
Master's degree, Structural and Functional Biochemistry
Paul Sabatier University Toulouse III, Toulouse
Leading French university in science, health, and engineering
Sep 2018 - 2020
Advanced training in biomolecular analysis and computational modeling. This scientific background provided a strong analytical foundation that facilitated the transition to data engineering and analytics.
FAQs
Expert insights on Data Analytics Engineering and the Modern Data Stack.
Source audit, medallion modeling, dbt and BigQuery pipelines, then fast delivery to Looker Studio or Streamlit for day-to-day consumption.
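As a minimal sketch of that last-mile delivery in Streamlit (the project and table names are placeholders, not a client setup):

    # Minimal Streamlit consumption layer over a gold-layer table.
    import streamlit as st
    from google.cloud import bigquery

    st.title("Daily sales")

    @st.cache_data(ttl=3600)  # refresh at most once an hour
    def load_sales():
        client = bigquery.Client()
        return client.query(
            "SELECT order_date, net_sales "
            "FROM `my-project.gold.daily_sales` ORDER BY order_date"
        ).to_dataframe()

    df = load_sales()
    st.line_chart(df, x="order_date", y="net_sales")
    st.dataframe(df)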
Automated dbt tests, Slack/Cloud Monitoring alerts, anomaly detection via custom SQL, and volume/freshness guards on every critical table.
Yes, I launch with a technical audit, identify quick wins (dbt refactoring, partitioning, cost controls) and document an actionable roadmap.
Incident detection time (-90%), hours saved in reporting (+5h/week), data reliability (green tests) and dashboard adoption metrics.
Intent workshops, accessible documentation, governance rituals, and ongoing support keep metrics and alerts transparent.
Kestra/Airflow for orchestration, dbt for modeling, Airbyte/dlt for ingestion, Looker Studio for KPIs, plus Python scripts for validation and monitoring.
Real-world data solutions delivering measurable business impact across multiple industries.
Consolidation of multiple disparate data sources into a single source of truth. Problem solved: time-consuming manual reporting and lack of confidence in numbers due to fragmented data sources.
More than 5 hours saved per week for reporting through automation. 100% confidence restored in numbers thanks to a reliable data architecture. Comprehensive monitoring of key metrics: net sales, bundle performance, upsell rates, and customer acquisition costs.
Medallion architecture (bronze, silver, gold) implemented with Kestra for orchestration. Data ingestion with Airbyte for seamless source integration. Modular dbt models with marts layer for analytics exposure. Kestra workflows with automated retry mechanisms and error handling. Data quality framework with automated testing ensuring data accuracy. Integration with Funnel for marketing analytics data consolidation.
Data quality issues and dbt pipeline failures were detected too late, impacting data reliability. Business teams lacked real-time visibility into key business KPIs. Traditional monitoring required manual dashboard checks and context switching, delaying response times.
Developed a comprehensive Slack monitoring system integrating Slack API with dbt and BigQuery. Implemented automated Slack alerts for dbt test failures and pipeline errors (technical monitoring). Built scheduled business KPI reporting with configurable thresholds and automated daily metrics delivery to Slack channels. Created user-friendly Slack commands for immediate data access and interactive dashboards within Slack.
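In simplified form, the dbt-failure alerting described above can be wired by parsing dbt's run_results.json artifact and posting to Slack; the channel name and token handling are illustrative, not the production implementation:

    # Parse dbt's run_results.json artifact and post failures to Slack.
    import json
    import os

    from slack_sdk import WebClient

    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

    with open("target/run_results.json") as f:
        results = json.load(f)["results"]

    failures = [r["unique_id"] for r in results if r["status"] in ("fail", "error")]

    if failures:
        client.chat_postMessage(
            channel="#data-alerts",  # illustrative channel name
            text=":rotating_light: dbt failures:\n" + "\n".join(failures),
        )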
Reduced time to detect dbt failures and data quality issues significantly through instant Slack notifications. Enabled scheduled business KPI delivery directly in team communication channels, ensuring consistent visibility. Improved cross-team collaboration through centralized data communication. Increased data quality awareness and faster incident response times across the organization.
Manual tech watch process was time-consuming and inefficient. Needed to aggregate content from multiple sources (Medium emails, RSS feeds) and transform it into actionable insights. Required an automated solution leveraging Modern Data Stack technologies.
Built an end-to-end automated system: Medium articles extracted from emails with Python and stored as JSON; RSS feeds (XML) processed through automated Python scripts; orchestration and transformations with dbt; data ingestion from GCP to BigQuery using dlt; deployment with Cloud Functions, Cloud Build, and Cloud Run; infrastructure managed with Terraform.
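The RSS step, as a standard-library sketch (the feed URL and output path are illustrative):

    # Fetch an RSS feed and normalize entries to JSON with the standard library.
    import json
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://example.com/data-blog/rss.xml"  # illustrative feed

    with urllib.request.urlopen(FEED_URL, timeout=30) as resp:
        root = ET.fromstring(resp.read())

    articles = [
        {
            "title": item.findtext("title"),
            "link": item.findtext("link"),
            "published": item.findtext("pubDate"),
        }
        for item in root.iter("item")
    ]

    with open("articles.json", "w") as f:
        json.dump(articles, f, indent=2)
    print(f"captured {len(articles)} articles")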
Fully automated tech watch process eliminating manual work. Centralized data collection enabling comprehensive content analysis. Foundation for future AI-powered features: LLM summarization and AI agent for intelligent content discovery. Personal project demonstrating end-to-end data engineering capabilities.