Available for new missions

Camille Lebrun

Data Consultant & Analytics Engineer | Data Engineering & Analytics Engineering Missions | Available in Toulouse 🧱 & Remote | Expert Modern Data Stack

3+

Years crafting modern data stacks

+8h

Weekly reporting time saved per project

5+

Custom data stacks deployed end-to-end

About

I am Camille Lebrun, a Data Consultant and Analytics Engineer based in Toulouse, France. With a dual scientific and data background (an MSc in Biochemistry and a Master's in Data & AI), I have 3+ years of experience as an Analytics Engineer and Data Consultant. Specializing in Data Engineering and Analytics Engineering missions, I work with clients remotely across France and internationally.

I combine technical versatility across the Modern Data Stack (Python, SQL, dbt, BigQuery, Airflow, Kestra, data visualization) with genuine passion for the craft to understand my clients' needs and challenges, and to support them effectively in turning their data into business value.

Curious and adaptable, I excel at selecting the right data tools and architecture while remaining a hands-on contributor. I focus on translating complex business needs into robust technical implementations, working autonomously within a team.

Functional Domains

Retail · Marketing · FMCG · E-commerce · E-retail · Media · Skin Care · Fashion & Apparel

Projects

Recent Missions

End-to-end data projects with measurable business impact

Tech Stack

Modern Data Stack Expertise

dbt
BigQuery
Airflow
Kestra
Python
Airbyte
dlt
Looker Studio

FAQs

Frequently Asked Questions

Enterprise-grade delivery practices and tooling choices.

How do you orchestrate a data platform ready for business teams?

I start with a source audit, model the data in medallion layers, build pipelines with dbt and BigQuery, then deliver quickly to Looker Studio or Streamlit for day-to-day business consumption.
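As an illustration, here is a minimal sketch of that kind of orchestration, using Airflow to chain dbt runs and tests. The DAG id, schedule, and project path are hypothetical placeholders, not taken from a real mission:

```python
# Minimal sketch: an Airflow DAG that runs dbt models, then validates them
# before anything reaches a dashboard. Names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_analytics",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",              # every day at 06:00
    catchup=False,
) as dag:
    # Build staging -> intermediate -> mart models (medallion layers).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",  # hypothetical path
    )
    # Run the test suite so broken models never ship silently.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt",
    )
    dbt_run >> dbt_test
```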

How do you guarantee the reliability of pipelines and dashboards?

I ensure reliability through automated dbt tests, Slack and Cloud Monitoring alerts, anomaly detection via custom SQL, and volume/freshness guards on every critical table.
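For example, a minimal sketch of a volume/freshness guard in Python against BigQuery; the project, dataset, table, and thresholds below are hypothetical, not client figures:

```python
# Minimal sketch of a volume/freshness guard on a critical table.
# `my-project.analytics.orders` and the 24h threshold are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT
      COUNT(*) AS row_count,
      TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(loaded_at), HOUR) AS hours_stale
    FROM `my-project.analytics.orders`   -- hypothetical table
"""
row = next(iter(client.query(sql).result()))

# Fail loudly (and alert, e.g. via a Slack webhook) instead of letting a
# stale or empty table feed the dashboards.
if row.row_count == 0 or row.hours_stale > 24:
    raise RuntimeError(
        f"orders table unhealthy: {row.row_count} rows, "
        f"{row.hours_stale}h since last load"
    )
```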

Can you optimize an existing stack without starting from scratch?

Yes. I start with a technical audit, identify quick wins (dbt refactoring, partitioning, cost controls), and document an actionable roadmap for optimizing your Modern Data Stack.
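One concrete quick win, sketched below with a hypothetical table: a BigQuery dry run that estimates scanned bytes, which makes the payoff of partitioning visible before any refactoring starts:

```python
# Minimal sketch of a cost-control check from a stack audit: a BigQuery
# dry run estimating scanned bytes. Query and table name are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

sql = """
    SELECT order_id, amount
    FROM `my-project.analytics.orders`   -- hypothetical table
    WHERE DATE(created_at) = CURRENT_DATE()
"""
job = client.query(sql, job_config=job_config)

# With PARTITION BY DATE(created_at) on the table, this figure drops from
# a full-table scan to a single day's partition.
print(f"Estimated scan: {job.total_bytes_processed / 1e9:.2f} GB")
```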

Which KPIs prove the impact of your data missions?

I measure impact through incident detection time (-90%), hours saved on reporting (+8h/week), data reliability (green test suites), and dashboard adoption metrics.

How do you collaborate with product, marketing, or ops teams?

I collaborate through intent workshops, accessible documentation, governance rituals, and ongoing support, keeping metrics and alerts transparent for all stakeholders.

Which tools do you prioritize for orchestration, governance, and reliability?

I prioritize Kestra or Airflow for orchestration, dbt for modeling, Airbyte or dlt for ingestion, and Looker Studio for KPIs, plus Python scripts for validation and monitoring across Modern Data Stack implementations.
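As a small illustration of the ingestion side, a minimal dlt sketch loading records into BigQuery; the pipeline, dataset, and table names are hypothetical placeholders:

```python
# Minimal sketch of dlt loading data into BigQuery, the ingestion pattern
# I often pair with dbt downstream. All names are hypothetical.
import dlt

pipeline = dlt.pipeline(
    pipeline_name="orders_ingest",   # hypothetical pipeline name
    destination="bigquery",
    dataset_name="raw",
)

# In a real mission this would come from a source connector (REST API,
# database, SaaS tool); inlined here to keep the sketch self-contained.
data = [
    {"order_id": 1, "amount": 42.0},
    {"order_id": 2, "amount": 17.5},
]

load_info = pipeline.run(data, table_name="orders")
print(load_info)
```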

Field Notes

Latest Articles

View all articles →
Contact me