Data Engineer

Data is at the core of SCOR's strategic plan, Forward 2026, as one of its four enablers. As part of this ambition, the Data Platform team has a key role to play.

Responsible for the technology stack & platform, data delivery, the overall data architecture and the enterprise data model, the Data Platform team supports the delivery of analytical use cases with its expertise (data engineering, data modeling, data architecture) and evolves & operates a state-of-the-art platform infrastructure stack, while ensuring the global consistency of data across SCOR's information system.

The Data Platform team, international and multi-cultural, mixes various profiles (data engineering, data modeling, data and platform architecture) to make SCOR the reinsurer of tomorrow.

As part of this team, the Data Engineer builds, delivers and maintains data pipelines, ensuring consistent data flows in his/her area of responsibility, from ingestion and preparation to exposure. The Data Engineer works in close collaboration with systems and analytics teams and is accountable for the quality of the data pipelines he/she delivers.

The Data Engineer needs to:

  • Be problem-solving oriented with strong analytical thinking
  • Be autonomous and rigorous in the way he/she approaches technical challenges
  • Collaborate with various stakeholders (product owners, solution owners, analytics experts, developers, architects) to deliver data artefacts following a state-of-the-art approach
  • Commit and bring his/her skills to contribute to the success of the Group

Under the responsibility of a Lead Data Engineer, your mission will be to:

  1. Build, deliver and maintain data artefacts (data pipelines & services) in accordance with the strategic priorities of your domain:
    • Develop data pipelines leveraging ELT / ETL techniques to ingest, transform and expose data for well-defined purposes, following a state-of-the-art approach (medallion architecture for data, gitflow for development, unit tests where applicable…)
    • Tackle key technical questions linked to data, like parallelization, calculation on big volumes of data, performance & cost optimization of queries, etc.
    • Develop, when relevant, tailor-made services / APIs to expose the data for various usages (datasets, APIs, …)
    • Work closely with data modelers, analytics and business stakeholders to ensure that data under your domain serves best the business needs, while keeping overall consistency of data definitions
    • Document data artefacts (code documentation for data pipelines / services / APIs, contributions to data definitions, processes, etc.) and reuse, when relevant, existing components or assets (code, frameworks, data objects) to leverage the Data ecosystem as much as possible
  2. Contribute to Data Chapter:
    • Contribute to the design of solutions (end-to-end data flows) in close collaboration with Architects, Data Modelers, Product Owners & Analytics Experts and advise on best practices to external stakeholders
    • Contribute to the overall data community by sharing good practices, lessons learned, expertise on relevant technologies, etc.
    • Perform peer reviews with other data engineers to ensure consistency and quality of developments
  3. Additional activities related to your day-to-day mission
    • Maintain a technology watch on Data Platform solutions, especially on data engineering topics
    • Participate in Scrum rituals (daily stand-ups, sprint planning, sprint reviews, retrospectives, etc.)
    • Contribute to ICS

Required experience & competencies

  • 4+ years of experience as a data engineer, with a data-oriented mindset
  • Proven experience in development and maintenance of data pipelines
  • Good development practices (gitflow, unit tests, documentation…)
  • Proven experience in agile projects (Scrum and/or Kanban)
  • Knowledge of (Re)insurance industry and / or Financial Services is a plus
  • Awareness of data management and data privacy topics

Technical Skills:

  • Strong level in Python and PySpark, with the ability to develop data pipelines on various platforms and experience in either Databricks or Palantir Foundry
  • Strong level in SQL (knowledge of ANSI-92 SQL, execution plan analysis)
  • Good knowledge of parallelization and distributed programming techniques
  • Good knowledge of data lake environments and concepts (delta lakes, medallion architecture, blob storage vs. file shares…)
  • Good knowledge of dimensional / analytical data modeling (Kimball, Inmon, Data Vault concepts…) and associated good practices (slowly changing dimensions, point-in-time tables, chasm / fan trap management, change data capture management…)
  • Good knowledge of CI/CD pipelines (Azure DevOps, Jenkins, Artifactory, Azure Container Registry…)
  • Basic knowledge of REST API development (Flask, FastAPI, Django…)
  • Basic knowledge of containers (Docker Compose, Kubernetes)
  • Knowledge of reporting tools is a plus (Tableau, Power BI…)

Behavioral & Management Skills:

  • Strong analytical thinking; solution-oriented and proactive in making proposals
  • Ability to navigate a matrixed, international environment
  • Autonomy, rigorous mindset, commitment
  • Curiosity and willingness to challenge
  • Strong communication skills, with the ability to speak and interact with various stakeholders
  • Team player

Required Education 

  • Bachelor's degree in computer science, software or computer engineering, applied math, physics, statistics, or a related field or equivalent experience

As a leading global reinsurer, SCOR offers its clients a diversified and innovative range of reinsurance and insurance solutions and services to control and manage risk. Applying “The Art & Science of Risk,” SCOR uses its industry-recognized expertise and cutting-edge financial solutions to serve its clients and contribute to the welfare and resilience of society in around 160 countries worldwide.

Working at SCOR means engaging with some of the best minds in the industry – actuaries, data scientists, underwriters, risk modelers, engineers, and many others – as we work together to find solutions to pressing challenges facing societies.

As an international company, our common culture is defined by “The SCOR Way.” Serving both to build momentum that drives the Group forward and as a compass to guide our actions and choices, The SCOR Way is anchored by five core values, reflecting the input of employees at all levels of the Group. We care about clients, people, and societies. We perform with integrity. We act with courage. We encourage open minds. And we thrive through collaboration.

SCOR supports inclusion and the diversity of talents, and all positions are open to people with disabilities.
