Data Engineer (Python, SQL, DAX, Power M) – hybrid – based in Malta

Who are we?

Trasys International is a dynamic global organization that takes pride in being the trusted partner of EU Institutions. With a strong commitment to excellence and a 30-year track record of delivering high-quality solutions, we are dedicated to supporting the growth and success of our clients. Our mission is to help our clients keep up with the challenges of digital transformation by providing the right talent at the right time for the right job. To this end, we are constantly looking for talented professionals who are interested in working on challenging international projects and able to deliver high-quality results within multicultural environments. Our services include (but are not limited to) modernization of solutions, digital workspaces, cloud technologies, and IT security. Our headquarters are in Brussels, and we have active accounts and offices across Europe (e.g. Luxembourg, Amsterdam, Athens, Stockholm, Geneva).

Is this YOU?

For our customer based in Malta - a European Institution - we are looking for a Data Engineer to join our team in the area of international protection, supporting Member States in applying the package of EU laws that governs asylum, international protection, and reception conditions. You will play a crucial role in providing practical, legal, technical, advisory, and operational assistance to achieve harmonized asylum practices.

The job will be performed 60% remotely and 40% onsite at the client’s offices in Valletta, Malta (hybrid mode); relocation to and residence in Malta will be required.

More specifically, you will:

  • Contribute to the design, development, analysis, unit testing, documentation, and maintenance of the client’s data warehouse, business intelligence, analytical, and data-related artifacts.

  • Design, develop, document, and maintain ETL/ELT, data integration, cleaning, transformation, dissemination, and automation processes.

  • Create robust solutions to collect, integrate, and process structured, semi-structured, and unstructured data (e.g. JSON, Parquet, Delta).

  • Design, develop, document and maintain data architecture, data modelling and metadata.

  • Develop and support data warehouse/lakehouse architectures and data processing, ensuring data quality, lineage, auditing, metadata, logging, linkage across datasets, and impact assessments.

  • Work collaboratively with data providers to address data gaps and optimize source-system structures.

  • Develop and maintain business intelligence models, interactive dashboards, reports and analytics using tools such as Databricks, Jupyter Notebooks, and Power BI.

  • Design, develop, document, improve and maintain the Data Warehouse/Lakehouse ecosystem (e.g. the DataDevOps lifecycle, architecture).

  • Support the gathering and analysis of business requirements, translating them into scalable data collection and integration processes.

  • Contribute to the definition and documentation of data governance policies, procedures, standards, and metadata models.

  • Participate in meetings with project and data teams to align on strategic priorities, ensuring seamless integration and optimal data management practices.

Requirements

Are you the perfect match?

  • University degree.

  • Minimum 5 years of experience in software development in Python, SQL, Power M and DAX.

  • Hands-on experience in the following domains:

    • Structured, semi-structured, and unstructured data types and related file formats (e.g. JSON, Parquet, Delta)

    • Gathering business requirements and transforming them into data collection, integration, and analysis processes

    • Microsoft On-Prem and Azure Data Platform tools (such as Azure Data Factory, Azure Functions, Azure Logic Apps, SQL Server, ADLS, Azure Databricks, Microsoft Fabric/Power BI, Azure DevOps, Azure AI Services, PowerShell)

    • CI/CD lifecycle using Azure DevOps

    • Databricks ecosystem, Apache Spark, and Python data processing libraries

    • Data Modelling principles and methods

    • Data Lakes and Data Lakehouse architecture, concepts, and governance

    • Data Integration and data warehouse/lakehouse modelling techniques, concepts, and methods (e.g. SCD, Functional Engineering, Data Vault, Data Streaming)

    • Data governance and data management standards, policies, processes, metadata, quality, etc.

    • Web APIs and the OpenAPI standard.
