Data Engineer

Permanent contract

27 Jan 2026

    Akkodis is seeking a skilled Data Engineer to join its innovative and technology-driven teams.

    The Data Engineer plays a key role in designing, building, and maintaining scalable and reliable data platforms that enable data-driven decision-making across business, R&D, and digital products.

    The position focuses on data ingestion, transformation, storage, and secure access to data within modern cloud-based architectures, working closely with stakeholders, architects, analysts, and developers.

    Role:

    Understand business, R&D, and technical requirements related to data usage
    Design, build, and maintain robust and scalable data pipelines (batch and streaming)
    Extract raw data from multiple sources such as APIs, databases, files, IoT systems, and third-party platforms
    Transform, clean, enrich, and standardize data according to a centralized data model
    Develop and maintain backend services and REST APIs to expose data to web applications, BI tools, and analytics platforms
    Manage data storage within data lake, lakehouse, and data warehouse architectures
    Implement access control, security, and data governance policies across the data platform
    Ensure data quality, reliability, observability, and performance of data pipelines
    Optimize SQL queries and data processing workloads
    Support reporting, dashboards, and advanced analytics use cases
    Collaborate with data scientists, analysts, architects, product owners, and project managers
    Contribute to data engineering standards, best practices, and architectural guidelines
    Support end-to-end delivery from requirement analysis to production deployment

    Your profile:

    Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
    Minimum of 3 years of experience as a Data Engineer (mid-level/senior profiles preferred)
    Experience in R&D, innovation, or technology-driven environments is a plus
    Strong proficiency in English (written and spoken)
    Knowledge of French and/or Dutch is an asset
    Strong interest in modern data technologies and continuous improvement
    Data Engineering and data pipeline development (ETL / ELT)
    Strong SQL skills and experience with relational and analytical databases
    Programming languages: Python (mandatory); knowledge of Scala or Java is a plus
    Big Data and processing frameworks: Spark, Databricks, or equivalent
    Workflow orchestration tools: Airflow, Azure Data Factory, or similar
    Cloud platforms: Azure (preferred), AWS, or GCP
    Data storage architectures: Data Lake, Lakehouse, Data Warehouse
    REST API development and integration
    Containerization and deployment: Docker, Kubernetes
    BI and visualization tools: Power BI, Tableau, or similar
    Understanding of data modeling, data governance, and security principles