Senior Data Engineer
Temporary
Canberra, Australian Capital Territory
07 Apr 2026
Essential Requirements:
Experience & Platform
- 10+ years of professional data engineering experience in enterprise environments.
- Demonstrated hands-on experience across the Microsoft Fabric platform, including Lakehouse, Warehouse, Data Factory, Data Engineering (Spark), OneLake, and Power BI.
- Proven experience with Azure data services, including Azure Data Factory, ADLS Gen2, Azure Databricks, and Azure Synapse Analytics.
- Strong proficiency in PySpark, Python, and SQL for large-scale data transformation and pipeline development (see the illustrative PySpark sketch after this list).
- Practical knowledge of Microsoft Purview for data governance, cataloguing, lineage, and sensitivity classification.
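By way of illustration, the following is a minimal PySpark sketch of the kind of large-scale transformation work this role involves. All table and column names (raw_orders, curated_orders, order_id, and so on) are hypothetical placeholders, not references to any actual system.

```python
# Minimal PySpark curation sketch; table and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

# Read a raw Lakehouse table (hypothetical name).
raw = spark.table("raw_orders")

# Typical cleansing and enrichment steps: deduplicate, type-cast, derive.
curated = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("net_amount", F.col("gross_amount") - F.col("tax_amount"))
)

# Persist as a managed Delta table for downstream consumption (e.g. Power BI).
curated.write.format("delta").mode("overwrite").saveAsTable("curated_orders")
```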
Dynamics 365
- In-depth understanding of Dynamics 365 data structures, including the Dataverse entity model, table relationships, option sets, and metadata.
- Hands-on experience with D365 Finance & Operations and/or Customer Engagement data architecture and common data entities.
- Practical experience implementing Synapse Link for Dataverse or equivalent D365 data export mechanisms for analytics workloads (an indicative read pattern is sketched after this list).
- Ability to interpret D365 functional configuration and translate it into sound data engineering design.
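As context for the Synapse Link requirement, here is a minimal sketch of reading a Dataverse table that Synapse Link for Dataverse has exported to ADLS Gen2 as headerless CSV. The storage account, container, entity, and column names are assumptions for illustration; in a real export the schema would be derived from the accompanying model.json.

```python
# Sketch: reading a Dataverse 'account' table exported by Synapse Link for
# Dataverse. Paths, storage account, and columns are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dataverse-ingest").getOrCreate()

accounts = (
    spark.read
         .option("header", "false")  # Synapse Link CSV files carry no header row
         .csv("abfss://dataverse@examplelake.dfs.core.windows.net/account/*.csv")
         # Column names would normally come from model.json; hard-coded here.
         .toDF("Id", "SinkCreatedOn", "SinkModifiedOn", "name", "statecode")
)

# Option-set columns such as statecode arrive as integer codes (0 = Active).
active = accounts.filter(F.col("statecode") == 0)
active.write.format("delta").mode("overwrite").saveAsTable("bronze_account")
```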
DevOps & Engineering Practice
- Demonstrated experience designing and maintaining CI/CD pipelines in Azure DevOps for data platform delivery.
- Proficiency with Git-based source control, branching strategies, and code review workflows.
- Familiarity with Infrastructure-as-Code (Bicep or Terraform) for automated Azure resource provisioning.
- Experience with automated testing for data pipelines, including schema validation and data quality assertions (see the indicative test sketch at the end of this section).
- Solid understanding of data modelling principles including star schema, slowly changing dimensions, and medallion architecture.
- Strong communication skills, with the ability to engage technical and non-technical stakeholders effectively across delivery teams.
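To make the automated-testing expectation concrete, below is one way a schema-validation and data-quality test might look in PySpark (for example, run under pytest). The table name and rules are hypothetical and purely indicative.

```python
# Indicative pipeline test: schema validation plus basic data-quality assertions.
# Table name and expected contract are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import (DateType, DecimalType, StringType,
                               StructField, StructType)

EXPECTED = StructType([
    StructField("order_id", StringType()),
    StructField("order_date", DateType()),
    StructField("net_amount", DecimalType(18, 2)),
])

def test_curated_orders_contract():
    spark = SparkSession.builder.getOrCreate()
    df = spark.table("curated_orders")  # hypothetical pipeline output

    # Schema validation: names and types must match the agreed contract
    # (nullability is ignored, since Delta reads can normalise it).
    assert [(f.name, f.dataType) for f in df.schema.fields] == \
           [(f.name, f.dataType) for f in EXPECTED.fields]

    # Data-quality assertions: key present and unique.
    assert df.filter(df.order_id.isNull()).count() == 0
    assert df.count() == df.select("order_id").distinct().count()
```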
Desirable Criteria:
Legacy Systems Integration
- Proven track record migrating or integrating on-premises or legacy ERP/CRM systems into cloud-native data platforms.
- Experience with heterogeneous source systems, including SQL Server, Oracle, SAP, flat file/SFTP ingestion, and REST APIs (see the JDBC read sketch after this list).
- Ability to assess technical debt and design pragmatic lift-and-shift or re-architecture migration patterns.
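As an indicative pattern for the heterogeneous-source bullet above, the sketch below pulls a legacy SQL Server table into the lake over JDBC using a partitioned read. Host, database, credentials, and table names are placeholders.

```python
# Illustrative lift of a legacy SQL Server table into bronze via Spark JDBC.
# Connection details and names are placeholders; secrets belong in Key Vault.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("legacy-ingest").getOrCreate()

legacy_customers = (
    spark.read.format("jdbc")
         .option("url", "jdbc:sqlserver://legacy-host:1433;databaseName=ERP")
         .option("dbtable", "dbo.Customers")
         .option("user", "svc_reader")
         .option("password", "<resolved-from-key-vault>")
         # Partitioned read to parallelise a large extract on a numeric key.
         .option("partitionColumn", "CustomerId")
         .option("lowerBound", 1)
         .option("upperBound", 1_000_000)
         .option("numPartitions", 8)
         .load()
)

legacy_customers.write.format("delta").mode("overwrite").saveAsTable("bronze_customers")
```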
Data Science & ML Ops
- Hands-on experience with Fabric Data Science workloads, MLflow experiment tracking, and model operationalisation (see the MLflow sketch after this list).
- Ability to prepare feature engineering pipelines and curated datasets to support AI and ML use cases.
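For context on the MLflow bullet, a minimal sketch of experiment tracking around a toy model is shown below. The experiment name, synthetic data, and logged values are illustrative only.

```python
# Minimal MLflow experiment-tracking sketch on synthetic data; names illustrative.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("churn-baseline")  # hypothetical experiment name

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = LogisticRegression(max_iter=500).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("max_iter", 500)
    mlflow.log_metric("accuracy", acc)
    # The logged model can later be registered and operationalised for serving.
    mlflow.sklearn.log_model(model, "model")
```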
Certifications (any of the following valued):
- DP-700: Implementing Data Engineering Solutions Using Microsoft Fabric
- DP-600: Implementing Analytics Solutions Using Microsoft Fabric
- DP-500: Designing and Implementing Enterprise-Scale Analytics Solutions
- PL-300: Microsoft Power BI Data Analyst
- SC-400: Microsoft Information Protection Administrator (Purview)
- AZ-204 / AZ-400: Azure Developer or DevOps Engineer Expert
- MB-700: Microsoft Dynamics 365 Finance and Operations Apps Solution Architect
Akkodis is supporting a significant organisational transformation, transitioning a 20-year-old legacy system to contemporary cloud services. The transformation relies on the accurate, secure, and efficient migration of legacy data into the new systems.
Profile
We are seeking an experienced Data Engineer with deep expertise across the Microsoft Fabric and Azure data ecosystem to design, build, and operationalise enterprise-grade data platforms. The successful candidate will drive end-to-end pipeline development, govern data assets through Microsoft Purview, deliver analytics via Power BI, and apply Azure-native data services to solve complex integration and scalability challenges.
A thorough understanding of Dynamics 365 data structures — including Dataverse, D365 Finance & Operations, and Customer Engagement entity models — is essential. The role demands strong DevOps practice, including the design and maintenance of CI/CD pipelines that underpin reliable, automated delivery of data platform components.