Introduction

Brunel has partnered with a leading Energy and Resources company to deliver the best work opportunities for IT & Technical professionals within Australia. We are currently seeking a Senior Data Engineer to work on a contract basis on an exciting new Data, Integration & Analytics platform project.

About this role

The Senior Data Engineer will contribute to the technical development and operational support of our Data, Integration and Analytics (DIA) Platform. This platform will empower operational and corporate data reporting, analytics, and AI-driven insights. The role supports project teams, collaborates with cross-functional teams, and contributes to data standards, policies and guidelines to ensure the platform meets the evolving needs of the business. It is essential to supporting efficient data processing, modern data pipelines, real-time analytics, and data-driven decision-making across the organisation.

Deliverables - Technical & Operational

Platform Development & Optimisation:

Contribute to the development and optimisation of data platforms and implement automation for data pipelines, with growing exposure to AI/ML workflows. Familiarity with Databricks is essential, along with Fivetran or similar technologies.

Data Pipelines: Design, implement, and maintain scalable and resilient data pipelines within the DIA Platform across the Databricks and Microsoft Fabric environments. Leverage Azure DevOps to establish robust CI/CD workflows for Databricks pipeline deployment, ensuring version control, automated testing, and seamless promotion across development, staging, and production environments. Focus on transforming ingested data into curated, analytics-ready datasets across the Bronze, Silver, and Gold layers of the lakehouse. Automate pipeline orchestration for both batch and streaming (Kafka) workloads, applying best practices in data quality, lineage, and performance optimisation. Apply Infrastructure as Code principles and collaborative development practices to ensure data pipelines are modular, reusable, and consistently deployable.

Data Integration: Develop integration pipelines using modern data integration techniques for API-based, event-driven, and batch ETL/ELT data ingestion. Ensure a high level of data integrity for master and reference data to effectively process transactional events across ERP, IoT, and SaaS systems. Design integrations that are secure, resilient, and aligned with enterprise architecture principles and platform performance requirements. Work closely with infrastructure teams to ensure system performance and maintain a high level of cyber security, and with the data architect and data owners to extend the enterprise data model and ensure timely, accurate data delivery for operational and analytical consumption.

Reporting & Analytics Enablement: Collaborate with business stakeholders and data stewards to understand their data and reporting needs, and translate those into actionable data models that support the development of dashboards and visualisations. Continuously optimise analytics workflows to improve business outcomes.

AI/ML Pipeline Development: Build and maintain data pipelines that support AI/ML models in production environments, enabling data-driven decision-making and predictive analytics.

Is this you?

Technical experience: 5+ years of experience in data engineering and analytics, with solid hands-on experience with cloud platforms such as Azure and AWS, and exposure to technologies such as Databricks and Delta Lake.

Platform Development: Demonstrated experience contributing to the development and optimisation of data platforms and implementing automation for data pipelines, with growing exposure to AI/ML models in production environments.

Cloud Platforms & Tools: Solid foundation in cloud technologies, with working proficiency in AWS and/or Azure (including S3/Blob Storage, Lambda/Azure Functions) and familiarity with streaming and batch technologies such as Kafka and Apache Spark.

Medallion Architecture: Practical experience designing or implementing Medallion architecture (Bronze, Silver, Gold layers) using Databricks/Delta Lake or equivalent, including incremental ingestion, data cleansing and transformation patterns, data quality controls, and promotion of data from raw to curated layers.

DevOps & Platform Services: Experience implementing DevOps practices for data platforms, including CI/CD for data pipelines and infrastructure, and Infrastructure as Code (Terraform, ARM templates, or similar).

What we offer 

  • Salary sacrificing
  • Employee Assistance Program (EAP)
  • Corporate discounts

About Brunel

Brunel is a recruitment and flexible workforce solution provider which connects talented people with opportunities throughout Australasia and around the world. We specialise in highly skilled roles across a variety of technical, professional, trades and craft disciplines, pairing candidates with industry-leading projects and organisations on a contract, permanent or secondment basis.

Operating in Australasia since 1997, Brunel has major bases of operation in Perth, Sydney, Brisbane and Port Moresby, which are further backed by the strength and reach of a truly global network spanning over 45 countries, 120 offices and 45 years of successful operation.

Brunel is proud to be an equal opportunity employer and encourages applications from Aboriginal and Torres Strait Islander and female candidates.

Do you have questions?

If you have any questions or would like to discuss the details of this role, please contact Joel Bellinger-Brown, Corporate Recruiter, on +61 8 9429 5672.

Closing: 05 October 2025

Vacancy reference: CR-264311
