Boltin

Data Engineer

Melbourne • Contract/Freelance

About the job

Senior Data Engineer

Rate: AUD 750 per day, inclusive of superannuation

Location: Melbourne

Contract Duration: 3 months

The role focuses on developing, modernizing, and migrating applications and data pipelines to AWS Cloud. You will design and implement scalable ETL solutions using PySpark, SparkSQL, SQL, and AWS Glue, while applying best practices in DevOps, CloudOps, and DataOps.

This role is ideal for candidates with hands-on experience in AWS-based development, data migration projects, and cloud-native data solutions.

Key Responsibilities

  • Design, build, and maintain data pipelines and ETL workflows using AWS Glue, PySpark, and Python.
  • Lead and support application and data migration projects to AWS Cloud.
  • Work with AWS services (S3, Glue, Lambda, EMR, Redshift, RDS, Step Functions, etc.) for scalable data solutions.
  • Optimize and manage SQL-based data transformations and queries.
  • Implement dimensional modeling and support reporting/analytics needs.
  • Contribute to Lakehouse and Data Warehouse architectures on AWS.
  • Apply DevOps & CI/CD practices for cloud deployments, monitoring, and automation with an SRE mindset.
  • Collaborate with engineering teams to design and develop APIs and microservices for data and application integration.
  • Ensure version control and collaborative coding practices with Git.
Required skills

AWS, DevSecOps, SQL, CI/CD pipelines, Python, AWS Lambda, Cloud Platforms (AWS, Azure, GCP), Redshift

About the company

Boltin