
APS6 Senior Data Engineer
Talent – Specialists in tech, transformation & beyond
Posted 7 hours ago
The opportunity:
Our highly valued client is a federal government agency seeking an APS6 Data Engineer to lead a project stream of a data warehouse platform. This includes updating data development documentation, providing technical guidance to developers, and reviewing and approving data designs and data models.
- Richmond location plus hybrid work-from-home
- 12-month initial contract + 12-month extension; rates fully negotiable
- Role open to Australian citizens only (federal government client)
The role:
You will join a centralised team focusing on future process improvement, delivering data and reporting effectively with clear accountabilities, while building data literacy capability across the branch and regions. The team develops, governs, and maintains an enterprise data warehouse, reporting platforms and production content. They design and build Business Intelligence (BI) interventions and prototype analytic solutions and reports, identifying trends and drivers of performance.
Your duties will include:
- Own the technical implementation of new data from analysis to delivery (designing and delivering the data dictionary, defining release steps, and defining data modelling best practices and review steps)
- Facilitate continuous improvement in delivery principles, coding standards and documentation, and provide training sessions to the team
- Prioritise work items and add them to a work queue
- Understand, analyse and size user requirements
- Develop and maintain SQL analytical and ETL code
- Optimise and tune SQL code to ensure strong performance
- Develop and maintain system documentation
- Work within a state-of-the-art, greenfield DevOps environment
- Collaborate with data consumers, database development, testing and IT support teams
Skills and experience:
To succeed in this role you will need:
- Proficiency in AWS services related to data engineering: AWS Glue, S3, Lambda, Redshift, Athena, Kinesis, EMR and Step Functions, plus CloudFormation or Terraform
- Data pipeline design and development using ETL/ELT frameworks
- Ability to understand DevOps processes and use DevOps tools in accordance with them
- Proficiency in programming languages: Python (preferred), Java, or Scala
- Strong SQL skills for data manipulation and analysis
- Hands-on experience with cloud-based data architecture and deployments, and with containerised environments (Docker, ECS, EKS)
- Demonstrated competency in developing, auditing and reviewing code in at least two of Teradata, Snowflake, Redshift or Databricks
- Strong knowledge of enterprise data warehouse (EDW) design concepts, ETL and ELT
- Exposure to at least one ETL tool such as dbt, Talend or Informatica
- SAS Base Programming
Please note that our client is a federal government organisation and can only consider Australian citizens for selection.
Apply:
Submit your resume, or for further information please contact [email protected]
For over 30 years, Talent has been redefining the contracting experience with industry-leading support, exclusive contractor benefits and a world-class digital platform, ENGAGE, to access it all. Apply today to see how we can elevate your career.