
EL1 Data Pipeline Engineer
Talent – Specialists in tech, transformation & beyond
Posted 3 days ago
Our client is a federal government organisation with offices throughout Australia. Due to growth, they are seeking an EL1 Data Pipeline Engineer to join their team in their Richmond or Geelong office.
- 12-month initial contract plus 12-month extension, excellent rates negotiable
- Richmond or Geelong location, hybrid with 3 days per week onsite minimum
- Federal government role - Australian citizenship required
The Data Pipeline Engineer designs, builds, tests, and maintains data pipelines that move data between systems. They work with data scientists and analysts to ensure data quality and security. Typical activities include:
- Design: Creating scalable data pipelines that meet project requirements
- Maintain: Optimising pipelines for performance and scalability
- Monitor: Troubleshooting issues and documenting processes
- Collaborate: Working with analysts and data scientists to understand data needs
- Automate: Writing scripts to automate repetitive tasks
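To illustrate the kind of work the activities above describe, here is a minimal, hedged sketch of an extract-transform-load pipeline using only the Python standard library. It is not part of the role description; the field names, record shapes, and data-quality rule are all hypothetical.

```python
# Illustrative sketch only: a tiny extract-transform-load pipeline.
# Field names and the data-quality rule are hypothetical.
import csv
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def extract(source: str) -> list[dict]:
    """Read raw records from a CSV source (here, an in-memory string)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalise types and drop incomplete records (a data-quality step)."""
    clean = []
    for row in rows:
        if not row.get("id") or not row.get("amount"):
            log.warning("dropping incomplete record: %r", row)
            continue
        clean.append({"id": int(row["id"]), "amount": float(row["amount"])})
    return clean

def load(rows: list[dict]) -> str:
    """Serialise the cleaned records back to CSV for a downstream system."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

raw = "id,amount\n1,10.5\n,3.0\n2,7.25\n"
result = load(transform(extract(raw)))
print(result)
```

In a real pipeline each stage would read from and write to managed storage rather than in-memory strings, but the design, monitor (logging), and automate concerns map directly onto the dot points above.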
Key duties and responsibilities:
The EL1 Data Pipeline Engineer will provide data pipeline design and advice across one or more of the above-mentioned capabilities. The candidate will:
- have technical expertise in scripting and infrastructure-as-code tooling (e.g. Python for AWS Glue, HashiCorp Terraform)
- have a solid understanding of cloud-based architecture relating to data storage and pipelines
- understand stream data processing (e.g. AWS Kinesis)
- understand and be able to implement security and access controls around data pipelines
- have a high level of communication skills, demonstrating an ability to communicate at both the technical and business levels
- have experience in Agile development methodologies
- have experience with data pipeline ETL tools (e.g. AWS Glue, Informatica, dbt, Talend, etc.)
- have very good SQL skills
- have tertiary qualifications in an ICT-related field or applicable industry certifications.
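The SQL requirement above can be made concrete with a small, hedged example of the kind of aggregation a pipeline's transform stage might run. This uses an in-memory SQLite database purely for illustration; the table and column names are hypothetical.

```python
# Illustrative only: a typical summarisation step before loading results
# into a reporting store. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (id INTEGER, agency TEXT, amount REAL);
    INSERT INTO payments VALUES
        (1, 'health', 120.0),
        (2, 'health',  80.0),
        (3, 'education', 50.0);
""")

# Aggregate totals per agency.
rows = conn.execute("""
    SELECT agency, SUM(amount) AS total, COUNT(*) AS n
    FROM payments
    GROUP BY agency
    ORDER BY total DESC
""").fetchall()

for agency, total, n in rows:
    print(f"{agency}: {total} across {n} payments")
```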
To apply for this opportunity, you will need to address the Selection Criteria below:
Essential criteria
- Minimum of five years' experience working as a Data Pipeline Engineer in a Cloud Computing environment
- Demonstrated experience configuring (design, build, test, deploy) varying types of data pipeline solutions, e.g. MFT, API, file extracts, etc.
- Good command of Amazon Web Services as it relates to data storage, extraction, transformation, and data processing frameworks
- A proficient understanding of security concepts around the above areas
- Experience with Agile development methodologies (Scrum, Kanban, Lean, Extreme Programming)
- Good problem-solving and communication skills
Desirable criteria
- Experience with GoAnywhere MFT
- Experience with Ctrl-M orchestration
- Experience with CI/CD automation, GitHub, or Bitbucket
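One of the security concepts the criteria above touch on, for MFT and file-based pipelines, is verifying the integrity of a transferred extract before ingesting it. Below is a minimal sketch using a SHA-256 checksum; the payload and file handling are hypothetical.

```python
# Illustrative only: integrity check on a delivered file extract.
# Payload contents are hypothetical.
import hashlib
import os
import tempfile

payload = b"id,amount\n1,10.5\n2,7.25\n"

# Sender side: publish a SHA-256 digest alongside the extract.
expected = hashlib.sha256(payload).hexdigest()

# Receiver side: write the delivered file, re-hash it, and compare
# before allowing it into the pipeline.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name

with open(path, "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
os.remove(path)

verified = (actual == expected)
print("checksum ok" if verified else "checksum mismatch: rejecting file")
```

Tools such as GoAnywhere MFT automate this kind of check; the sketch only shows the underlying idea.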
APPLY:
Submit your resume or contact Shelley at [email protected] or call 0418 572 482 for further information. Shortlisted applicants will be contacted and will be required to address the above Selection Criteria and clear national police and federal background checks.