Senior Data Engineer
Cognizant
City: Perth, Western Australia
Contract type: Full time

What makes Cognizant a unique place to work? The combination of rapid growth and an international and innovative environment! This is creating many opportunities for people like YOU — people with an entrepreneurial spirit who want to make a difference in this world.
At Cognizant, together with your colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies and help them become more flexible, more innovative, and successful. Moreover, this is your chance to be part of the success story.
Position Summary
We are seeking a Senior Data Engineer with over 11 years of experience in ETL, specializing in building pipelines and models with StreamSets and dbt (Data Build Tool). The role requires experience with Snowflake, Oracle database administration, and AWS cloud services such as S3, EC2, RDS, and Apache Airflow; familiarity with data pipeline orchestration, monitoring, and troubleshooting; and proficiency in Terraform, YAML, and Python for infrastructure automation and data engineering workflows. Experience in the mining industry is preferred.
Key Responsibilities
- Design, develop, and maintain Snowflake-based data ingestion and replication pipelines using AWS Lambda, Airflow, DMS (Database Migration Service), and StreamSets.
- Collaborate with source teams to gather data requirements and ensure smooth data ingestion and migration for each pipeline or project.
- Design and implement Snowflake data models with a focus on performance tuning and best practices.
- Ensure data governance, security, and compliance across all data pipelines.
- Mentor team members so they can work independently with new tools in our environment.
- Leverage AWS services (Lambda, S3, EC2, Airflow, DMS) for data storage, compute, transformation, and processing.
- Use Terraform and YAML for infrastructure-as-code (IaC) to automate pipeline deployment.
- Manage real-time data replication using AWS DMS.
- Implement CI/CD best practices for automated data pipeline deployment and version control.
- Monitor, troubleshoot, and optimize pipeline performance for high availability and reliability.
- Work with dbt (Data Build Tool) to transform raw datasets into meaningful insights, create views and tables, and develop macros for different business logic.
Mandatory Skills
- 6+ years of experience in Snowflake, including data modeling, performance tuning, and optimization.
- 11+ years of experience as an Oracle DBA, including performance tuning and database administration.
- Experience with dbt for transforming raw datasets, creating macros, and building reusable logic.
- Hands-on experience with AWS services (Lambda, S3, EC2) and DMS (Database Migration Service) for real-time replication.
- Proficiency in Terraform, YAML, and Python for infrastructure automation and data engineering workflows.
- Experience with StreamSets, Apache Airflow, and SQL-based ETL development.
- Knowledge of traditional databases (Oracle, MS SQL Server) and migration to Snowflake.
- Familiarity with data pipeline orchestration, monitoring, and troubleshooting.
- Experience with Power BI for data visualization and reporting.
Date of Posting: 27 May 2025
Next Steps: If you feel this opportunity suits you, or Cognizant is the type of organization you would like to join, we want to have a conversation with you! Please apply directly with us.
For a complete list of open opportunities with Cognizant, visit http://www.cognizant.com/careers. Cognizant is committed to providing Equal Employment Opportunities. Successful candidates will be required to undergo a background check.