Data Architect // Databricks & Kafka
Talent
Date: 14 hours ago
City: Sydney, New South Wales
Contract type: Contractor

Our client is a well-known organisation in the Financial Services sector. They are seeking a highly skilled Data Architect to design, implement, and optimise enterprise-level data solutions. The ideal candidate will have expertise in Data Warehousing (DWH), Databricks, Kafka, and Data Architecture. This role involves working closely with cross-functional teams to develop scalable, efficient data architectures that support business needs.
Responsibilities
- Design and develop scalable, high-performance data architectures.
- Lead the design, implementation, and maintenance of Data Warehouses (DWH) and Data Lake solutions.
- Utilize Databricks to build and manage big data pipelines.
- Implement real-time data streaming solutions using Kafka.
- Define data modeling standards, best practices, and governance policies.
- Collaborate with engineering teams to ensure seamless integration of data solutions.
- Optimize data storage, retrieval, and processing for performance and efficiency.
- Ensure security, compliance, and data privacy standards are met.
- Provide technical leadership and mentorship to data engineering teams.
Requirements
- 6+ years of experience in data architecture, design, and implementation.
- Must have:
- Dimensional modelling – hands-on experience designing and developing dimensional models across different data sets
- Domain modelling – experience designing lakehouse tabular data models (flattened tables)
- Defining and building data quality controls and frameworks
- Hands-on experience with Databricks, Azure Data Factory, and real-time streaming
- Experience with data governance, security, and compliance best practices.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.