Data Specialist
Job description
As the Data Specialist, you will undertake activities relating to data solution build (ingestion, storage, etc.), data modelling, data integration, and related program delivery activities, as well as contribute to data architecture design to meet the needs of our client and their Data+ Program.
This is a hands-on role requiring a diverse range of data-related skills and experience to deliver successfully. This is a rare opportunity to be a key contributor delivering transformational change within a small, multi-functional, high-performance team.
Key Responsibilities
As the Data Specialist, you will have responsibility for the following:
- Designing and building scalable and secure data pipelines and integration solutions, including ingestion, transformation, and delivery of high-quality data across the organisation's systems and environments, including but not limited to:
- Establishing and maintaining data validation, transformation, and mapping processes in line with business rules.
- Developing and managing data schemas, metadata, and documentation to ensure quality and consistency.
- Supporting the integration of clean, validated data into downstream systems.
- Leading the technical implementation of various data tools, ensuring integration into the data ecosystem supports the intended architecture, data flows, governance controls, and user access models.
- Contributing to enterprise data architecture artefacts including conceptual, logical, and physical data models, data flow diagrams, integration maps, and architectural patterns aligned with modern cloud-first principles.
- Developing proofs-of-concept and production-ready solutions leveraging Microsoft Fabric’s services such as OneLake, Dataflows, Pipelines, and Synapse.
- Documenting technical design decisions, data transformation logic, metadata lineage, and operational procedures to ensure transparent, repeatable, and supportable solutions.
- Contributing to ensuring solutions adhere to data governance, privacy, and security requirements, working with relevant stakeholders to embed controls within the technical implementation.
- Providing technical leadership and guidance to teams on data tool capabilities, optimal usage patterns, and continuous improvement opportunities that enhance the efficiency, reliability, and scalability of data ingestion processes.
- Maintaining a high standard of communication, collaboration, and written documentation to support knowledge transfer and stakeholder engagement across both technical and non-technical audiences.
- Working collaboratively and proactively with the program team and program stakeholders at all times; a cross-functional mindset and strong interpersonal skills are essential.
- Participating in the development of training materials, knowledge transfer artefacts, and internal standards to uplift organisational capability in managing and using the new data platform effectively.
Key Skills and Experience
- Tertiary qualification, or similar, in data science, computer science, or information technology.
- Proven experience (minimum 10 years) in roles encompassing data engineering and data integration.
- Proficiency in best practice data management, data integration, data governance, and data engineering.
- Proven experience in implementing modern data platforms, tools, and languages (including the Microsoft Azure platform).
- Demonstrated ability to work within an evolving and complex data ecosystem, with a strong grasp of enterprise data architecture and governance concepts.
- Exceptional written and verbal communication skills, with demonstrated experience preparing clear, concise, and technically sound documentation for both technical and non-technical audiences.
- Demonstrated ability to work both autonomously and collaboratively in small, high-performing multidisciplinary teams.
- Demonstrated experience in API integration (REST APIs, JSON/XML, authentication protocols like OAuth2).
- Proficiency in scripting languages such as Python or PowerShell for workflow automation.
- Strong understanding of data validation and quality assurance, including implementing role-based controls and exception handling.
- Proven experience in data modelling, including designing and managing relational and dimensional schemas to support ingestion, transformation, and integration into analytical and operational systems.
- Demonstrated hands-on experience with implementing and using data management platforms (e.g. Microsoft Fabric).
- Experience in designing and implementing master data management practices.
- Demonstrated experience and familiarity with data cataloguing tools (e.g., Microsoft Purview).
- Exposure to modern DevOps practices and working in a project environment.
- Experience with managing structured and unstructured data throughout the data lifecycle.
- Knowledge of Queensland Government Enterprise Architecture Framework.
- Experience in contributing to data architecture design and development.
Applications close Monday 21st July 2025.