
Big Data Engineer

Job description

About our client:
 
Our client is an award-winning regional and global tech consultancy with a rich heritage in Data, DevOps, digital app development, Machine Learning, and Cloud. With a reputation for embracing innovation and change, our client can clearly demonstrate their capability to make a positive impact for their clients.
 
Our client has a passion for developing its people through bespoke personal and technical development pathways, all focused on ensuring meaningful and rapid career growth.
 
Having experienced phenomenal growth over the last five years, our client is looking to hire ambitious data engineers at all levels. Successful candidates will have the opportunity to collaborate with and learn from some of the most talented minds in the Sydney market. They will work on exciting and interesting projects, engage with a broad spectrum of clients to understand their needs, and help shape ‘ahead of the curve’ engagements that allow our client to provide unparalleled services to meet their clients' needs.
 
The role:
 
  • Development of end-to-end data pipelines. We are particularly interested in Google Cloud, Azure, AWS, or Snowflake experience.
  • Advising on data architecture, data models, data migration, integration, pipelines, and data analysis and visualisation.
  • Implementing solutions to establish data management capabilities, including data models and structures, database and data storage infrastructure, master and metadata management, data quality, data integration, data warehousing, data transformation, data analysis, and data governance.
  • Development and execution of data migrations.
  • Supporting pre-sales activity to promote our client, their capabilities and value to current and prospective clients.

The successful candidate:
 
  • A strong analytical thinker and problem solver with thought leadership and commercial awareness.
  • Experienced in building end-to-end data pipelines using on-premises or cloud-based data platforms.
  • Experienced with hands-on delivery of solutions involving databases, advanced SQL, and software development in languages such as Python, Scala, Java, T-SQL, and PL/SQL.
  • Knowledgeable in relational and big data architectures, data warehousing, data integration, data modelling, data optimisation and data analysis techniques.
  • Interested and knowledgeable in Big Data and Apache ecosystem technologies such as Beam, Spark, Kafka, Hive, Airflow, and NiFi, as well as database, integration, master data management, quality assurance, data wrangling, and data governance technologies.
  • Knowledgeable in Cloud Data Warehouse services. Experience with Google BigQuery, Snowflake, AWS Redshift, Azure SQL DWH, or Azure Databricks is highly desirable.
  • Experienced with public cloud platforms and cloud infrastructure (essential).
  • Exposed to ETL/ELT and governance tools (including Talend, Informatica, Matillion, Fivetran, IBM DataStage, and Collibra).
  • Interested in AI and ML technologies and principles.
  • Able to migrate and transform large, complex datasets from diverse sources, structures, and formats, modelled to support analysis and provide access to quality, actionable insights.
What's on offer?

Our client offers a competitive package across all levels. Candidates with 1-5 years of experience are encouraged to apply.
In addition, our client offers the following:
  • Access to a wide range of internal training sessions, across both technical and consulting skills.
  • Sponsored certifications across all our client's partner technology vendors. 
  • Life Insurance, Income Protection Insurance, and access to an Employee Assistance Program.
  • Generous parental leave.
  • Our client is proud to have a sociable workplace that values diversity and celebrates achievement with multiple company-wide social and family events throughout the year. Due to COVID-19, all social events are currently virtual.