
Data Warehouse Engineer

2+ years of experience | Onsite | Makati or Iloilo

Why Cooee


Because we believe in the power of human connection. Because we are committed to helping human potential flourish. Because we dream of a world where each one of us walks along the path to who we are and the best that we can be. This is Why we do What we do – ‘To be a part of transformation one person, one community, one business at a time.’


We are One Team committed to investing in relationships fueled by trust and anchored in the One Shared Vision ‘to transform through connection’. We believe this is where the strength of Cooee and our partnerships lies – in having clarity and conviction in purpose.



About the Role


We have partnered with a dynamic Australian start-up that is making a huge impact in the auto industry. As a Data Engineer, you will report directly to the Head of Data and Analytics and will be responsible for the design and development of data models and data pipelines supporting the company’s platform and apps, as well as providing data and reporting support for business end users.


What you’ll be working on:

  • Design and development of the reporting data model and data transformation jobs, including the modelling of very large data sets.

  • Identify and implement the most efficient ways of performing data transformation tasks using best-practice methods and tooling.

  • Prepare and maintain documentation such as business requirements documents, design specifications, and test cases.

  • Work with stakeholders (including the data team, software engineers, and product team) to understand business requirements and translate these into technical specifications.

  • Lead the data migration and modelling process from GCP to data warehouse.

  • Responsible for data warehouse administration, user access and security.

  • Contribute to the design and implementation of our data model and ETL framework.


What we’re looking for:

  • Minimum of 2 years of experience in a data engineering environment, with hands-on experience building and maintaining complex data environments in the cloud (preferably GCP BigQuery and/or Snowflake).

  • Extensive experience with SQL (Postgres preferred), with a core focus on analyzing and validating complex and disparate data sets to find gaps between datasets, requirements, and source systems.

  • Demonstrated understanding of and experience with the following data engineering competencies:

    • Data warehousing principles, including data architecture, modelling, database design, and performance optimization best practices.

    • Building group data assets and pipelines from scratch by integrating large quantities of data from disparate internal and external sources.

    • Supporting analytics solutions to be productionised, including deployment, automation, orchestration, monitoring, and logging – preferably with an ETL tool such as Matillion, dbt, or equivalent.

  • Experience in deploying cloud infrastructure as code (IaC) using Terraform or similar.

  • Experience using Python to develop scripts and small programs for job orchestration  and/or data manipulation.

  • Ability to interact with business end users to draw out and distil business requirements into data pipeline designs and reporting solutions.

  • Ability to prioritize on the fly and work in a high-performing, outcomes-focused environment with multiple competing and ambiguous deliverables.

  • Experience working in an Agile development environment.
