Google Cloud Platform Data Engineer:
Client and role for Google Cloud Platform Data Engineer
- 9 month Contract (Inside IR35).
- Fully Remote until September (then in the office on occasion).
- Circa £700 per day (depending on experience).
We're currently helping a global, award-winning digital media organisation to hire a Data Engineer on a 9-month contract in the team responsible for Data Technology.

What you'll be doing as a Google Cloud Platform Data Engineer:
As a Data Engineer, you will create and maintain our data pipelines. You'll be equally willing to create data tools for analytics, insights and business performance metrics whilst building the infrastructure required for optimal extraction, transformation and loading of data.
- You will mostly be programming in Python and/or Scala, alongside Terraform and SQL, running on GCP, but we use the tool that best fits the problem.
- You will be working with data warehouse technologies including BigQuery, Dataproc, Apache Spark, dbt and Cloud Composer (Apache Airflow).
- You will contribute to the implementation of our new data platform on GCP, including leveraging new tools within GCP.
- You will build a new generation of tooling to support our data engineering projects.
- You will be expected to share knowledge of GCP through pairing, collaboration and coaching others across the department.

Skills and experience needed for Google Cloud Platform Data Engineer:
- You have 1-3 years' experience working on GCP.
- You are a certified GCP Data Engineer, or you have significant data engineering experience on GCP.
- You are comfortable working collaboratively with other teams, including data science, data modellers and product engineering teams.
- You understand the importance of data quality and are familiar with building high-quality data assets.
- You have built data pipelines for large datasets using ETL/ELT.
- You work collaboratively with Data Science to deploy models into production.