Data Engineer (Python / Databricks / DBT / Snowflake)
A Data Engineer (Python / Databricks / DBT / Snowflake) is required to join a leading Open Banking platform that is changing the way businesses provide financial services.
The founders previously built multi-million-pound fintech firms, and the company is already succeeding and expanding globally.
Building on this expansion, they have ambitious growth plans for the Data and AI team, which is focused on handling data at large scale, both to run the business and to power data-driven insights and products for an expanding customer base.
This is a collaborative role in which you will join a team of Data Analysts, Engineers and Scientists delivering insights, research, monitoring and ML pipelines across the entire company.
The platform is cloud agnostic and built on Looker, Snowflake, Databricks and Fivetran, providing the scalability and flexibility to deliver these insights.
Responsibilities for the Data Engineer (Python / Databricks / DBT / Snowflake) include:
• Build and manage the data ingestion, integrations, calculations, pipelines and feeds that drive the business
• Collaborate with Data Analysts, Data Scientists, Engineering, Operations, Compliance and Risk Monitoring, Sales and Customer Success teams to ensure the quality, accuracy and availability of data meet their needs
• Maintain a strong focus on the accuracy, validation, quality and timeliness of data
• Continually advance and improve the system of change management, peer-reviewed code and automated data testing capabilities
Essential and desirable skills for the Data Engineer (Python / Databricks / DBT / Snowflake) include:
• Degree educated, ideally in computer science, mathematics, statistics or a data science-related subject, or strong professional experience in this area
• Tools/technical skills:
○ Experience building batch and streaming data pipelines and calculations in Databricks with PySpark or Scala, producing scalable, testable and reusable components
○ Proficiency in Python and object-oriented programming practices
○ Proficiency in SQL
○ Strong craftsmanship and focus on code quality, unit and E2E testing.
○ Experience deploying Infrastructure as Code with Terraform and Gitlab CI/CD would be a huge bonus!
○ Experience with Data Warehouse design and modelling (Snowflake, Synapse, Redshift or similar)
○ Experience with Scrum and Agile methodologies, with the ability to step up as Scrum Master and lead sprint ceremonies
○ Experience building dashboards for internal use and embedded external visualisation in tools such as Looker, Tableau, Power BI, Sisense or Dash
○ Desirable: experience with Fivetran and dbt for ETL
• Experience in developing for and releasing to a business critical production environment
• A logical problem solver and critical thinker with a good ability to prioritise projects (e.g. Eisenhower Matrix, GTD method)
• An excellent communicator who can explain technical details to people in other roles and understand other people's perspectives and priorities
• Enthusiasm for delivering excellent value for the business.
• Prior experience in a FinTech role, or experience working with financial transaction data and compliance functions, is a big plus
If you would like to apply for the above Data Engineer (Python / Databricks / DBT / Snowflake) role, please click apply and attach your most recent CV.