Our client, a general insurance company, is looking to scale its existing AWS data lake and cloud platform, and is seeking a Senior Data Engineer to join its team in London.
The company has gone through an enterprise-wide transformation in the way it uses and stores data. Following a company-wide agile transformation, it is now looking for senior data engineers to build pipelines from source and legacy systems into the data lake for a variety of use cases, including machine learning and real-time data.
As a senior data engineer, you will collaborate with engineering and product stakeholders to build, optimise, maintain, and secure new products that have data at their heart. You will contribute to data logging, information architecture, ETL pipelines, and tooling that provides key analytics insights for the business.
You will work on a variety of stakeholder-driven use cases, ensuring stakeholders have access to data by building data pipelines and ETL processes.
We are looking for candidates who have previously worked with AWS, have strong Python skills, and ideally have prior experience working within a Spark framework.
You will need solid coding ability in an object-oriented programming language. Python is preferred, but if you are well versed in another language such as Scala, Ruby, or Java, that will also be of interest.
· Python experience (other OO languages also considered)
· AWS experience
· Experience working with large datasets (terabyte scale and growing)
· Solid experience building data pipelines
· Experience with a variety of SQL and NoSQL databases
If you are interested in this role, please send through a CV and we will get back to you as soon as possible.