Hayward Hawk is partnered with an established client currently recruiting Data Engineers on a permanent basis. Working for one of the Top 20 "Best Employers in the World", this diverse role involves engaging with key stakeholders, understanding their needs and transforming their data.
The successful candidate's duties will include:
* Identify and analyse user requirements through engagement with key stakeholders
* Prioritise, assign and execute tasks throughout the software development life cycle
* Develop applications
* Write well-designed, efficient code
* Review, test and debug team members' code
* Combine raw information from different sources
* Explore ways to enhance data quality and reliability
Ideal candidates will have considerable experience working on large-scale projects involving data integration into data lakes in an Agile environment.
Essential criteria include:
* Strong proficiency with Talend Data Integration, Talend Big Data and Talend Data Quality.
* Previous experience setting up ETL architecture for a data platform on cloud provider infrastructure, preferably AWS or the Cloudera platform.
* Solid experience setting up the Talend TAC server, managing infrastructure tools, and identifying and troubleshooting Talend infrastructure issues.
* Experience with the Talend Cloud version would be an advantage.
* Considerable experience building ETL pipelines from various sources such as RDBMS, NoSQL, Kafka, flat files and other batch or streaming sources.
* Experience with AWS DMS and AWS S3
* Solid experience working with various data formats such as Parquet, Avro, CSV, JSON and XML, as well as unstructured formats, in batch and real-time environments.
* Strong knowledge of the Spark big data processing framework, with the ability to tune ETL pipelines built using it.
* Strong understanding of distributed data processing and MPP databases.
* Strong understanding of relational database concepts and technology: data modelling (dimensional/data vault), SQL and query optimisation.
* Knowledge of any NoSQL database, such as HBase or MongoDB, would be an advantage.
* Ability to understand an existing ETL tool and/or business logic and rules and data sources, and to convert existing ETL jobs into Talend pipelines.
* Familiarity with CI/CD tools and processes, including Git, Jenkins and Jira. The candidate will be required to set up the complete CI/CD and development process within the project.
* Experience with setup and configuration, establishing design and coding best practices and configuration management processes for the data engineering team, as well as experience with BI tools.
* Experience working on development, support, maintenance and enhancement projects.
For further information regarding this role, please apply below or, alternatively, contact Ryan Hughes in the strictest confidence.