Graduate in an IT stream with experience in DWBI and Big Data implementation. The ideal candidate has:

- Extensive experience designing and building large-scale distributed applications.
- Good knowledge of back-end programming, specifically Java, JavaScript, Node.js, and OOAD; writes high-performance, reliable, and maintainable code.
- Ability to write MapReduce jobs.
- Good knowledge of database structures, theories, principles, and practices.
- Ability to write Pig Latin scripts.
- Hands-on experience with HiveQL and Spark.
- Familiarity with data loading tools such as Flume and Sqoop.
- Knowledge of workflow schedulers such as Oozie.
- Analytical and problem-solving skills applied to the Big Data domain.
- Proven understanding of Hadoop, HBase, Hive, and Pig.
- Experience in at least one of the following scripting languages: Perl, Python, or Ruby.
- Good grasp of multi-threading and concurrency concepts.
- Ability to design and execute business test scenarios for unit testing and to communicate any issues to developers.
- Strong written and verbal communication skills, including presentation skills.
- Strong persuasion and negotiation skills, including conflict resolution.
- Ability to work independently; self-motivated, with the ability to drive projects.
- History of meeting deadlines in uncertain environments and delivering high-quality results.
- Profound knowledge of various strategies for loading dimensional data.
- Experience implementing ETL best practices such as parallelism.
- Expertise in performance tuning of existing ETL architectures, including identifying and resolving performance bottlenecks.
- Deep understanding of schedulers, workload management, availability, scalability, and distributed data platforms.
The candidate will:

- Develop ETL processes to extract, transform, and load data into the data warehouse.
- Provide leadership and direction to the development team to ensure design and development meet team standards.
- Perform code reviews of ETL mappings from a standards and performance perspective.
- Coordinate between technology teams, support teams, and business units, and secure agreement on business requirements.
- Translate complex functional and technical requirements into detailed designs.
- Analyze vast data stores and uncover insights.
- Maintain security and data privacy.
- Create scalable, high-performance web services for data tracking.
Must be willing to travel within the UK, Europe, or India, including possible temporary relocation to undertake project work.
Working hours: 37.5 hours/week
The last date for receiving applications is 16 August 2017.