Big Data Architect

Staffworx Ltd
Closing date
15 Apr 2021

Sector: Technology & New Media
Job Details

Big Data Architect, Data Architect, contract, work from home.

Architect, design, estimate, develop and deploy cutting-edge software products and services that leverage large-scale data ingestion, processing, storage and querying, with in-stream and batch analytics, for Cloud and on-premise environments.

Solid experience in:
  • Data related technologies, to include knowledge of Big Data Architecture Patterns and Cloud services (AWS/Azure/GCP)
  • Delivering end to end Big Data solutions on premise and/or on Cloud
  • Pros and cons of various database technologies like Relational, NoSQL, MPP, Columnar databases
  • Hadoop ecosystem, with one or more distributions such as Cloudera and cloud-specific distributions
  • Java and Scala programming languages (Python a plus)
  • NoSQL databases (MongoDB, Cassandra, HBase, DynamoDB, Bigtable etc.)
  • Big data ingestion tools (Sqoop, Flume, NiFi etc.) and distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub etc.)
  • Data processing frameworks, e.g. Spark (Core, Streaming, SQL, PySpark), Storm, Flink etc.
  • Scalable data models addressing a wide variety of consumption patterns, including random and sequential access, with the necessary optimisations like bucketing, aggregating and sharding
  • Performance tuning, optimisation and scaling of solutions from a storage/processing standpoint
  • Building DevOps pipelines for data solutions, including automated testing

Ideally with some of the following:
  • Containerisation and orchestration, e.g. the Kubernetes engine
  • Big data cluster security (authorisation/authentication, security for data at rest and data in transit)
  • Monitoring and alerting for Big data clusters
  • Orchestration tools such as Oozie, Airflow, Control-M or similar
  • MPP-style query engines like Impala, Presto, Athena etc.
  • Multi-dimensional modelling, such as star schema, snowflake, and normalised and de-normalised models
  • Data governance, cataloguing, lineage and associated tools (e.g. Collibra) would be an added advantage
  • Cloud platforms or big data technologies

INSIDE IR35 - this assignment falls within the scope of IR35 legislation, and an appropriate umbrella company should be used during this assignment.

#dataarchitect #kubernetes #contractjobs #googlecloudplatform #bigquery #googledatastudio #gcp #aws #hadoop #spark #hbase #dataengineering #dataengineeringjobs #datapipelines #staffworx #recruitmentpartner #collibra #pyspark

This advert was posted by Staffworx Limited - a UK-based recruitment consultancy supporting the global digital, e-commerce, software and consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.

Staffworx operate a referral scheme of £500 or a new iPad for each successfully referred candidate; if you know of someone suitable, please forward their details for consideration.

