Big Data Engineer

Fixed Price - Est. Time: 12 months
  • 1,200,000
  • 1
  • 214
  • Closed for Bidding

Project Details

Our client, a leading data sciences consulting company, is looking to augment their team with candidates who have the following skills.

This position is for a Big Data Engineer specializing in Hadoop and Spark technologies.
• Design and implement new components using emerging technologies in the Hadoop ecosystem, and ensure successful execution of projects.
• Integrate external data sources and create data lakes/data marts.
• Integrate machine learning models on real-time input data streams.
• Collaborate with various cross-functional teams: infrastructure, network, database.
• Work with various teams to set up new Hadoop users, security, and platform governance, which should be PCI-DSS compliant.
• Create and execute a capacity planning strategy for the Hadoop platform.
• Monitor job performance, file system/disk-space management, cluster and database connectivity, and log files; manage backups and security; and troubleshoot user issues.
• Design, implement, test, and document a performance benchmarking strategy for the platform as well as for each use case.
• Drive customer communication during critical events and participate/lead various operational improvement initiatives.
• Responsible for the setup, administration, monitoring, tuning, optimization, and governance of large-scale Hadoop clusters and Hadoop components (on-premise/cloud) to meet high availability/uptime requirements.
Education & Skills Summary:
• 2–4 years' relevant experience in big data.
• Exposure to Cloudera/Hortonworks production implementations.
• Knowledge of Linux and shell scripting is a must.
• Sound knowledge of Python or Scala.
• Sound knowledge of Spark and HDFS/Hive/HBase.
• Thorough understanding of Hadoop, Spark, and ecosystem components.
• Must be proficient with data ingestion tools such as Sqoop, Flume, Talend, and Kafka.
• Candidates with knowledge of machine learning using Spark will be given preference.
• Knowledge of Spark and Hadoop is a must.
• Knowledge of AWS and Google Cloud Platform and their various components is preferable.

Location: Gurgaon / Mumbai

No. of Positions: 2/1

Skills Required

Project Updates

Friday, 04 Oct, 2019 04:10 PM

We have received an overwhelming number of bids. We will now evaluate them and get back to you if required.