Overview

  • Hadoop administrator/developer experienced with Cloudera Hadoop components such as HDFS, Sentry, HBase, Impala, Hue, Spark/Scala, Hive, Kafka, YARN, and ZooKeeper.
  • Should be proficient with Docker and container concepts in a Red Hat OpenShift environment.
  • Managing CDH and Managed Services.
  • Performance & Resource Management.
  • Importing and exporting data between HDFS and relational database systems using Sqoop (a sample command sketch appears after this list).
  • Developing shell/Python scripts to transform data in HDFS (see the shell sketch after this list).
  • Backup and Disaster Recovery.
  • Cloudera Navigator Data Management Component Administration.
  • Creating a multitenant Enterprise Data Hub.
  • Good knowledge of back-end programming, specifically Java.
  • Writing high-performance, reliable and maintainable code.
  • Ability to write MapReduce jobs.
  • Good knowledge of database structures, theories, principles, and practices.
  • Hands-on experience with HiveQL (a sample query appears after this list).
  • Familiarity with data loading tools such as Flume and Sqoop.
  • Knowledge of workflow schedulers such as Oozie and cron (a crontab example appears after this list).
  • Analytical and problem-solving skills applied to the Big Data domain.
  • Good aptitude for multithreading and concurrency concepts.
  • 8+ years of IT industry experience.
  • 4+ years of experience handling Big Data and Apache tools such as Kafka and Scala.
  • 2+ years of admin experience in the installation, clustering, and performance management of Hadoop clusters.
  • 2+ years of experience handling the multitenancy aspects of managing multiple Hadoop clusters.
  • Admin, installation, and configuration experience with DevOps tools like Jenkins will be helpful.
  • Must have good oral communication skills.
  • Must have good familiarity with tools such as Scala, Kafka, Spark, and Java.
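
The Sqoop bullet above could look roughly like the following sketch; the MySQL connection string, credentials, table names, and HDFS paths are all assumptions for illustration:

  # Import a relational table into HDFS (RDBMS -> HDFS)
  sqoop import \
    --connect jdbc:mysql://dbhost/sales \
    --username etl_user -P \
    --table orders \
    --target-dir /data/raw/orders \
    --num-mappers 4

  # Export processed results back to the database (HDFS -> RDBMS)
  sqoop export \
    --connect jdbc:mysql://dbhost/sales \
    --username etl_user -P \
    --table orders_summary \
    --export-dir /data/curated/orders_summary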
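
For the shell/Python scripting bullet, a minimal shell sketch that streams a file out of HDFS, reshapes it, and writes the result back; the paths and column layout are hypothetical:

  #!/usr/bin/env bash
  set -euo pipefail

  RAW=/data/raw/events/events.csv      # assumed input file
  CURATED=/data/curated/events         # assumed output directory

  # Drop the header row, keep three columns, and write the result back to HDFS.
  hdfs dfs -cat "$RAW" \
    | awk -F',' 'NR > 1 {print $1","$3","$5}' \
    | hdfs dfs -put -f - "$CURATED/events_clean.csv"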
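
A small HiveQL example of the kind referenced above, run through Beeline; the HiveServer2 URL, database, and table are assumptions:

  # Aggregate daily order totals with HiveQL via Beeline
  beeline -u jdbc:hive2://hiveserver:10000/default -e "
    SELECT order_date, SUM(amount) AS daily_total
    FROM   orders
    GROUP  BY order_date;"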
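
For the scheduling bullet, a hypothetical crontab entry that runs such a transform script nightly (Oozie would express the same schedule as a coordinator workflow instead):

  # Run the HDFS transform every night at 01:30 and append output to a log
  30 1 * * * /opt/etl/transform_events.sh >> /var/log/etl/transform_events.log 2>&1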

– provided by Dice

To Apply: https://www.jobg8.com/Traffic.aspx?tIfNKiSXLe45%2fS%2fjQrGGWAu