Hadoop Admin
IQVIA
Contract Plymouth Meeting, Pennsylvania, United States Posted 3 years ago
About Position
Hadoop Admin (Contract)
$70.00 / Hourly
Plymouth Meeting, Pennsylvania, United States
Skills
Bachelor's degree in Information Technology, Computer Science, or another relevant field.
5+ years of systems analysis experience, systems programming experience, or a combination of both.
4+ years of experience with Big Data or Hadoop tools such as Spark, Hive, Kafka, and MapReduce.
3+ years of Red Hat Linux or UNIX experience.
2+ years of Hadoop experience with Hortonworks or Cloudera CDH.
Knowledge of Kerberos and Apache Ranger for configuring security.
Cloud deployment experience; AWS/Azure preferred.
Microsoft Azure administration, cloud deployment architecture, and Windows Server administration.
Azure Active Directory and domain services.
Network administration: load balancer, firewall, and routing configuration.
Automation via runbooks, templates, and PowerShell scripting.
Able to monitor Hadoop cluster connectivity and performance.
Manage and analyze Hadoop log files.
File system management and monitoring.
General operational expertise, such as good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
Hadoop skills such as HDFS, Hive, Impala, Solr, and HBase.
Ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure it, and take backups.
Knowledge and understanding of technology support: opening tickets or calls with vendors.
Familiarity with open-source web applications and middleware tools, including their deployment and maintenance.
Knowledge of troubleshooting custom-built middleware applications is a plus.
Description
Responsible for implementation and ongoing administration of Hadoop infrastructure on Cloudera Data Platform CDP and CDH.
Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
Working with internal teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, and MapReduce access for the new users.
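The user-onboarding steps above could be sketched as follows. This is a minimal illustration, assuming a Kerberized CDP/CDH cluster with MIT Kerberos; the username "jdoe", the realm EXAMPLE.COM, and the Hive host are placeholders, and real clusters typically manage accounts through a directory service rather than local `useradd`.

```shell
# 1. Create the Linux account (hypothetical user "jdoe") on the cluster nodes.
sudo useradd -m jdoe

# 2. Create a Kerberos principal for the user (prompts for a password).
sudo kadmin.local -q "addprinc jdoe@EXAMPLE.COM"

# 3. Create the user's HDFS home directory and hand over ownership.
sudo -u hdfs hdfs dfs -mkdir /user/jdoe
sudo -u hdfs hdfs dfs -chown jdoe:jdoe /user/jdoe

# 4. As the new user, obtain a ticket and verify HDFS and Hive access.
kinit jdoe@EXAMPLE.COM
hdfs dfs -ls /user/jdoe
beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -e "show databases;"
```

A MapReduce smoke test (e.g. running one of the bundled example jobs as the new user) would complete the access check described above.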
Performance tuning of Hadoop clusters and Hadoop MapReduce routines. Screening Hadoop cluster job performance and capacity planning. Monitoring Hadoop cluster connectivity and security. Managing and reviewing Hadoop log files. File system management and monitoring.
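The monitoring and log-review duties above map onto a handful of standard HDFS and YARN commands. A rough sketch, assuming the Hadoop CLIs are on the PATH and the caller has admin privileges (the log path is an assumption; it varies by distribution):

```shell
# HDFS capacity, live/dead DataNodes, and under-replicated blocks.
sudo -u hdfs hdfs dfsadmin -report

# File system consistency check: missing or corrupt blocks and their locations.
sudo -u hdfs hdfs fsck / -blocks -locations

# YARN node status and currently running applications.
yarn node -list -all
yarn application -list -appStates RUNNING

# Scan a NameNode log for recent errors; /var/log/hadoop-hdfs is typical
# on CDH/CDP installs but is not guaranteed.
grep -i error /var/log/hadoop-hdfs/hadoop-hdfs-namenode-*.log | tail -n 20
```

In practice these checks are usually automated and fed into an alerting system (Cloudera Manager provides much of this out of the box on CDP/CDH), with the CLI used for ad hoc troubleshooting.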
Teaming with the infrastructure, network, database, application, and business teams to guarantee highly available systems.
Collaboration with application teams to install Hadoop updates, patches, version upgrades when required.
Collaborate with end users to resolve complex issues and provide solutions that meet business needs and benefit system performance. Troubleshoot application problems related to configuration, network, and server issues, and provide recovery solutions. Participate in postmortems to avoid repeated incidents.
User support and access provisioning.
Process improvement, including the creation of new automation to streamline manual tasks. Develop and update documentation, departmental technical procedures, and user guides.