Hadoop Administrator/Developer
Verizon Communications Inc
Contract · Irving, Texas, United States · Posted 3 years ago
About Position
Hadoop Administrator/Developer (Contract)
$125.00 / Hourly
Irving, Texas, United States
Skills
MUST HAVE SKILLS (Most Important):
• Experience with Hadoop data platforms
• Experience with relational databases like Oracle
• SQL/PLSQL
• Unix shell scripting
• cron jobs
• Hive
DESIRED SKILLS:
• Python
• Java
• Hive and Spark cluster environments
• Qlik Sense
Description
Possess extensive analysis, design and development experience in Hadoop and AWS Big Data platforms
Able to critically inspect and analyze large, complex, multi-dimensional data sets in Big Data platforms
Experience with Big Data technologies, distributed file systems, Hadoop, HDFS, Hive, and Hbase
Define and execute appropriate steps to validate various data feeds to and from the organization
Collaborate with business partners to gain in-depth understanding of data requirements and desired business outcomes
Create scripts to extract, transfer, transform, load, and analyze data residing in Hadoop and RDBMS including Oracle and Teradata
Design, implement, and load table structures in Hadoop and RDBMS including Oracle and Teradata to facilitate detailed data analysis
Participate in user acceptance testing in a fast-paced Agile development environment
Troubleshoot data issues and work creatively and analytically to solve problems and design solutions
Create documentation to clearly articulate designs, use cases, test results, and deliverables to varied audiences
Create executive-level presentations and status reports
Under general supervision, manage priorities for multiple projects simultaneously while meeting published deadlines
Bachelor's degree or Master's degree in Computer Science or equivalent work experience
Highly proficient, with extensive experience working with relational databases, particularly Oracle and Teradata
Excellent working knowledge of UNIX-based systems
Excellent interpersonal, written, and verbal communication skills
Very proficient in the use of Microsoft Office or G Suite productivity tools
Experience with designing solutions and implementing IT projects
Exposure to DevOps, Agile Methodology, CI/CD methods and tools, e.g. JIRA, Jenkins, is a huge plus
Prior work experience in a telecommunications environment is a huge plus
Experience with Spark, Scala, R, and Python is a huge plus
Experience with BI visualization tools such as Tableau and Qlik is a plus
Background in financial reporting, financial planning, budgeting, ERP (Enterprise Resource Planning) is a plus
Exposure to advanced analytics tools and techniques e.g. machine learning, predictive modeling is a plus.
Responsibilities
- Install and configure Hadoop clusters and Sqoop, Python, and Spark packages
- Expertise in administration of Hive, Kafka, Python, HBase, Spark, and Sqoop
- Manage Hadoop, Kafka, HBase, Sqoop, Hive, and Spark cluster environments
- Apply proper architecture guidelines to ensure highly available services
- Plan and execute major platform software and operating system upgrades and maintenance across physical environments
- Develop and automate processes for maintenance of the environment
- Implement security measures for all aspects of the cluster (SSL, disk encryption, role-based access)
- Ensure proper resource utilization between the different development teams and processes
- Design and implement a toolset that simplifies provisioning and support of a large cluster environment
- Review performance stats and query execution/explain plans; recommend changes for tuning
- Create and maintain detailed, up-to-date technical documentation
- Ability to write shell scripts on Linux
- Integration with other Hadoop platforms
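One small slice of the maintenance automation described above — reviewing DataNode disk utilization — could be sketched like this. The report text here is a hardcoded sample standing in for live `hdfs dfsadmin -report` output (a real script would capture it via a subprocess call), and the 80% threshold is an arbitrary illustrative choice.

```python
import re

# Sample text standing in for live `hdfs dfsadmin -report` output;
# a real script would capture this via subprocess. Hostnames and
# figures are invented for the example.
SAMPLE_REPORT = """\
Name: 10.0.0.11:9866 (dn1.example.com)
DFS Used%: 91.40%
Name: 10.0.0.12:9866 (dn2.example.com)
DFS Used%: 42.75%
"""

def overloaded_datanodes(report: str, threshold: float = 80.0) -> list:
    """Return hostnames of DataNodes whose DFS Used% exceeds the threshold."""
    flagged = []
    current_host = None
    for line in report.splitlines():
        name_match = re.match(r"Name: \S+ \((\S+)\)", line)
        if name_match:
            current_host = name_match.group(1)
        used_match = re.match(r"DFS Used%: ([\d.]+)%", line)
        if used_match and current_host and float(used_match.group(1)) > threshold:
            flagged.append(current_host)
    return flagged

if __name__ == "__main__":
    print(overloaded_datanodes(SAMPLE_REPORT))
```

A cron entry would then run such a check on a schedule and page the on-call administrator when any node is flagged.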