Hadoop/Spark Developer

HMS Holdings Corporation

Contract | Irving, Texas, United States | Posted 6 years ago


About Position

Hadoop/Spark Developer (Contract)

$90.00 / Hourly

Irving, Texas, United States


Description

The Senior Hadoop Developer is responsible for designing, developing, testing, tuning and building a large-scale data processing system for data products that allow HMS to improve the quality, velocity and monetization of our data assets for both operational applications and analytical needs.

Responsibilities

- Design, develop and deliver data from operational systems and files into the ODS, downstream data marts and files.
- Work with BAs, end users and architects to define and process requirements, build code efficiently and collaborate with the rest of the team on effective solutions.
- Apply strong analytical SQL experience, working with dimensional modeling.
- Research, develop and modify ETL processes and jobs according to requirements.
- Troubleshoot and develop on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, Spark and Impala, and perform Hadoop ETL development via tools such as Informatica and Talend.
- Apply knowledge of and experience with Azure Data Platform components: Azure Data Lake, Data Factory, Data Management Gateway, Azure Storage options, DocumentDB, Data Lake Analytics, Stream Analytics, Event Hubs and Azure SQL.
- Translate, load and present disparate data sets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues and log data (see the illustrative sketch below).
- Implement quality logical and physical ETL designs optimized to meet the operational performance requirements of our solutions and products, including the implementation of sound architecture, design and development standards.
- Design the optimal performance strategy and manage the technical metadata across all ETL jobs.
- Build solutions involving large data sets using SQL methodologies and data integration tools such as Informatica in any database.
- Deliver projects on time and to specification with quality.

Qualifications

- 8+ years of experience managing data lineage and performing impact analysis.
- 5+ years of experience with ETL tool development.
- 4+ years of experience with the Hadoop ecosystem.
- Experience working on data management projects.
- Experience working with Hive or related tools on Hadoop, including performance tuning, file formats, designing and executing complex Hive HQL, and data migration/conversion.
- Experience with a programming language such as Java, Scala or Python.
- Experience working in an agile environment.
- Experience working with Spark for data manipulation, preparation and cleansing.
- Experience working with ETL tools (Informatica/DS/SSIS) for data integration.
- Experience designing and developing automated analytic software, techniques and algorithms.
- Ability to handle multiple tasks and adapt to a constantly changing environment.
- Self-starter with the ability to work independently and take initiative.
- Ability to translate ideas and business requirements into fully functioning ETL workflows.
- Mastery of at least one relational database (DB2, MS SQL, Teradata, Oracle 8i/9i/10g/11i).
- Expert, hands-on SQL ability is a must.
- Experience with Unix/Linux and shell scripting.
- Strong analytical and problem-solving skills.
- Excellent written and oral communication skills, with the ability to articulate and document processes and workflows for use by individuals of varying technical abilities.
- Knowledge of healthcare is a plus.

Minimum Education

MS/BS in Computer Science, Information Systems or a related field preferred, and/or equivalent experience.

Additional Skills

- Ability to work both independently and in a collaborative environment.
- Excellent problem-solving, communication and interpersonal skills.
- Ability to analyze information and use logic to address work-related issues and problems.
- Proficiency in Microsoft Access, Excel, Word, PowerPoint and Visio.
- Experience working in a DevOps environment is a plus.
- Experience with or knowledge of web architecture (JavaScript, SOAP/XML, WebLogic, Tomcat) is a plus.
- Experience with an ORM framework, SOA architecture or microservices is a plus.
- Experience with middleware components (ESB, API Gateway) is a plus.
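
As a rough illustration of the kind of Spark-based data loading and cleansing work described above, the sketch below reads raw JSON records, applies basic cleansing, and saves the result as a Hive table. The file path, column names and table name are hypothetical placeholders, not details taken from this posting.

```python
# Illustrative PySpark job: load raw JSON events, apply basic cleansing,
# and persist the result as a Hive-managed table for downstream data marts.
# All paths, columns, and table names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def main():
    spark = (
        SparkSession.builder
        .appName("claims-cleansing-example")
        .enableHiveSupport()          # assumes a Hive-configured cluster
        .getOrCreate()
    )

    # Read semi-structured source data (JSON here; Avro, text files and
    # Kafka queues follow the same read-then-transform pattern).
    raw = spark.read.json("hdfs:///data/raw/claims/")  # hypothetical path

    # Basic cleansing: drop records missing the key, trim strings,
    # normalize dates, and de-duplicate on the key.
    cleansed = (
        raw.dropna(subset=["claim_id"])                              # hypothetical column
           .withColumn("member_name", F.trim(F.col("member_name")))
           .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
           .dropDuplicates(["claim_id"])
    )

    # Write to a Hive table so ODS and analytical consumers can query it.
    cleansed.write.mode("overwrite").saveAsTable("ods.claims_cleansed")

    spark.stop()

if __name__ == "__main__":
    main()
```

The same pattern extends to Avro sources via spark.read.format("avro") (where the Avro package is available on the cluster) and to Kafka queues via Spark Structured Streaming.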



Job Summary

$90.00 / Hourly

Contract

Irving, Texas, United States

Experience Required: 9 years

Posted: 6 years ago

Deadline: September 22, 2018

Job ID: Job0000014232

HMS Holdings Corporation

5615 High Point Drive

(212) 857-5000

www.hms.com