Hadoop Developer
Adobe Systems Incorporated
Contract San Jose, California, United States Posted 9 years ago
About Position
Hadoop Developer (Contract)
$70.00 / Hourly
San Jose, California, United States
Skills
· Strong SQL, ETL, scripting and/or programming skills, with a preference towards Python, Java, Scala, and shell scripting
· Demonstrated ability to clearly form and communicate ideas to both technical and non-technical audiences
· Strong problem-solving skills with an ability to isolate, deconstruct, and resolve complex data/engineering challenges
· Results-driven with attention to detail, a strong sense of ownership, and a commitment to up-leveling the broader IDS engineering team through mentoring, innovation, and thought leadership
Description
· Design, develop & tune data products, applications and integrations on
large-scale data platforms (Hadoop, Kafka Streaming, HANA, SQL Server,
etc.) with an emphasis on performance, reliability, scalability and, most
of all, quality
· Analyze business needs, profile large data sets, and build custom data
models and applications to drive business decision-making and the
customer experience
· Develop and extend design patterns, processes, standards, frameworks and
reusable components for various data engineering functions/areas.
· Collaborate with key stakeholders including the business team, engineering
leads, architects, BSAs & program managers.
Educational Requirements
· MS/BS in Computer Science or a related technical field with 4+ years of
strong hands-on experience in enterprise data warehousing / big data
implementations & complex data solutions and frameworks
Experience Requirements
Desired skills:
· Familiarity with streaming applications
· Experience in development methodologies like Agile/Scrum
· Strong experience with Hadoop ETL / data ingestion: Sqoop, Flume, Hive,
Spark, HBase
· Strong experience with SQL and PL/SQL
· Nice to have: experience in real-time data ingestion using Kafka, Storm,
Spark or Complex Event Processing
· Experience with Hadoop data consumption and other components: Hive, Hue,
HBase, Spark, Pig, Impala, Presto
· Experience monitoring, troubleshooting and tuning services and
applications, with operational expertise such as good troubleshooting
skills and an understanding of systems capacity, bottlenecks, and the
basics of memory, CPU, OS, storage, and networks
· Experience in design & development of API frameworks using Python/Java is
a plus
· Experience in developing BI dashboards and reports is a plus