Hadoop Developer/Admin
Sirius Computer Solutions Inc
Contract | Buffalo, New York, United States | Posted 5 years ago
About Position
Hadoop Developer/Admin (Contract)
$65.00 / Hourly
Buffalo, New York, United States
Description
Job Description:
Responsible for designing data intake into the Hadoop platform and making it available to the business in a queryable format.
Responsible for implementation and ongoing technical support of the Hadoop ecosystem (including access, incident, and problem management).
Provide technical leadership and collaborate with developers and architects on implementations on the Hadoop platform.
Design and configure the Hadoop platform and associated components (including third-party tools) for data ingestion, transformation, migration, processing, and reporting.
Work with the infrastructure, network, database, application, and business intelligence teams to achieve high data quality, performance, availability, and security of the platform.
Mentor other Hadoop developers and administrators
Document the design and implementation processes to support ongoing operations.
Help optimize and integrate new infrastructure via continuous integration methodologies.
Set up and maintain CI/CD application server environments and pipelines with tools and technologies such as Docker, Jenkins, and Kubernetes.
Follow company policies, procedures, controls, and processes for the job.
In addition to the above key responsibilities, you may be required to undertake other duties from time to time as the Company may reasonably require.
Required Skills:
A minimum of a bachelor's degree in computer science or equivalent.
Hortonworks (HDP)/Cloudera Hadoop (CDH), Ambari/Cloudera Manager, HDFS, YARN, MapReduce, Hive, Impala, Kudu, Sqoop, Spark, Kafka, HBase, Tika, Kerberos, Active Directory, Sentry, TLS/SSL, Linux/RHEL, Unix, Windows, SBT, Maven, Jenkins, Oracle, MS SQL Server, shell scripting, Eclipse IDE, Git, SVN
Experience working with a Hadoop distribution such as Cloudera or Hortonworks, with at least 6 years of coding experience in Apache Hadoop, Spark, Kafka, Hive, Pig, or Drill.
Experience with data lineage and data tagging following a data-driven security model.
Experience with NiFi, Spark Streaming, Elasticsearch, TensorFlow, and PyTorch.
Strong knowledge of relational databases (Oracle, SQL Server, Postgres) and expert-level SQL.
Experience with languages such as Python, Go and Java is required.
Experience with Agile, DevOps, and GitOps automation.
Proficient with cloud computing and virtualization technologies, storage architecture, and AWS/Azure services.
Knowledge of various Hadoop connectors.
Healthcare knowledge is an advantage
Must have experience with source control tools
Must have strong problem-solving and analytical skills
Must have the ability to identify complex problems and review related information to develop and evaluate options and implement solutions.
Kubernetes containerization experience.