Data Engineer
Walmart.Com USA LLC
Contract Sunnyvale, California, United States Posted 7 months ago
About Position
Data Engineer (Contract)
$75.00 / Hourly
Sunnyvale, California, United States
Skills
• Spark – 8+ years of experience
• Scala – 8+ years of experience
• GCP – 5+ years of experience
• Hive – 8+ years of experience
• SQL – 8+ years of experience
• ETL Process / Data Pipeline – 8+ years of experience
Description
• Design and develop big data applications using the latest open source technologies.
• Experience working in an offshore delivery model with managed outcomes is desired.
• Develop logical and physical data models for big data platforms.
• Automate workflows using Apache Airflow (see the illustrative sketch after this list).
• Create data pipelines using Apache Hive, Apache Spark, Scala, Apache Kafka.
• Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
• Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.
• Mentor junior engineers on the team.
• Lead daily stand-ups and design reviews.
• Groom and prioritize the backlog using JIRA.
• Act as the point of contact for your assigned business domain.
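As a rough illustration of how the Airflow and Spark duties above might fit together, a minimal sketch follows, assuming Airflow 2.4+ with the apache-airflow-providers-apache-spark package installed; the DAG id, jar path, main class, and connection id are placeholder assumptions rather than details from this posting:

# Illustrative sketch only: a daily Airflow DAG that submits a pre-built Spark ETL jar.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_sales_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # assumes the Airflow 2.4+ "schedule" argument
    catchup=False,
) as dag:
    # Submit a Spark/Scala job; the artifact path and main class are placeholders.
    run_spark_etl = SparkSubmitOperator(
        task_id="run_spark_etl",
        application="/opt/jobs/sales-etl-assembly.jar",
        java_class="com.example.etl.SalesJob",
        conn_id="spark_default",
        conf={"spark.sql.shuffle.partitions": "200"},
    )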
Requirements
• 8+ years of hands-on experience developing data warehouse solutions and data products.
• 4+ years of hands-on experience developing on a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.
• 4+ years of experience with GCP, GCS, Dataproc, and BigQuery.
• 2+ years of hands-on experience in modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
• Experience with programming languages: Python, Java, Scala, etc.
• Experience with scripting languages: Perl, Shell, etc.
• Practical experience working with, processing, and managing large data sets (multi-TB/PB scale).
• Exposure to test-driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Capable of delivering on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's degree in Computer Science or equivalent experience.
Additional Requirements
- Required skillset: PySpark, Azure Databricks, Azure Data Factory, Python, Structured Streaming, basics of Azure, knowledge of data lake and data warehouse concepts, and knowledge of SQL (a brief illustrative sketch follows this list).
- Minimum 3+ years of experience in the relevant area.
- Fully remote is not an option; the resource should be in the office 2 days per week.
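For illustration only, a minimal PySpark Structured Streaming sketch in the spirit of the required skillset, assuming the spark-sql-kafka connector is on the classpath; the broker, topic, event schema, and storage paths are placeholder assumptions, not details from this posting:

# Illustrative sketch only: read events from Kafka and append them to a data lake path.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

# Hypothetical event schema.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("updated_at", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                      # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/mnt/datalake/orders")             # placeholder output path
    .option("checkpointLocation", "/mnt/checkpoints/orders")
    .outputMode("append")
    .start()
)
query.awaitTermination()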