Data Engineer
Nike Inc
Contract Beaverton, Oregon, United States Posted 1 month ago
About Position
Data Engineer (Contract)
$75.00 / Hourly
Beaverton, Oregon, United States
Description
Design and implement data products and features in collaboration with product owners, data analysts, and business partners.
Work with a variety of teammates to build first-class solutions for the client's technology organization and its business partners, working on development projects related to supply chain, commerce, consumer behavior, and web analytics, among others.
Contribute to overall architecture, frameworks and patterns for processing and storing large data volumes. Research, evaluate and utilize new technologies/tools/frameworks centered around high-volume data processing.
Evaluate technical feasibility and risks and convey that information to the team.
Translate backlog items into engineering design and logical units of work. Profile and analyze data for the purpose of designing scalable solutions.
Define and apply appropriate data acquisition and consumption strategies for given technical scenarios.
Design and implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem.
Build utilities, user defined functions, libraries, and frameworks to better enable data flow patterns. Implement complex automated routines using workflow orchestration tools. Work with architecture, engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and adhered to.
Anticipate, identify and solve issues concerning data management to improve data quality. Build and incorporate automated unit tests and participate in integration testing efforts. Utilize and advance continuous integration and deployment frameworks.
Troubleshoot data issues and perform root cause analysis.
Skills
- Databricks
- Snowflake
- SQL
- EMR
- Spark
- DynamoDB
- Data Pipelines
- Python programming
- Airflow
- AWS (S3, SQS, Lambda, Athena, OpenSearch, Glue Data Catalog, CloudWatch)
- RDS
- Logging (Splunk, Slack)
- Hive Metastore
By applying to a job using PingJob.com, you agree to comply with and be subject to the PingJob.com Terms and Conditions for use of our website. To use our website, you must agree with the Terms and Conditions and comply with their provisions.