Data Engineer
Komodo Health, Inc
Contract · San Francisco, California, United States · Posted 3 years ago
About the Position
Data Engineer (Contract)
$65.00 / hour
San Francisco, California, United States
Skills
data engineer, hive, spark, snowflake, sql, python, airflow, etl development
Job Description
The Life Sciences Products team is looking for Senior Data Engineers to help us build next-generation data pipelines, data infrastructure, and databases, and to develop data processing that is scalable, reliable, and automated. The ideal candidate has demonstrated experience taking work from business use cases and whiteboard designs all the way through the nuances of implementation and rollout, and has shipped products through different phases of growth and maturity.
This is an opportunity to help solve complex challenges as part of a team accomplished in diverse engineering disciplines, focused on applying the best of current technology and skills to real-world problems in the healthcare and life sciences space. Some of the tools we use are: Spark, Snowflake, Airflow, Python, AWS EMR, Kubernetes, and Docker.
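To give candidates a concrete feel for this stack, below is a minimal, hypothetical sketch of the shape such pipelines often take: an Airflow DAG with an extract step feeding a load step. The DAG name, schedule, and task bodies are illustrative placeholders, not Komodo Health code.

```python
# Hypothetical sketch of an Airflow DAG; all names and schedules are
# illustrative placeholders, not actual Komodo Health pipelines.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_claims():
    # Placeholder extract step; a real task might pull raw data from S3.
    print("extracting raw claims data")


def load_warehouse():
    # Placeholder load step; a real task might write to Snowflake.
    print("loading transformed data into the warehouse")


with DAG(
    dag_id="claims_pipeline",        # hypothetical pipeline name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_claims)
    load = PythonOperator(task_id="load", python_callable=load_warehouse)
    extract >> load                  # extract runs before load
```

In practice, a DAG like this would more likely orchestrate Spark jobs on AWS EMR than plain Python callables, with the same dependency structure.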
Looking back on your first 12 months at Komodo Health, you will have…
Designed, developed, and implemented data infrastructure, data pipelines and data processing code for product features.
Analyzed, reviewed, and translated product management artifacts, such as product requirements documents and product roadmaps, into actionable engineering requirements documents.
Created automation systems and tools to configure, monitor, and orchestrate our data infrastructure and our data pipelines.
Evaluated new technologies for continuous improvements in Data Engineering.
Collaborated closely with the Product and Customer Success teams to build out new data features.
Collaborated with Data Scientists to implement descriptive, forecasting, and predictive algorithms and models using the latest technologies.
What you bring to Komodo:
- Building and deploying large-scale, complex data processing pipelines.
- Python or Scala development; proficiency in at least one is required
- Pipeline scheduling and monitoring systems such as Airflow or Luigi
- Data processing platforms such as Spark, Amazon EMR, Google Dataproc, and Hadoop/MapReduce
- Data warehousing, data modeling, and SQL experience with platforms such as Snowflake, Redshift, or BigQuery (see the sketch after this list)
- You've led planning, launching, optimizing, and refactoring phases of data pipeline platforms.
- Ability to work as part of a collaborative team in a fast-paced environment.
- Sincere interest in working at a healthcare startup and passion for healthcare data.
Additional Skills & Qualifications
- Ascend experience is a bonus
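For the data-warehouse bullet above, here is a minimal hypothetical sketch using the official Snowflake Python connector; the connection parameters, table, and query are placeholders for illustration only.

```python
# Hypothetical sketch using the Snowflake Python connector; credentials,
# table, and query are placeholders, not a real schema.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder; use your account identifier
    user="my_user",             # placeholder
    password="...",             # placeholder; prefer a secrets manager
    warehouse="ANALYTICS_WH",   # placeholder
    database="HEALTHCARE",      # placeholder
)
try:
    cur = conn.cursor()
    # The kind of aggregate a warehouse-backed product feature might run.
    cur.execute(
        "SELECT provider_id, COUNT(*) AS claim_count "
        "FROM claims GROUP BY provider_id "
        "ORDER BY claim_count DESC LIMIT 10"
    )
    for provider_id, claim_count in cur:
        print(provider_id, claim_count)
finally:
    conn.close()
```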
Employee Value Proposition (EVP)
- What’s exciting: an opportunity to build something from scratch and hit the ground running, lean, sustainable, and unencumbered by legacy systems or existing infrastructure
- Complex business logic
- Touches on every aspect of healthcare data (25 data ‘sources’)
- A common challenge for ETL developers is inheriting the flaws of an existing system; that won’t be an issue here
- The company’s mission is to bring together all healthcare information and stitch it into a unified whole