Snowflake Data Engineer

Northern Trust Corporation

Contract | Chicago, Illinois, United States | Posted 2 months ago


About Position

Snowflake Data Engineer (Contract)

$75.00 / Hourly

Chicago, Illinois, United States

Skills
  • Data Pipeline Development: Design, build, and manage data pipelines for the ETL process, using Airflow for orchestration and Python for scripting, to transform raw data into a format suitable for our new Snowflake data model.
  • Data Integration: Implement and maintain data synchronization between on-premises Oracle databases and Snowflake using CDC tools.
  • Data Modeling Support: Assist in developing and optimizing the data model for Snowflake, ensuring it supports our analytics and reporting requirements.
  • Reporting Support: Collaborate with the data architect to ensure the data within Snowflake is structured in a way that supports efficient and insightful reporting.
  • Technical Documentation: Create and maintain comprehensive documentation of data pipelines, ETL processes, and data models to ensure best practices are followed and knowledge is shared within the team.
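As a rough illustration of the pipeline work described above, the sketch below shows a minimal Python transform step of the kind an Airflow task might call: it reshapes raw Oracle-extract rows into a cleaner form for loading into a Snowflake table. The column names and formats are hypothetical, not taken from this posting.

```python
from datetime import datetime

def transform_rows(raw_rows):
    """Reshape raw Oracle-extract rows for loading into a (hypothetical)
    Snowflake table: lower-case column names, ISO dates, numeric amounts.
    """
    out = []
    for row in raw_rows:
        out.append({
            # Trim stray whitespace from identifiers
            "account_id": str(row["ACCOUNT_ID"]).strip(),
            # Convert Oracle-style DD-Mon-YYYY dates to ISO 8601
            "trade_date": datetime.strptime(row["TRADE_DATE"], "%d-%b-%Y").date().isoformat(),
            # Coerce string amounts to floats for downstream analytics
            "amount": float(row["AMOUNT"]),
        })
    return out
```

In a real pipeline this function would run between an Oracle extract task and a Snowflake load task (e.g. a COPY or MERGE), with Airflow handling scheduling and retries.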
Description

This project aims to modernize our data architecture by leveraging cloud technologies, specifically Snowflake and Databricks, to enhance our data storage, processing, and analytics capabilities. It involves migrating application data from an on-premises Oracle database to a more scalable, flexible and secure cloud-based environment, optimizing data flows and analytics to support real-time decision-making and insights.
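The CDC-based synchronization this migration relies on can be sketched, under simplified assumptions, as applying a stream of change events (insert/update/delete keyed by primary key) to an image of the target table; production CDC tools and a Snowflake MERGE do the equivalent at scale. The event shape below is hypothetical.

```python
def apply_cdc_events(target, events):
    """Apply CDC events to an in-memory image of a target table.

    `target` maps primary key -> row dict. Each event is a dict with
    "op" ("I" insert, "U" update, "D" delete), "key", and "row".
    This mirrors, in miniature, what a MERGE into Snowflake does
    with a change feed from the source Oracle database.
    """
    for ev in events:
        key = ev["key"]
        if ev["op"] in ("I", "U"):
            # Inserts and updates both upsert the latest row image
            target[key] = ev["row"]
        elif ev["op"] == "D":
            # Deletes remove the key if present
            target.pop(key, None)
    return target
```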

Requirements
  • Experience: 10+ years of total experience as a data engineer, including 5+ years of proven experience as a Snowflake data engineer.
  • Skills: Strong knowledge of SQL, including writing complex queries and performance tuning; strong experience with Oracle, Snowflake, and ETL/ELT tools.
  • Data Engineering: Proven track record of developing and maintaining data pipelines and data integration projects.
  • Orchestration Tools: Experience with Airflow for managing data pipeline workflows.
  • Programming: Proficiency in Python and SQL for data processing tasks.
  • Data Modeling: Understanding of data modeling principles and experience with data warehousing solutions.
  • Cloud Platforms: Knowledge of cloud infrastructure and services, preferably Azure, as it relates to Snowflake integration.
  • Collaboration Tools: Experience with version control systems (like Git) and collaboration platforms.
  • CI/CD Implementation: Utilize CI/CD tools to automate the deployment of data pipelines and infrastructure changes, ensuring high-quality data processing with minimal manual intervention.
  • Communication: Excellent communication and teamwork skills, with a detail-oriented mindset. Strong analytical skills, with the ability to work independently and solve complex problems.

Northern Trust Corporation Vendors

  • TEK Systems, 971 Corporate Boulevard, Linthicum, Maryland (www.teksystems.com)
  • Wipro Technologies, 1300 Crittenden Lane, Mountain View, California (www.wipro.com)
  • Tata Consultancy Services, 101 Park Avenue 26th Floor, New York, New York (www.usa-tcs.com)

Job Summary

$75.00 / Hourly

Contract

Chicago, Illinois, United States

Experience Level : Medium

Experience Required : 8 Years

Posted : 2 months ago

Deadline : September 23, 2024

Job ID : Job0000002657

Northern Trust Corporation

50 South La Salle Street

(312) 630-6000

www.northerntrust.com