Snowflake Data Engineer
Northern Trust Corporation
Contract | Chicago, Illinois, United States | Posted 2 months ago
About the Position
Snowflake Data Engineer (Contract)
$75.00 / Hourly
Chicago, Illinois, United States
Skills
- Data Pipeline Development: Design, build, and manage data pipelines for the ETL process, using Airflow for orchestration and Python for scripting, to transform raw data into a format suited to our new Snowflake data model (see the sketch after this list).
- Data Integration: Implement and maintain data synchronization between on-premises Oracle databases and Snowflake using CDC tools.
- Data Modeling Support: Assist in developing and optimizing the Snowflake data model, ensuring it supports our analytics and reporting requirements.
- Reporting Support: Collaborate with the data architect to ensure the data within Snowflake is structured to support efficient and insightful reporting.
- Technical Documentation: Create and maintain comprehensive documentation of data pipelines, ETL processes, and data models so that best practices are followed and knowledge is shared within the team.
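As a hedged illustration of the first item, here is a minimal sketch of an Airflow DAG that orchestrates a Python-scripted extract/transform/load step into Snowflake. It assumes Apache Airflow 2.x and the snowflake-connector-python package; the DAG id, table names, and connection parameters are hypothetical and not taken from this posting.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def oracle_to_snowflake_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder: in practice this would pull changed rows from the
        # on-premises Oracle source (e.g., via a CDC feed or a query).
        return [{"id": 1, "amount": "19.99"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Python scripting step: cast raw values into the types the
        # Snowflake data model expects.
        return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        import snowflake.connector

        # Hypothetical connection parameters; real credentials would come
        # from an Airflow connection or a secrets backend.
        with snowflake.connector.connect(
            account="my_account", user="etl_user", password="***",
            warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
        ) as conn:
            conn.cursor().executemany(
                "INSERT INTO stg_orders (id, amount) VALUES (%(id)s, %(amount)s)",
                rows,
            )

    load(transform(extract()))


oracle_to_snowflake_etl()
```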
Description
This project aims to modernize our data architecture by leveraging cloud technologies, specifically Snowflake and Databricks, to enhance our data storage, processing, and analytics capabilities. It involves migrating application data from an on-premises Oracle database to a more scalable, flexible, and secure cloud-based environment, optimizing data flows and analytics to support real-time decision-making and insights.
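The Oracle-to-Snowflake synchronization described above is commonly implemented by landing change records in a staging table and applying them with a MERGE. Below is a minimal sketch of that apply step, assuming snowflake-connector-python; the schema, table, and column names (analytics.orders, staging.orders_changes, the op flag) are hypothetical.

```python
import snowflake.connector

# Hypothetical MERGE applying one batch of CDC rows (op = 'I'/'U'/'D')
# captured from Oracle and landed in a staging table.
MERGE_SQL = """
MERGE INTO analytics.orders AS t
USING staging.orders_changes AS s
  ON t.order_id = s.order_id
WHEN MATCHED AND s.op = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
WHEN NOT MATCHED AND s.op <> 'D' THEN
  INSERT (order_id, amount, updated_at)
  VALUES (s.order_id, s.amount, s.updated_at)
"""

def apply_change_batch(conn: snowflake.connector.SnowflakeConnection) -> int:
    # Returns the number of target rows affected by this batch.
    cur = conn.cursor()
    cur.execute(MERGE_SQL)
    return cur.rowcount
```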
Requirements
- Experience: 10+ years of total experience as a data engineer, including 5+ years of proven experience as a Snowflake data engineer.
- Skills: Strong SQL knowledge, including writing complex queries and performance tuning; strong experience with Oracle, Snowflake, and ETL/ELT tools.
- Data Engineering: Proven track record of developing and maintaining data pipelines and data integration projects.
- Orchestration Tools: Experience with Airflow for managing data pipeline workflows.
- Programming: Proficiency in Python and SQL for data processing tasks.
- Data Modeling: Understanding of data modeling principles and experience with data warehousing solutions.
- Cloud Platforms: Knowledge of cloud infrastructure and services, preferably Azure, as it relates to Snowflake integration.
- Collaboration Tools: Experience with version control systems (like Git) and collaboration platforms.
- CI/CD Implementation: Utilize CI/CD tools to automate the deployment of data pipelines and infrastructure changes, ensuring high-quality data processing with minimal manual intervention (a CI test sketch follows this list).
- Communication: Excellent communication and teamwork skills, with a detail-oriented mindset. Strong analytical skills, with the ability to work independently and solve complex problems.
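As a hedged example of the CI/CD item above, a common pattern is a DAG integrity test that runs on every commit and fails the build if any pipeline file cannot be imported. This is a minimal sketch assuming pytest and Airflow 2.x; the dags/ folder path and DAG id are hypothetical.

```python
import pytest
from airflow.models import DagBag


@pytest.fixture(scope="session")
def dag_bag() -> DagBag:
    # Parse every DAG file in the repo's dags/ folder (hypothetical path).
    return DagBag(dag_folder="dags/", include_examples=False)


def test_dags_import_cleanly(dag_bag: DagBag):
    # Any syntax or import error in a pipeline fails the build.
    assert dag_bag.import_errors == {}, f"Import errors: {dag_bag.import_errors}"


def test_etl_dag_is_registered(dag_bag: DagBag):
    # Matches the hypothetical DAG id sketched under Skills.
    assert "oracle_to_snowflake_etl" in dag_bag.dags
```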