AWS Snowflake Data Engineer
US Bank
Contract | Charlotte, North Carolina, United States | Posted 5 months ago
About Position
AWS Snowflake Data Engineer (Contract)
$75.00 / Hourly
Charlotte, North Carolina, United States
Skills
• 7+ years of experience in data engineering or related roles
• Extensive experience with Snowflake implementations and optimizations
• Experience with AWS S3 data lakes and mainframe data sources (VSAM, DB2, GDG, COBOL)
• Expertise in performance tuning and query optimization in Snowflake
• Experience with ETL transformations for COBOL data sources
• Proficiency with AWS Glue ETL, AWS Lake Formation, or any ETL tool compatible with Snowflake
Description
The Finance and Treasury Data Engineering team at US Bank builds pipelines that contextualize data and make it easily accessible across the enterprise. As a Data Engineer, you will design, build, and deploy data solutions that support business intelligence and insights, Machine Learning, and Artificial Intelligence.
Responsibilities
• Developing and managing high-volume data pipelines
• Designing data integration solutions
• Building data flows using AWS Glue ETL or similar tools (example below)
• Managing the data lake using the AWS Lake Formation service
• Building and managing data APIs in Python
• Providing expert guidance on Snowflake solutions
• Optimizing Snowflake environments for performance and cost efficiency
• Ensuring data security and compliance within Snowflake solutions
• Conducting training sessions on Snowflake best practices
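For context on the AWS Glue ETL work listed above, here is a minimal sketch of a Glue job in Python (PySpark). All database, table, and S3 bucket names are placeholders for illustration, not details from this posting.

```python
# Minimal AWS Glue ETL job sketch (runs inside the Glue job runtime).
# Database, table, and bucket names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a cataloged source table (hypothetical names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="finance_raw",        # placeholder Glue database
    table_name="gl_transactions",  # placeholder Glue table
)

# Rename columns as a simple ETL transformation.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("acct_id", "string", "account_id", "string"),
        ("txn_amt", "double", "amount", "double"),
        ("txn_dt", "string", "transaction_date", "string"),
    ],
)

# Land the curated output in an S3 data lake location as Parquet,
# from which Snowflake could ingest it (e.g., via an external stage and COPY).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-finance-lake/curated/gl_transactions/"},
    format="parquet",
)

job.commit()
```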
Additional Requirements
• Experience with Snowflake features like Snowpipe, Bulk Copy, Tasks, Streams, Stored Procedures, and UDFs
• Hands-on experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF)
• Experience with CI/CD deployment for AWS services and Snowflake solutions
• Experience with infrastructure-as-code (IaC) tools like Terraform
• Experience building and managing APIs using Python/PySpark integration with Snowflake (see the sketch after this list)
• Knowledge of Snowpark for advanced data processing and analytics
• Finance experience is a plus
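As a rough illustration of the Python/Snowpark integration and the Streams and Tasks features named above, here is a minimal sketch. All connection parameters and object names are placeholders, not details from this posting.

```python
# Minimal Snowpark sketch: connect to Snowflake from Python, run a
# DataFrame query, and set up a stream + scheduled task for incremental loads.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "<account_identifier>",  # placeholder
    "user": "<user>",                   # placeholder
    "password": "<password>",           # use a secrets manager in practice
    "warehouse": "ANALYTICS_WH",        # placeholder
    "database": "FINANCE_DB",           # placeholder
    "schema": "CURATED",                # placeholder
}
session = Session.builder.configs(connection_parameters).create()

# DataFrame-style query, pushed down to Snowflake for execution.
large_txns = (
    session.table("GL_TRANSACTIONS")
    .filter(col("AMOUNT") > 10_000)
    .select("ACCOUNT_ID", "AMOUNT", "TRANSACTION_DATE")
)
large_txns.write.save_as_table("LARGE_GL_TRANSACTIONS", mode="overwrite")

# Stream + task: capture changes on the source table and load them on a schedule.
session.sql(
    "CREATE STREAM IF NOT EXISTS GL_TRANSACTIONS_STREAM ON TABLE GL_TRANSACTIONS"
).collect()
session.sql(
    """
    CREATE TASK IF NOT EXISTS REFRESH_LARGE_GL_TRANSACTIONS
      WAREHOUSE = ANALYTICS_WH
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO LARGE_GL_TRANSACTIONS
      SELECT ACCOUNT_ID, AMOUNT, TRANSACTION_DATE
      FROM GL_TRANSACTIONS_STREAM
      WHERE AMOUNT > 10000
    """
).collect()
session.sql("ALTER TASK REFRESH_LARGE_GL_TRANSACTIONS RESUME").collect()

session.close()
```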