Data Engineer
Delta Dental Plan Of Michigan
Contract · Okemos, Michigan, United States · Posted 6 months ago
About Position
Data Engineer (Contract)
$80.00 / Hourly
Okemos, Michigan, United States
Responsibilities
· Participates in the analysis and development of technical specifications, programming, and testing of Data Engineering components.
· Participates in creating data pipelines and ETL workflows, ensuring that design and enterprise programming standards and guidelines are followed; assists with updating the enterprise standards when gaps are identified.
· Follows technology best practices and standards and escalates issues as appropriate. Follows architecture and design best practices (as guided by the Lead Data Engineer, BI Architect, and Architecture team).
· Responsible for assisting in configuration and scripting to implement fully automated data pipelines, stored procedures and functions, and ETL workflows that move data from on-premises Oracle databases to Snowflake, where it can be consumed by our end customers (see the sketch after this list).
· Follows standard change control and configuration management practices.
· Participates in a 24-hour on-call rotation in support of the platform.
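As a rough illustration of that automation work, here is a minimal Snowflake SQL sketch of a stored procedure wrapped in a scheduled task. All object names (raw.member_extract, analytics.member, etl_wh) are hypothetical, and it assumes the replication tooling has already landed Oracle rows in a staging table.

```sql
-- Hypothetical objects: raw.member_extract is assumed to be populated by the
-- replication tooling; analytics.member is the table end customers consume.
CREATE OR REPLACE PROCEDURE analytics.load_member()
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
  -- Append rows that arrived since the last load (simple incremental ETL step)
  INSERT INTO analytics.member (member_id, first_name, last_name, updated_at)
    SELECT member_id, first_name, last_name, updated_at
    FROM raw.member_extract
    WHERE updated_at > (SELECT COALESCE(MAX(updated_at), '1900-01-01'::TIMESTAMP_NTZ)
                        FROM analytics.member);
  RETURN 'load complete';
END;
$$;

-- Schedule the procedure so the pipeline runs without manual intervention
CREATE OR REPLACE TASK analytics.load_member_task
  WAREHOUSE = etl_wh          -- hypothetical warehouse
  SCHEDULE = '60 MINUTE'
AS
  CALL analytics.load_member();

ALTER TASK analytics.load_member_task RESUME;  -- tasks are created suspended
```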
Description
· We are looking for a Data Engineer to join our Data Engineering Team. The ideal candidate has a minimum of 3 years of experience, along with excellent analytical reasoning and critical-thinking skills.
· The candidate will be part of a team that builds data pipelines using change data capture (CDC) mechanisms to move data to a cloud provider and then transforms it so that customers can consume it (a minimal sketch of this pattern appears below).
· The Data Engineering Team also does general extraction, transformation, and load (ETL) work, along with traditional Enterprise Data Warehousing (EDW) work.
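A minimal sketch of that CDC pattern using Snowflake Streams and Tasks, assuming the replication tool lands change rows in a staging table. The table, stream, task, and warehouse names are all hypothetical.

```sql
-- Hypothetical objects: raw.claims_staging receives CDC rows from the
-- replication tool; analytics.claims is the table customers query.
CREATE OR REPLACE STREAM raw.claims_changes ON TABLE raw.claims_staging;

CREATE OR REPLACE TASK raw.apply_claims_changes
  WAREHOUSE = etl_wh                                 -- hypothetical warehouse
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw.claims_changes')  -- skip runs with no new changes
AS
MERGE INTO analytics.claims AS t
USING (
  -- A true update appears in the stream as a DELETE + INSERT pair, so keep the
  -- INSERT side of updates and only the DELETEs that are real deletes.
  SELECT claim_id, status, amount, METADATA$ACTION AS action
  FROM raw.claims_changes
  WHERE METADATA$ACTION = 'INSERT'
     OR (METADATA$ACTION = 'DELETE' AND METADATA$ISUPDATE = FALSE)
) AS s
ON t.claim_id = s.claim_id
WHEN MATCHED AND s.action = 'DELETE' THEN DELETE
WHEN MATCHED AND s.action = 'INSERT' THEN
  UPDATE SET t.status = s.status, t.amount = s.amount
WHEN NOT MATCHED AND s.action = 'INSERT' THEN
  INSERT (claim_id, status, amount) VALUES (s.claim_id, s.status, s.amount);

ALTER TASK raw.apply_claims_changes RESUME;  -- tasks are created suspended
```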
Skills
· Database Platforms: Snowflake, Oracle, and SQL Server
· OS Platforms: Linux and Windows Server
· Languages and Tools: PL/SQL, Python, T-SQL, StreamSets, Snowflake Streams and Tasks, Informatica PowerCenter, and DBeaver
· Drive and desire to automate repeatable processes.
· Excellent interpersonal and communication skills, and a willingness to collaborate with teams across the organization.
Experience Requirements
· Experience loading data from files in Snowflake file stages into existing tables (see the sketch after this list).
· Experience creating and working with near-real-time data pipelines between relational sources and destinations.
· Experience working with StreamSets Data Collector or similar data streaming/pipelining tools (e.g., Fivetran, Striim, Airbyte).
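For the stage-loading requirement, a minimal COPY INTO sketch is shown below. The stage, table, and file format names are hypothetical, and the delimiter and header settings would depend on the actual extract files.

```sql
-- Hypothetical stage, table, and format names; adjust to the real extracts.
CREATE FILE FORMAT IF NOT EXISTS raw.pipe_delimited
  TYPE = CSV
  FIELD_DELIMITER = '|'
  SKIP_HEADER = 1
  NULL_IF = ('');

COPY INTO raw.provider                  -- existing target table
  FROM @raw.inbound_stage/provider/     -- named stage where extract files land
  FILE_FORMAT = (FORMAT_NAME = 'raw.pipe_delimited')
  PATTERN = '.*provider_.*[.]csv'       -- only pick up provider extract files
  ON_ERROR = 'ABORT_STATEMENT';         -- fail the load instead of skipping bad rows
```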