Certified Cloud Engineer
State of Wisconsin
Contract · Madison, Wisconsin, United States · Posted 6 months ago
About Position
Certified Cloud Engineer (Contract)
Madison, Wisconsin, United States
Skills
- Collaborate with data engineering, business analysts, and development teams to design, develop, test, and maintain robust and scalable data pipelines from Workday to AWS Redshift.
- Architect, implement, and manage end-to-end data pipelines, ensuring data accuracy, reliability, quality, performance, and timeliness.
- Provide expertise in Redshift database optimization, performance tuning, and query optimization.
- Assist with the design and implementation of workflows using Airflow.
- Perform data profiling and analysis to troubleshoot data-related issues and build solutions to address them.
- Proactively identify opportunities to automate tasks and develop reusable frameworks.
- Work closely with the version control team to maintain a well-organized and documented repository of code, scripts, and configurations using Git/Bitbucket.
- Provide technical guidance and mentorship to fellow developers, sharing best practices, tips, and techniques for optimizing Redshift-based data.

Description
The Universities of Wisconsin Enterprise Analytics Platform (EAP) team develops and supports the data lake for the UW System Office and member universities. We are seeking an experienced AWS Redshift Data Engineer to assist our team in designing, developing, and optimizing data pipelines for our AWS Redshift-based data lakehouse. Priority needs are CloudFormation and event-based data processing using SQS to support the ingestion and movement of data from Workday to AWS Redshift for consumer and analytic use.
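In outline, the event-based ingestion described above might look like the following Lambda-style handler: SQS delivers S3 ObjectCreated notifications for new Workday extracts, and the handler turns each one into a Redshift COPY statement. This is a minimal sketch; the staging table, IAM role ARN, message shape, and file format are illustrative assumptions, not details from the posting.

```python
import json


def build_copy_sql(bucket: str, key: str) -> str:
    # Hypothetical target table, IAM role, and format; real values
    # would come from configuration, not be hard-coded like this.
    return (
        "COPY staging.workday_events "
        f"FROM 's3://{bucket}/{key}' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy' "
        "FORMAT AS PARQUET;"
    )


def handler(event, context=None):
    """Process an SQS batch whose message bodies wrap S3 event notifications.

    Returns the COPY statements that would be submitted; in a real
    pipeline these would be executed via the Redshift Data API.
    """
    statements = []
    for record in event.get("Records", []):          # one per SQS message
        body = json.loads(record["body"])            # S3 notification JSON
        for s3_record in body.get("Records", []):    # one per created object
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            statements.append(build_copy_sql(bucket, key))
    return statements
```

Keeping the SQL construction separate from the event parsing makes the handler easy to unit-test without any AWS services available.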
Responsibilities
- Advanced hands-on experience designing AWS data lake solutions.
- Experience integrating Redshift with other AWS services, such as DMS, Glue, Lambda, S3, Athena, and Airflow.
- Proficiency in Python programming with a focus on developing efficient Airflow DAGs and operators.
- Experience with PySpark and Glue ETL scripting, including functions such as relationalize, performing joins, and transforming DataFrames with PySpark code.
- Competency developing CloudFormation templates to deploy AWS infrastructure, including YAML-defined IAM policies and roles.
- Experience with Airflow DAG creation.
- Familiarity with debugging serverless applications using AWS tooling such as CloudWatch Logs and Logs Insights, CloudTrail, and IAM.
- Ability to work in a highly complex, object-oriented Python platform.
- Strong understanding of ETL best practices, data integration, data modeling, and data transformation.
- Proficiency in identifying and resolving performance bottlenecks and fine-tuning Redshift queries.
- Familiarity with version control systems, particularly Git, for maintaining a structured code repository.
- Strong coding and problem-solving skills, and attention to detail in data quality and accuracy.
- Ability to work collaboratively in a fast-paced, agile environment and effectively communicate technical concepts to non-technical stakeholders.