Data Engineer with GCP
Macys Systems and Technology
Contract Duluth, Georgia, United States Posted 6 months ago
About Position
Data Engineer with GCP (Contract)
$65.00 / Hourly
Duluth, Georgia, United States
Skills
- Lead the design and implementation of scalable data architectures using GCP, GCP Dataflow, Airflow, and Hadoop
- Oversee the development and deployment of data pipelines to ensure efficient data processing and integration
- Provide technical expertise in optimizing data workflows and ensuring data quality and reliability
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Develop and maintain documentation for data architecture, processes, and best practices
- Ensure compliance with data governance and security policies across all data solutions
- Monitor and troubleshoot data pipeline performance, identifying and resolving issues promptly
- Conduct regular code reviews to ensure adherence to coding standards and best practices
- Mentor and guide junior team members, fostering a culture of continuous learning and improvement
- Stay updated with the latest industry trends and technologies to drive innovation within the team
- Participate in strategic planning and decision-making processes to align data initiatives with business goals
- Communicate effectively with stakeholders to provide updates on project progress and address any concerns
- Contribute to the development of data-driven strategies that enhance business performance and customer satisfaction

Description
We are seeking a Data Engineer with 10 to 12 years of experience to join our team as a Business Associate.
The ideal candidate will have extensive experience in GCP, GCP Dataflow, Airflow, and Hadoop.
This role involves designing and implementing robust data solutions that align with our business objectives and drive impactful results.
Required Skills: D365 Common Data Service, Dynamics AX, GCP Dataflow, Hadoop
Responsibilities
- Possess a strong background in GCP, GCP Dataflow, Airflow, and Hadoop, with hands-on experience
- Demonstrate excellent problem-solving skills and the ability to think critically and analytically
- Exhibit strong communication and collaboration skills to work effectively with diverse teams
- Show proficiency in designing and implementing data architectures that meet business needs
- Have a solid understanding of data governance and security best practices
- Display a proactive approach to learning and staying current with emerging technologies
By applying to a job through PingJob.com, you agree to comply with, and be subject to, the PingJob.com Terms and Conditions for use of the website.