Data Architect
Hyatt Corp
Full Time | Chicago, Illinois, United States | Posted 1 year ago
About Position
Data Architect (Full Time)
Chicago, Illinois, United States
Description
As a Data Architect, you will lead the creation of the strategic enterprise data architecture for Hyatt. You will partner with internal stakeholders to define the principles, standards, and guidelines regarding data flows, data aggregation, data migration, data curation, data models, data consumption, and data placement. You will provide expertise on data architecture in critical programs, data strategy, and data quality remediation activities, and validate the data architecture for adherence to defined policies, standards, and guidelines, including regulatory directives.
Responsibilities
- You will be part of a ground-floor, hands-on, highly visible team that is positioned for growth and is highly collaborative and passionate about data.
- The ideal candidate builds fantastic relationships across all levels of the organization and is recognized as a problem solver who looks to elevate the work of everyone around them.
- Provide expert guidance to projects to ensure that their processes and deliverables align with the Hyatt target state architecture.
- Define and develop enterprise data architecture concepts and standards, leveraging leading architecture practices and advanced data technologies.
- Gather requirements with business stakeholders in the domain of agile team work, people data, and project portfolio hierarchies.
- Ability to write requirements for ETL and BI developers.
- Ability to write data architecture designs for data warehouse solutions, data lake solutions, or end-to-end pipelines.
- Expertise in data architecture principles and distributed computing.
- Prioritize intake, perform cost/benefit analysis, and decide what to pursue across a wide base of users and stakeholders and across products, databases, and services.
- Design or approve data models that provide a full view of what the Hyatt technology teams are working on and the business impact they are having.
- End-to-end data pipeline design, security review, architecture, and deployment overview.
- Automate reporting views used by management and executives to decide where to invest the organization’s time and resources and to stay up to date on key company initiatives and products.
- Create self-service reporting, including a data lake for Hyatt’s internal projects and resources.
- Design comprehensive data quality management tooling.
- 6+ years of experience in data engineering or related technical work, including business intelligence and analytics.
- 4+ years of experience in architecture for commercial-scale data pipelines.
- Experience and comfort solving problems in an ambiguous environment with constant change; the tenacity to thrive in a dynamic and fast-paced environment, inspire change, and collaborate with a variety of individuals and organizational partners.
- Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business.
- Exposure to Amazon Web Services (AWS) or another cloud provider.
- Experience with business intelligence tools such as Tableau, ThoughtSpot, Power BI, and/or Looker.
- Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.
- Passionate about programming and learning new technologies; focused on helping yourself and the team improve skills.
- Effective problem-solving and analytical skills; ability to manage multiple projects and report simultaneously across different stakeholders.
- Rigorous attention to detail and accuracy.
- Aware of and motivated by driving business value.
- Experience with large-scale enterprise applications using open-source big data solutions such as Spark, Kafka, Elasticsearch/Solr, Hadoop, and HBase.
- Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata).
- Bachelor’s degree in Engineering, Computer Science, Statistics, Economics, Mathematics, Finance, or a related quantitative field.
- An advanced CS degree is a plus.