Data Architect
Brown Brothers Harriman & Co
Contract Boston, Massachusetts, United States Posted 6 months ago
About the Position
Data Architect (Contract)
$75.00 / Hourly
Boston, Massachusetts, United States
Skills
- Participate in strategic planning and contribute to the organization's data strategy and roadmap.
- Develop a complete understanding of the current DW systems and the user communities' data needs and requirements.
- Define the legacy data warehouse migration strategy; understand the existing target platform and data management environment.
- Facilitate the establishment of a secure data platform on on-prem Cloudera infrastructure.
- Document and develop ETL logic and data flows to facilitate easy usage of data assets, both batch and real-time streaming.
- Migrate, operationalize, and support the platform.
- Manage and provide technical guidance and support to the development team, ensuring best practices and standards are followed.
Description
Seeking a Sr. Data Architect with experience on modern data platforms capable of supporting big data, relational/non-relational databases, data warehousing, analytics, machine learning, and data lakes. Key responsibilities include developing and migrating off legacy Oracle Data Warehouses to a new data platform as the foundation for a key set of offerings running on Oracle Exadata and Cloudera's distribution technology. The Sr. Data Architect will continue to support, develop, and drive the data roadmap supporting our systems and business lines.
Requirements
- 10+ years of experience in IT, primarily in hands-on development
- Strong knowledge of architectural principles, frameworks, design patterns, and industry best practices for design and development
- 6+ years of real-world data warehouse project experience
- Strong hands-on experience with Snowflake
- Strong hands-on experience with Spark
- Strong hands-on experience with Kafka
- Experience with performance tuning of SQL queries and Spark
- Experience designing efficient and robust ETL/ELT workflows and schedulers
- Experience working with Git, Jira, and Agile methodologies
- Experience with end-to-end development life-cycle support and SDLC processes
- Strong written and verbal communication skills
- Strong analytical and problem-solving skills
- Self-driven; able to work in teams and independently as required
- Working experience with Snowflake and AWS/Azure/GCP
- Working experience in the financial industry is a plus