Cloud Data Engineer
Fannie Mae
Contract · Washington, District of Columbia, United States · Posted 4 years ago
About Position
Cloud Data Engineer (Contract)
$55.00 / Hourly
Washington, District of Columbia, United States
Description
Fannie Mae provides reliable, large-scale access to affordable mortgage credit in communities across our nation. We are the leading source of funding for housing in America, which means more people can buy or rent a home. We are focused on sustaining the housing recovery, improving our company, and leading change to make housing better.
Join our diverse, high-performing team and make a difference as we work together to enable access to a good home.
For more information about Fannie Mae, visit http://www.fanniemae.com/progress.
JOB INFORMATION
Develop, modify, or update applications used by the Single Family Analytics Group. Create ETL processes to retrieve, analyze, extract, and transform data, making it available for data mining, analytical processing, research, and decision support through Business Intelligence tools.
KEY JOB FUNCTIONS
Support migration of existing on-premises Analytics processes to AWS and/or implementation of new processes in AWS.
Support the data integration process to the new Enterprise Data Lake (AWS) and/or Enterprise data platforms that will replace the current Oracle data warehouses.
Conduct proofs of concept of new technologies on platforms such as AWS/Cloud to help assess and evaluate new tools for implementation within the company and in support of the Analytics Group.
Build and maintain ETLs (Informatica, Python, and SAS) and other data solutions to support business needs.
Build and maintain high-performance database solutions (e.g., Oracle, Netezza, AWS PostgreSQL) to support the performance of Business Intelligence tools (Tableau).
Follow data governance and industry best practices.
QUALIFICATIONS
EDUCATION
Bachelor's Degree or equivalent required
MINIMUM EXPERIENCE
2+ years of related experience
SPECIALIZED KNOWLEDGE & SKILLS
Required: Knowledge of Oracle and SQL
Must have hands-on experience with Hive, Spark, Python, UNIX scripting, and SQL
Experience with the AWS cloud and the big data ecosystem and its tools
Proficient technical aptitude in at least one of the following programming languages: SQL, R, C++, C#, Python, Java, Scala, or Spark
Ability to learn new technologies/tools
Knowledge of relational databases and/or data structures in support of data warehouses
Preferred: AWS certification desired
Nice to have: DynamoDB, Redshift, Snowflake
Informatica / Talend / Alteryx: able to develop and schedule ETLs
Netezza: knowledge of Netezza to be able to create and design
SAS, R, Linux, Unix: basic knowledge of SAS, Unix, and Linux
Basic knowledge of Tableau
Autosys
JIRA