Data Engineer
News Corp
Contract New York, New York, United States Posted 3 years ago
About Position
Data Engineer (Contract)
$85.00 / Hourly
New York, New York, United States
Skills
- Be accountable for planning, designing, developing, and implementing applications that provide services to the global organization
- Good experience with and understanding of data models for the data warehouse and BI data marts
- Technical support of data warehouse and BI tools & infrastructure
- Identification, analysis & resolution of production & development bugs
- Support the release process, including completing & reviewing documentation
- Configure data mappings & transformations to orchestrate data integration & validation
- Design, build & test visualizations (Google Data Studio, Tableau) and ETL processes using GCP Composer, Airflow & SQL, or any ETL tool, for the corporate data warehouse on Google BigQuery or other databases
- Develop domain knowledge and become a subject matter expert in key business verticals, e.g. Media & Publications
- Document solutions, tools & processes, and create/support test plans with hands-on testing
- Peer review work developed by other data engineers within the team
- Establish good working relationships and communication channels with relevant departments & senior stakeholders
- A passion for all things automation and a stickler's eye for efficiency
Description
The Senior Data Engineer will be responsible for writing SQL, Python and PySpark scripts used in API calls to pull data from multiple disparate systems and databases. Source data may include analytics system data (Google Analytics, Adobe Analytics), third-party systems, CSV files, RSS feeds, etc. This individual will also help clean the data so it is in a readily accessible format for the BI systems. The Senior Data Engineer will contribute expertise, embrace emerging trends and provide overall guidance on best practices across all of News Corp's business and technology groups. The position requires the ability to multitask and work independently, as well as collaboratively with teams, some of which may be geographically distributed.
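The pull-and-clean workflow described above can be sketched minimally in Python. This is an illustrative assumption, not code from the posting: the field names (`date`, `source`, `page_views`) and the sample payload are hypothetical stand-ins for whatever the analytics APIs actually return.

```python
import json

# Hypothetical sketch: normalize raw analytics records (as if pulled from
# an API) into a clean, BI-friendly shape. Field names are illustrative.
def clean_records(raw_json: str) -> list[dict]:
    """Parse an API response and coerce each record's fields."""
    rows = []
    for rec in json.loads(raw_json):
        rows.append({
            # Trim stray whitespace from dates.
            "date": rec.get("date", "").strip(),
            # Normalize the source label to lowercase.
            "source": rec.get("source", "unknown").lower(),
            # Missing or string-typed counts become integers, defaulting to 0.
            "page_views": int(rec.get("page_views") or 0),
        })
    return rows

sample = '[{"date": " 2024-01-01 ", "source": "WEB", "page_views": "42"}]'
print(clean_records(sample))
# → [{'date': '2024-01-01', 'source': 'web', 'page_views': 42}]
```

Keeping the cleaning step pure (string in, list of dicts out) makes it easy to unit-test before the rows ever reach the warehouse.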
Responsibilities
- Building ETL data pipelines using SQL and Python for API calls to get data.
- Expertise in Google BigQuery, PySpark, SQL and relational databases (PostgreSQL, MySQL); scripting experience with Shell & Python.
- Experience working with Atlassian products such as Jira and Confluence
- Experience in managing code within the SDLC methodology
- A firm understanding of source control concepts, in particular using Git repositories
- Proven ability to write both customer facing and technical documentation
- Experience with Google Analytics & Adobe Analytics is a plus
- Hands-on experience developing visualizations using Google Data Studio.
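As a minimal sketch of the load-and-validate side of the pipeline work listed above, assuming stand-ins for the real stack: sqlite3 takes the place of Google BigQuery, and a plain function takes the place of an Airflow/Composer task. The table and column names are hypothetical.

```python
import sqlite3

# Minimal ETL load step. sqlite3 stands in for BigQuery; in Composer/Airflow
# this would be one task in a DAG rather than a plain function call.
def load_to_warehouse(conn: sqlite3.Connection, rows: list[tuple]) -> int:
    """Append rows to the (hypothetical) page_metrics table and validate."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS page_metrics (date TEXT, views INTEGER)"
    )
    conn.executemany("INSERT INTO page_metrics VALUES (?, ?)", rows)
    conn.commit()
    # Validation: the warehouse row count should reflect the load.
    (count,) = conn.execute("SELECT COUNT(*) FROM page_metrics").fetchone()
    return count

conn = sqlite3.connect(":memory:")
print(load_to_warehouse(conn, [("2024-01-01", 42), ("2024-01-02", 17)]))
# → 2
```

Returning the post-load row count gives the orchestrator a cheap hook for the validation step the Skills section calls for.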
Additional Requirements
- Strong written and verbal communication skills
- The ability to follow assigned tasks to full completion
- Creativity in finding non-obvious solutions to a given problem
- The ability to hold constructive discussions with fellow team members
- The ability to work independently or as part of a team as required