Software Developer / Big Data ETL Developer
Turner Broadcasting System Inc
Contract | Atlanta, Georgia, United States | Posted 4 years ago
About Position
Software Developer / Big Data ETL Developer (Contract)
$35.00 / Hourly
Atlanta, Georgia, United States
Description
Must haves: 5-7 years of experience; understanding of ETL; experience creating data flow diagrams; knowledge of HiveQL, Spark SQL, relational data, and JavaScript; expertise working with AWS and Linux environments, React, and Node.js; and experience with REST APIs
Nice to haves: data architecture background, Node.js, GraphQL, B.S. in Computer Science
Interview Process: First interview is a phone screen with a use case scenario. Sourcing tips: relational database, Node, JavaScript
Job Description

The Turner Story: Turner is a division of WarnerMedia along with our sister companies, HBO and Warner Bros. We are better known as the folks who bring you CNN, HLN, TCM, TNT, tbs, Adult Swim, Cartoon Network, Turner Sports, truTV and so much more! From cutting-edge breaking news stories and up-to-the-minute sports coverage to the characters we grew up loving and the shows we love today, Turner continues to be the gold standard in first-class television programming and a demonstrated leader in digital content. We tell the stories the world wants to hear. Won't you be a part of our story? www.turner.com
See what it's like to work at Turner! Follow us on Instagram, Twitter and Facebook.
What part will you play? We are seeking a Sr. Software Developer to be part of our Architecture and Development teams that support CNN Digital. The Sr. Software Developer will partner with other team members to ensure that we bring together all the consumer data assets in one place utilizing our new Data Platform. This platform will allow CNN to standardize our tool platforms as well as the way we provide reports. The business goal is to leverage CNN's assets to create modeling solutions for audience targeting, experience personalization and subscription management. The ideal candidate should bring extensive knowledge of various Big Data technologies along with experience working on a public cloud platform such as AWS or Azure.
What will you be doing? The ideal candidate would be:

Resourceful and creative. Nothing is more satisfying than solving that difficult problem, and you love to bring new ways to tackle an issue, even if it goes against popular opinion or the way things have been done before. You're not afraid to inject some reality when needed, but you offer alternative solutions, not excuses. You don't know the meaning of blocked.

An excellent communicator. We work with a myriad of third-party vendors and internal groups to prototype and develop next-generation technology for our customers. Your exceptional communication skills and attention to detail ensure your team is well represented and everyone understands what is expected.

An Agile ninja. You believe in Scrum and iterating quickly, turning ideas into deployable code while always thinking about ways to improve performance, squash old bugs, and make anything we deliver better for our customers. Things move fast here, and if changes come down the pipe, you handle them with grace and creativity.

Someone who plays nice with others. We work closely as a team with our product, design and development teams to help turn their dreams into reality. You'll review code, pair program, and provide feedback, using your great communication skills and attention to detail to ensure your team understands what is expected and that we deliver on time and up to (and above) expectations.

What do we require from you?
- Experience creating data flow diagrams and other technical documentation
- Minimum 5-7 years of solid experience developing databases
- Expertise working with AWS and Linux environments
- Understanding of ETL processes
- Expertise in database platforms, whether RDBMS or NoSQL
- Expert-level knowledge of SQL and its variants, such as HiveQL and Spark SQL
- Experience with Big Data platforms such as Hadoop and NoSQL DBs, or cloud-based tools such as Amazon Redshift and Snowflake
- Experience working with real-time data and tools such as Kafka and AWS Kinesis
- Experience working with REST APIs
- Specific application experience with media, web analytics and consumer data systems
- Experience with BI tools like Tableau and Looker
- Excellent front-end skills, including the use of CSS preprocessors (Sass preferred) and modern principles of JavaScript development (including experience with React or at least one other JS framework)
- Git experience with merging, branching and pull requests

Preferred Skills

- Data architecture background
- Node.js, Docker containers, CI/CD pipelines, Git, React, MongoDB, Redis, GraphQL (preferred), Neo4j
- Experience with Unix shell scripting, Awk, Python
- Knowledge of GraphQL
- BS in Computer Science, MIS, business, or equivalent education/training/experience