Business Intelligence Engineer
Slack Technologies Inc
Contract | San Francisco, California, United States | Posted 5 years ago
About Position
Business Intelligence Engineer (Contract)
$45.00 / Hourly
San Francisco, California, United States
Description
Responsibilities
- Design, build, and launch new data models and ETL pipelines that will make Slack even more data-driven.
- Work with Engineering, Data Science, and Product Management teams to build and manage a wide variety of data sets.
- Build rich and dynamic dashboards using out-of-the-box features, customizations, and visualizations.
- Simplify and democratize access to usable data for teams across Slack.
- Design and publish custom dashboards for business functions, stakeholders, and employees around the company.
- Automate and document processes.

Requirements
- BS degree in Computer Science or a related engineering discipline.
- 5+ years of work experience showing growth as a BI Engineer.
- Problem solver with excellent interpersonal skills and the ability to make sound, complex decisions in a fast-paced, technical environment.
- Experience with AWS tools and technologies (S3, EMR, Kinesis, Lambda, API Gateway, DynamoDB, etc.).
- Ability to work across multiple areas such as ETL data pipelines, data modeling and design, and writing complex SQL queries.
- Expert knowledge of database technologies: excellent SQL skills and a strong understanding of the tradeoffs in building data models.
- Strong programming skills and experience in Python.
- Expertise with visualization tools (for example Tableau, Domo, or Looker) is required to support our unique requirements for visualization, security, data access, etc.
- Strong experience working with large data sets.
- Experience building ETL with open-source tools such as Talend and Pentaho.
- Capable of planning and executing on both short-term and long-term goals, individually and with the team.
- Passionate about a range of technologies, including but not limited to SQL, NoSQL, and MPP databases.
- Experience with streaming data pipelines using Kafka, AWS Kinesis, Spark Streaming, or similar would be a plus.
- Knowledge of statistics and/or machine learning, and familiarity with columnar data, would be a plus.