BizTek People
Job Posting
Job Opening ID: 6163
Date Opened: 01/07/2020
Job Type: Contract
Language Skills: English
Location: Beaverton, Oregon
Industry: IT Services
City: Beaverton
State/Province: Oregon
Country: United States
Zip/Postal Code: 97006

Job Description
Requirements:
- MS/BS in Computer Science or a related technical discipline
- 5+ years of industry experience, including 5+ years of relevant big data/ETL data warehouse experience building data pipelines
- 2+ years leading a team of data engineers: mentoring, improving processes, peer training, long-term best practices, coding standards
- 5+ years of experience with Python and Snowflake; strong programming experience in Python
- Professional experience working with APIs to extract data for data pipelines
- Extensive experience with Hadoop and related processing frameworks such as Spark, Hive, and Sqoop
- Ability to architect, design, and implement solutions with AWS Virtual Private Cloud, EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon Simple Storage Service (S3), EMR, Athena, and other AWS products
- Troubleshooting production issues and performing on-call duties at times
- Working knowledge of workflow orchestration tools such as Apache Airflow
- Hands-on experience with performance and scalability tuning
- Professional experience in Agile/Scrum application development using JIRA
- Experience working in a public cloud environment, particularly AWS
- Professional experience with source control, merging strategies, and coding standards, specifically Bitbucket/Git
- Professional experience in data design and modeling, specifically with ERwin
- Demonstrated experience developing in a continuous integration environment using tools such as Jenkins, Bamboo, or TeamCity
- Demonstrated ability to maintain the build and deployment process using build integration tools
- Working experience communicating with business stakeholders and architects
- Demonstrated experience implementing security around sensitive data
- Experience designing instrumentation into code and integrating with logging and log-analysis tools such as log4Python, SignalFx, and/or Splunk

Required Soft Skills:
- Desire to lead collaboratively with teammates to arrive at the best solution to a problem
- Demonstrated ability to deliver results on multiple projects in a fast-paced, agile environment
- Excellent problem-solving and interpersonal communication skills
- Strong desire to learn and to share knowledge with others
- Passion for data and a drive for excellence
- Desire to learn and understand the business, and to communicate with business stakeholders to implement business-rule transformations and data validation while coding
- Understanding of the importance of data security and privacy

Nice to Have:
- Experience with call center data at a global level

Skill Set
ETL, Python, Snowflake, Spark, Hive, AWS