Jobs via Dice
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Avenues International, Inc., is seeking the following. Apply via Dice today!
Title: Senior Software Engineer - ETL Developer (PySpark)
Location: Skillman, NJ (three days onsite required)
Overall Skills Needed:
Working knowledge of industry-standard data infrastructure tools (e.g., warehousing, BI, analytics, big data)
Proficiency in developing, architecting, standardizing, and supporting technology platforms using industry-leading ETL solutions
Experience building scalable, high-throughput systems
Experience with agile BI and ETL practices to support interim data preparation for data-discovery and self-service needs
Strong communication, presentation, problem-solving, and troubleshooting skills
Must-Have Skills:
10+ years of experience designing and developing ETL pipelines using PySpark/Python
Experience with Python database libraries such as SQLAlchemy and psycopg2
Strong understanding of data warehousing methodologies, ETL processing, and dimensional data modeling
Advanced SQL capabilities are required. Knowledge of database design techniques and experience working with extremely large data volumes is a plus.
Demonstrated experience and ability to work with business users to gather requirements and manage scope.
Experience with workflow tools such as Oozie, Airflow, or Tidal
Experience working in a big data environment with technologies such as Greenplum, Hadoop, and Hive
BA, BS, MS, or PhD in Computer Science, Engineering, or a related technology field
Desirable Skills:
Experience with large database and data warehouse implementations (20+ TB)
Understanding of VLDB performance aspects such as table partitioning, sharding, table distribution, and optimization techniques
Knowledge of reporting tools such as Qlik Sense, Tableau, or Cognos
Seniority level: Mid-Senior level