Remote Work
Overview
- 8+ years as a hands-on Solutions Architect and/or Data Engineer designing and implementing data solutions
- Team lead and/or mentorship of other engineers
- Ability to develop end-to-end technical solutions into production and to help ensure performance, security, scalability, and robust data integration
- Programming expertise in Java, Python, and/or Scala
- Core cloud data platforms including Snowflake, AWS, Azure, Databricks, and GCP
- SQL and the ability to write, debug, and optimize SQL queries
- Client-facing written and verbal communication skills and experience
- Ability to create and deliver detailed presentations
- Detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, logical system views, etc.)
- 4-year Bachelor's degree in Computer Science or a related field

Prefer any of the following:
- Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
- Cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
- Data integration technologies: Spark, Kafka, event/streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Data Migration Services, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies
- Complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment
- Automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
- Workflow management and orchestration: Airflow, AWS Managed Airflow, Luigi, NiFi
Why phData? We offer:
- Casual, award-winning small-business work environment
- Collaborative, high-performance culture that prizes autonomy, creativity, and transparency
- Competitive compensation, excellent benefits, generous PTO plus 10 holidays (and other cool perks)
- Accelerated learning and professional development through advanced training and certifications