247Sports
Requirements
10+ years as a hands‑on Solutions Architect and/or Data Engineer designing and implementing data solutions.
Experience as a team lead and/or mentoring other engineers.
Ability to take end‑to‑end technical solutions to production and help ensure performance, security, scalability, and robust data integration.
Programming expertise in Java, Python, and/or Scala.
Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and GCP.
Strong SQL skills, including the ability to write, debug, and optimise queries.
Strong client‑facing written and verbal communication skills and experience.
Ability to create and deliver detailed presentations.
Ability to produce detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, and logical system views).
4‑year Bachelor's degree in Computer Science or a related field.
Preferred Qualifications (any of the following)
Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks.
Cloud and Distributed Data Storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
Data integration technologies: Spark, Kafka, event/streaming architectures, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or similar.
Experience integrating multiple data source types (e.g., queues, relational databases, files, search indexes, and APIs).
Complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment.
Automated data transformation and curation: dbt, Spark, Spark Streaming, and automated pipelines.
Workflow Management and Orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi.