Hard Rock Digital
We’re building the best online sportsbook, casino, and social casino platform worldwide.
What’s the position? Data Engineer – Austin, TX.
Responsibilities
Be responsible for the quality, scope, and timeliness of deliverables.
Design, build, and operate platform components for other teams (streaming architectures, data storage, processing environments).
Work with engineering and business stakeholders to understand data requirements.
Perform data cleansing and enhance data quality.
Focus on quality, performance, scalability, and maintainability.
Collaborate in a fast‑paced Agile team environment.
Qualifications
Extensive daily, hands-on experience with Python and SQL (Java/Scala a plus).
Experience with cloud big-data platforms (Snowflake, BigQuery, S3/Athena, Redshift, Delta Lake).
Strong communication skills (written and verbal).
Data orchestration and transformation tools (Airflow, dbt).
Streaming data architectures (Kafka).
Knowledge of semi‑structured data formats (Parquet, Avro, JSON) and AWS Cloud data technologies (RDS, DynamoDB, Aurora).
Experience with PostgreSQL, MS SQL Server, or MySQL.
Experience with Spark, Kafka, Flink, Beam, Kinesis.
SnowPro Certification or equivalent from Databricks or AWS.
Comfort with Linux, Git, and version control.
Experience with Jupyter or Databricks Notebooks.
Degree in Computer Science, Engineering, MIS, Mathematics, or equivalent experience (3+ years).
Desired
Degree in a related field and/or 3+ years of experience.
Seniority level Mid‑Senior level
Employment type Full‑time
Job function Information Technology
Industries Software Development