DataRobot
Job Description:
DataRobot delivers AI that maximizes impact and minimizes business risk. Our platform and applications integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business today and in the future.
Title: Senior Data Engineer (India)
About DataRobot
*DataRobot delivers an industry-leading agentic AI platform and applications that maximize impact and minimize risk for your business. DataRobot's enterprise AI platform democratizes data science with end-to-end automation for building, deploying, and managing machine learning models. The platform maximizes business value by delivering AI at scale and continuously optimizing performance over time. The company's proven combination of cutting-edge software and world-class AI implementation, training, and support services empowers any organization, regardless of size, industry, or resources, to drive better business outcomes with AI.*
You will be responsible for the following:
Partner with internal customers and business analysts to understand business needs and build strong relationships with key stakeholders.
Develop, deploy, and support analytic data products such as data marts, ETL (extract/transform/load) jobs, and functions written in Python, SQL, and dbt, in a cloud data warehouse environment using Snowflake; Stitch, Fivetran, or Airflow; and AWS services (e.g., EC2, Lambda, Kinesis).
Navigate various data sources and efficiently locate data in a complex data ecosystem.
Work closely with data analysts and data scientists to build data models and metrics to support analytics needs.
Maintain and support deployed ETL pipelines and ensure data quality.
Develop monitoring and alerting systems to provide visibility into the health of data infrastructure, cloud applications, and data pipelines.
Partner with IT enterprise applications and engineering teams on system integration efforts impacting data & analytics.
Requirements: BA/BS in a technical or engineering field preferred.
5-7 years of experience in a data engineering or data analyst role.
Strong understanding of data warehousing concepts and experience with SQL and relational databases (Snowflake, Redshift, Postgres, etc.).
Experience working with cloud providers like AWS, Azure, GCP, etc.
Proficiency in data-related languages such as Python, Scala, R.
Experience with developer tooling such as dbt, GitHub, and Airflow.
Experience with infrastructure-as-code tools such as Terraform or CloudFormation.
Excellent communication skills for technical and non-technical audiences.
Knowledge of real-time streaming technologies such as Amazon Kinesis Data Firehose and Spark Streaming.
Highly collaborative working style.
AWS cloud certification is a plus.
We value diversity and encourage all qualified candidates to apply, regardless of gender, ethnicity, or background. DataRobot is an equal opportunity employer committed to inclusion and diversity. For more information, visit our website and connect with us on LinkedIn.