Talentica Software India Pvt. Ltd.
Data Engineer I3
Talentica Software India Pvt. Ltd., Snowflake, Arizona, United States, 85937
Talentica Software, started by industry veterans and ex-IITB grads, is a product engineering company that helps tech-native enterprises and startups turn their ideas into market-leading products. We deliver innovative, high-quality products at an accelerated pace by combining the product mindset of our human experts with the power of AI. Over the last 22 years, the company has worked with more than 200 startups, most based in the US, with many successful exits. In 2022, Great Place to Work recognized Talentica Software as India's Great Mid-Size Workplace.

What are we looking for?
We're seeking a
skilled and experienced Data Engineer
to lead the development and implementation of our
Data Warehouse/Lakehouse solutions, ensuring they serve as the foundation for scalable, high-performance analytics. This role calls for deep expertise in
data architecture, cloud-based solutions, and modern analytics platforms.
You'll play a pivotal role in defining and executing our data infrastructure strategy, ensuring it drives innovation and business impact. If this excites you, we'd love to connect.

What will you be doing?
- Lead the end-to-end development and deployment of a scalable and secure Lakehouse architecture.
- Define best practices for data ingestion, storage, transformation, and processing using modern cloud technologies.
- Architect data pipelines using ETL/ELT frameworks to support structured, semi-structured, and unstructured data.
- Optimize data modelling strategies to meet the analytical and performance needs of stakeholders.
- Evaluate and select appropriate cloud technologies, frameworks, and architectures.
- Develop and maintain efficient, automated data pipelines that integrate data from multiple sources (Oracle, BigQuery, MongoDB, Google Analytics, etc.).
- Implement distributed data processing frameworks using tools like Apache Spark and Dataflow.
- Ensure optimal query performance tuning and cost-effective cloud resource utilization.
- Establish and enforce data governance policies, ensuring adherence to compliance regulations such as GDPR and CCPA.
- Implement robust data access control and security measures across multi-tenant environments.
- Define data lineage, cataloging, and metadata management to enable data democratization.
- Stay abreast of developments in cloud data platforms, AI integrations, and analytics technologies to drive innovation.
- Evaluate and recommend emerging data technologies to enhance efficiency and scalability.
- Lead proofs of concept (POCs) to validate new approaches and technologies.

To be successful in this role, you should have:
Qualification:
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a closely related field from a top-tier institution in India

Experience:
5+ years of experience in data engineering, with a proven track record of implementing large-scale data solutions.

Skills:
- Extensive experience with cloud platforms (AWS, GCP, or Azure), specifically in data warehouse/lakehouse implementations.
- Expertise in modern data architectures with tools like Databricks, Snowflake, or BigQuery.
- Strong background in SQL, Python, and distributed computing frameworks (Spark, Dataflow, etc.).
- Experience building high-volume, reliable data pipelines for real-time and batch processing.
- In-depth knowledge of data modelling principles (e.g., Star Schema, Snowflake Schema).
- Experience enabling AI tools to consume data from the Lakehouse, including implementing semantic layers.
- Strong understanding of data governance, security, and compliance frameworks.

Must-Have Skills: Kafka, Databricks, Python, NoSQL, Medallion Architecture

Good to Have: AWS Glue, SQL, Debezium

Soft Skills:
- Excellent problem-solving and strategic thinking
- Strong leadership and collaboration skills
- Clear communication of technical concepts to non-technical stakeholders
- Ability to manage multiple priorities independently in a fast-paced environment

What will you find here?
A culture of innovation:
We only take up projects that challenge us to innovate. Our customers come to us for our technology expertise. Endless learning opportunities:
Continuous learning is baked into our DNA. You'll always have the chance to learn new things and stay on top of the latest trends. Talented peers:
Work alongside engineers from IITs, NITs, BITS, and other premier institutions. Work-life balance:
We value work-life balance and offer flexible schedules, including remote work options, so you can thrive both professionally and personally. A great culture:
Our employees love working here! 82% recommend Talentica to their friends, according to Glassdoor. Join us, and you'll see why! Recognition & rewards:
We don't just work hard, we celebrate success. Your contributions won't go unnoticed. We'll make sure you're recognized for the amazing work you do. Ready to Make an Impact?
Fill out the lead form below, and we'll get in touch! We value and respect your privacy. By submitting, you agree to our Privacy Policy.