Soni Resources
Soni's client is seeking a highly skilled Data Engineer to design, build, and maintain scalable, reliable data pipelines that power their analytics ecosystem. You will be responsible for ensuring the smooth integration of data from multiple transactional systems into the AWS-based data platform.
You'll work closely with application developers, data analysts, and business stakeholders to ensure high data quality, efficient processing, and timely availability of transactional and analytics-ready data.
Key Responsibilities:
Data Pipeline Development
Design, implement, and maintain ETL/ELT processes to ingest data from multiple transactional sources (PostgreSQL, DynamoDB, and others) into the data warehouse.
Transform and load curated datasets from the warehouse into data marts.
Optimize pipeline performance for large datasets and complex transformations.
Data Modeling & Transformation
Design and maintain efficient data models for both the warehouse and data marts to support analytics use cases.
Implement best practices for normalization, denormalization, and dimensional modeling.
Data Quality & Governance
Implement data validation, anomaly detection, and error-handling mechanisms.
Work with developers and analysts to implement and maintain security for data at rest.
Maintain data lineage and metadata documentation.
Platform & Infrastructure
Leverage AWS services (e.g., S3, Lambda, Glue, DMS, Step Functions, RDS, DynamoDB) to manage scalable and secure pipelines.
Work with DevOps to automate deployments and CI/CD for data workflows.
Maintain and support the operational and transactional data platforms for optimal usage and performance.
Ensure the security of data and user information across the various databases.
Collaboration & Support
Partner with analysts and BI teams to ensure data marts meet reporting and dashboard needs, such as actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Collaborate with application teams to integrate new transactional sources into the pipeline.
Required Skills & Experience:
Technical Skills
Strong SQL skills and experience with PostgreSQL and SQL Server.
Hands-on experience with AWS data services (S3, Glue, Lambda, DMS, Step Functions, RDS, DynamoDB, IAM).
Proficiency in ETL/ELT frameworks and scripting languages (Python, Bash).
Experience designing and maintaining data models for analytics (star schema, snowflake).
Professional Experience
3+ years as a Data Engineer, ETL Developer, or related role.
Experience working with structured, semi-structured, and unstructured data.
Proven track record of optimizing data processing performance and reliability.
Soft Skills
Strong analytical thinking and problem-solving abilities.
Excellent communication skills to work with both technical and non-technical stakeholders.
Ability to manage multiple priorities in a fast-paced environment.
Preferred Qualifications
Experience with dbt or similar transformation frameworks.
Familiarity with BI tools (Power BI, Tableau, Looker).
Experience with data governance tools and practices.
What Is Offered:
Competitive salary and benefits package.
Opportunities for professional growth.
Collaborative environment with a focus on data-driven innovation, including AI/ML.
#SoniTech1