Ravin IT Solutions
Job Description
Please find below the new role (role 2833 – two openings):

Role: ETL Developer
Location: Charlotte, NC – 3 days/week onsite
Experience: 12+ years; candidates must be local to Charlotte, NC
Tools: Informatica, Talend, SSIS, or Apache
Interview: Video + client in-person interview (local resumes only)
ETL/API Developer
- Strong understanding of ETL concepts, data warehousing principles, and data integration best practices.
- Proficiency in ETL tools such as Informatica, Talend, SSIS, or Apache for designing and implementing robust data pipelines.
- Experience in extracting data from diverse sources such as relational databases, flat files, cloud storage, and APIs.
- Develop and expose RESTful APIs for secure data access, sharing, and integration with internal and external systems (see the API sketch after this list).
- Ability to design and optimize data transformation workflows, including cleansing, filtering, joining, and aggregating large datasets (see the pipeline sketch after this list).
- Proficiency in writing efficient SQL queries, stored procedures, and performance-tuned data extractions.
- Experience working with database platforms such as Oracle, SQL Server, PostgreSQL, and MySQL.
- Knowledge of data modeling concepts such as star schema, snowflake schema, normalization, and denormalization.
- Understanding of data quality, validation, and profiling techniques to ensure accurate and consistent data loads.
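For illustration only, a minimal pipeline exercising the extract/transform/load skills above might look like the Python sketch below (pandas + SQLAlchemy). All connection strings, table names, and columns are hypothetical, and the role does not prescribe this particular stack.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings - placeholders, not real systems.
source = create_engine("postgresql://user:pass@source-host/sales")
warehouse = create_engine("postgresql://user:pass@dw-host/warehouse")

# Extract: pull raw order rows from a relational source.
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, order_date FROM orders", source
)

# Transform: cleanse (drop rows missing keys), filter out bad amounts,
# then aggregate to one row per customer per day.
orders = orders.dropna(subset=["order_id", "customer_id"])
orders = orders[orders["amount"] > 0]
orders["order_date"] = pd.to_datetime(orders["order_date"])
daily = (
    orders.groupby(["customer_id", pd.Grouper(key="order_date", freq="D")])["amount"]
    .sum()
    .reset_index(name="daily_amount")
)

# Load: append the aggregated result to a warehouse fact table.
daily.to_sql("fact_daily_sales", warehouse, if_exists="append", index=False)
```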
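Likewise, a bare-bones sketch of exposing warehouse data through a RESTful endpoint, here using Flask with a parameterised query (endpoint path, table, and column names are again hypothetical):

```python
from flask import Flask, jsonify
from sqlalchemy import create_engine, text

app = Flask(__name__)
warehouse = create_engine("postgresql://user:pass@dw-host/warehouse")  # hypothetical

@app.route("/api/customers/<int:customer_id>/daily-sales")
def daily_sales(customer_id):
    # Parameterised query: user input is bound, never concatenated into SQL.
    query = text(
        "SELECT order_date, daily_amount FROM fact_daily_sales "
        "WHERE customer_id = :cid ORDER BY order_date"
    )
    with warehouse.connect() as conn:
        rows = conn.execute(query, {"cid": customer_id}).mappings().all()
    return jsonify([dict(r) for r in rows])

if __name__ == "__main__":
    app.run()
```

The bound parameter reflects the "secure data access" requirement above: callers can only read the rows the query shape allows.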