Compunnel, Inc.
We are seeking a seasoned Data Architect to lead the design, development, and implementation of scalable data solutions.
This role requires deep technical expertise in AWS data services, data migration, and enterprise architecture.
The ideal candidate will also provide hands‑on support and technical leadership to a team of data engineers, developers, and analysts.
Key Responsibilities
Define and implement system architecture, data governance, CI/CD pipelines, and ETL frameworks.
Ensure scalability, performance, and security of data workflows and integrations.
Lead data migration efforts from legacy systems such as Oracle and DB2.
Design and implement complex enterprise data solutions using AWS services including Aurora, DynamoDB, Redshift, S3, Glue, and Lambda.
Manage data in AWS GovCloud environments.
Oversee data mapping and streaming data processing using ETL/ELT frameworks.
Apply database management best practices across platforms.
Communicate solution options and recommendations effectively to stakeholders.
Provide technical leadership and mentorship to the data team.
Ensure adherence to Agile methodologies and proactively mitigate technical risks.
Required Qualifications
Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
10+ years of experience designing, developing, and implementing data pipelines and storage solutions using AWS technologies.
Proven experience with large-scale data migration projects.
Expertise in enterprise architecture and complex solution design.
Experience managing data in AWS GovCloud.
Strong knowledge of data mapping, ETL/ELT frameworks, and streaming data processing.
Excellent communication and leadership skills.
Preferred Qualifications
Experience with enterprise tooling and platforms used in regulated environments.
Familiarity with Agile development practices.