Compunnel, Inc.
We are seeking an experienced Data Architect to design, develop, and evaluate data solutions that support analytics, reporting, and dashboard creation.
This role involves working with visualization tools, artificial intelligence, and machine learning technologies to build scalable data platforms.
The ideal candidate will have strong expertise in Python, Power BI, and cloud-based AI services, along with a deep understanding of data structures and ETL processes.
Key Responsibilities
Design and implement data architecture to support analytics and AI-driven solutions.
Build intuitive and insightful reports and dashboards using Power BI and similar visualization tools.
Integrate and process structured and unstructured data from multiple sources.
Develop and deploy machine learning models and generative AI solutions using AWS SageMaker, Bedrock, or similar platforms.
Collaborate with technical and non-technical stakeholders to deliver business-focused solutions.
Ensure data quality, security, and compliance throughout the data lifecycle.
Required Qualifications
5+ years of experience writing Python and building data pipelines.
3+ years of experience working with data lakes in an AWS environment.
3+ years of experience with reporting tools such as Power BI.
2+ years of experience with SQL and ETL processes.
Hands-on experience with AWS services including SageMaker, Bedrock, and QuickSight (or similar).
Strong understanding of data ingestion from various sources (APIs, databases, CSV files, etc.) and of database structures.
Proficient in integrating and processing structured and unstructured data.
Solid understanding of AI and machine learning concepts with practical experience in deployment.
Ability to communicate technical findings effectively to both technical and non-technical stakeholders.
Preferred Qualifications
Prior experience with AWS Kiro.
Familiarity with major cloud service providers such as AWS and Azure.
Experience designing reports, charts, and dashboards using Power BI or QuickSight.