Data Architect – Jobs via Dice
Rate: Open to Market
Location: Trenton, NJ – Hybrid (3 days onsite, 2 days remote)
Duration: 1 Year
Job Description
We are in need of a Data Architect to research, design, develop, and evaluate data that supports the creation of reports and dashboards used for analytics, including the use of visualization tools, artificial intelligence, and machine learning. This is a 6‑month contract opportunity with the possibility of extension. The position is hybrid in Trenton, NJ.
The Data Architect will work with Power BI and Python and have a strong understanding of data structures. This position will use Bedrock, SageMaker, or similar tools to work on our client's AI solution. ETL and data analysis experience are key to success in this role.
What You’ll Do
Build intuitive and insightful reports and dashboards using visualization platforms like Power BI
Integrate and process structured and unstructured data from various sources
Build, deploy, or integrate intelligent models and generative AI solutions into business workflows using Amazon SageMaker and AWS Bedrock (see the sketch after this list)
Work with technical and non-technical stakeholders
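For context on the SageMaker/Bedrock bullet above, here is a minimal Python sketch of invoking a Bedrock-hosted foundation model through boto3. The model ID, region, and prompt are illustrative assumptions only; the posting does not specify which models or workflows the client uses.

```python
import json

import boto3  # AWS SDK for Python

# Hypothetical model ID and region; substitute whatever Bedrock model the
# client has enabled in their account.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Invoke the model with a simple, made-up business prompt.
response = bedrock.invoke_model(
    modelId=MODEL_ID,
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": "Summarize last week's sales anomalies."}
        ],
    }),
)

# The response body is a stream of JSON; print the generated text.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```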
What You’ll Need
Required
5 years of experience writing Python
5 years of experience building data pipelines
3 years of experience with data lakes in an AWS environment
3 years of experience with reporting tools such as Power BI
2 years of experience with SQL
Knowledge of and experience with the major cloud service providers (AWS and Azure)
Prior experience with AWS Bedrock and Amazon QuickSight or similar
2 years of experience using Amazon SageMaker or similar
Experience designing reports, charts, and dashboards using tools such as Power BI and QuickSight
Proficient in Python, SageMaker, Bedrock, Kiro, and other AI services
Understands data ingestion from various sources (APIs, databases, CSV files, etc.) and database structures
Data‑driven professional with a strong analytical mindset and hands‑on experience in the full data lifecycle — from ingestion and transformation to visualization and advanced analytics
Proficient in integrating and processing structured and unstructured data from various sources, leveraging tools such as SQL, Python, or ETL frameworks (see the ingestion sketch after this list)
Expertise in building intuitive and insightful reports and dashboards using visualization platforms like Power BI is essential
Ability to communicate findings effectively to both technical and non-technical stakeholders
Solid understanding of AI and machine learning concepts, with practical experience using platforms like Amazon SageMaker and AWS Bedrock
A combination of technical acuity, problem‑solving ability, and business awareness is critical for success in this role
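As a rough illustration of the ingestion-to-reporting lifecycle described in the requirements above, a short Python sketch that pulls a CSV file and a REST API into pandas, joins and aggregates them, and exports a table a Power BI or QuickSight dataset could load. The file name, URL, and column names are placeholders, not details from the posting.

```python
import pandas as pd
import requests

# Hypothetical source locations; the actual feeds are not named in the posting.
CSV_PATH = "daily_transactions.csv"
API_URL = "https://example.com/api/customers"

# Structured source: flat file with a parsed date column.
transactions = pd.read_csv(CSV_PATH, parse_dates=["transaction_date"])

# Semi-structured source: JSON records from a REST API.
customers = pd.json_normalize(requests.get(API_URL, timeout=30).json())

# Simple transformation: join the sources and aggregate for reporting.
report = (
    transactions.merge(customers, on="customer_id", how="left")
                .groupby("region", as_index=False)["amount"].sum()
)

# Export a tidy table that a BI tool can consume as a dataset.
report.to_csv("regional_sales_summary.csv", index=False)
```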
Preferred
Prior experience with AWS Kiro (nice to have)
Seniority level: Mid‑Senior level
Employment type: Full-time
Job function: Engineering and Information Technology
Industries: Software Development