
Solutions Architect – Data

Data Freelance Hub, Phoenix, Arizona, United States, 85003


Solutions Architect – Data, Phoenix, AZ – 6-month onsite contract on a 1099 basis. Key skills include Azure, AWS, Databricks, ETL/ELT design, Python, SQL, and data governance.

Job Overview

The Solutions Architect – Data contributes to the design, modernization, optimization, and ongoing operation of enterprise-scale data systems for CHP. The role focuses on designing and implementing data solutions that organize, store, and manage data within a cloud-based data platform. The architect will provide continuous maintenance and operational support in the cloud environment, including reviewing existing data infrastructure, planning future database solutions, and implementing systems that support the data management needs of CHP users.

This role is also accountable for data integrity and governance, ensuring adherence to standards that maintain accuracy, consistency, and reliability across systems. The architect will identify data quality issues, analyze discrepancies, and drive resolution efforts. The position requires a balance of architectural leadership, technical depth, and collaboration with business stakeholders, data engineers, machine learning practitioners, and domain experts to deliver scalable, secure, and reliable AI-driven solutions.

The ideal candidate will have demonstrated experience delivering end‑to‑end ETL/ELT pipelines across Databricks, Azure, and AWS environments.
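
For context on what that entails, here is a minimal bronze-to-silver pipeline sketch in PySpark, the kind of code that runs on Databricks. The mount point, table names, and columns are hypothetical placeholders, not details from this posting.

```python
# Minimal bronze -> silver ETL sketch in PySpark (Databricks-style).
# All paths, schemas, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_etl_sketch").getOrCreate()

# Extract: land raw CSV files into a bronze Delta table.
raw = (spark.read
       .option("header", "true")
       .csv("/mnt/landing/claims/"))  # hypothetical mount point
raw.write.format("delta").mode("append").saveAsTable("bronze.claims_raw")

# Transform: cast types, standardize dates, deduplicate, drop bad keys.
silver = (spark.table("bronze.claims_raw")
          .withColumn("claim_amount", F.col("claim_amount").cast("decimal(12,2)"))
          .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
          .dropDuplicates(["claim_id"])
          .filter(F.col("claim_id").isNotNull()))

# Load: overwrite the curated silver table consumed downstream.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.claims")
```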

Key Responsibilities

Design scalable data lake and enterprise data architectures using Databricks and cloud‑native services.

Develop metadata‑driven, parameterized ingestion frameworks and multi‑layer data architectures (a minimal ingestion sketch follows this list).

Optimize data workloads and system performance.

Define and enforce data governance frameworks for CHP.

Design and develop reliable and scalable data pipelines.

Architect AI systems, including retrieval‑augmented generation (RAG) workflows and prompt engineering solutions (a retrieval sketch follows this list).

Lead cloud migration initiatives from legacy systems to modern data platforms.

Provide architectural guidance, technical leadership, and best practices across teams.

Create documentation, reusable components, and standardized architectural patterns.
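
The metadata-driven ingestion responsibility above typically means source definitions live in configuration rather than code, with one generic loader handling every source. A minimal sketch under that assumption; the source names, paths, and target tables are hypothetical.

```python
# Metadata-driven ingestion sketch: each source is described by a config
# record, and a single parameterized loader handles all of them.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ingestion_framework_sketch").getOrCreate()

# In practice this metadata would live in a control table or JSON file.
SOURCES = [
    {"name": "members",   "format": "csv",     "path": "/mnt/landing/members/",   "target": "bronze.members"},
    {"name": "providers", "format": "parquet", "path": "/mnt/landing/providers/", "target": "bronze.providers"},
]

def ingest(source: dict) -> None:
    """Generic loader driven entirely by the source's metadata record."""
    reader = spark.read.format(source["format"])
    if source["format"] == "csv":
        reader = reader.option("header", "true")
    df = reader.load(source["path"])
    df.write.format("delta").mode("append").saveAsTable(source["target"])

for src in SOURCES:
    ingest(src)
```

Adding a new source then means adding one metadata record, not writing a new pipeline.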
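For the RAG workflow item, the core pattern is retrieve-then-prompt: embed documents, rank them against a query, and ground the prompt in the top matches. A bare-bones sketch follows; embed() is a toy stand-in for a real embedding model (such as an Azure OpenAI embeddings deployment), and the documents are illustrative.

```python
# Bare-bones RAG sketch: embed documents, retrieve the closest ones to a
# query, and assemble a grounded prompt.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: normalized character-frequency vector.
    # A real system would call an embedding model here.
    vec = np.zeros(128)
    for ch in text.lower():
        vec[ord(ch) % 128] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

DOCS = [
    "Prior authorization is required for imaging claims over $500.",
    "Members can appeal a denied claim within 60 days.",
]
DOC_VECS = np.stack([embed(d) for d in DOCS])

def retrieve(query: str, k: int = 1) -> list[str]:
    scores = DOC_VECS @ embed(query)       # cosine similarity (unit vectors)
    top = np.argsort(scores)[::-1][:k]     # indices of the k best matches
    return [DOCS[i] for i in top]

def build_prompt(query: str) -> str:
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How long do members have to appeal?"))
```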

Required Skills and Experience

Strong expertise with cloud platforms, primarily Azure or AWS.

Hands‑on experience with Databricks.

Strong proficiency in Python and SQL.

Expertise in building ETL/ELT pipelines and ADF workflows.

Experience designing data lakes and implementing data governance frameworks.

Hands‑on experience with CI/CD, DevOps, and Git‑based development.

Ability to translate business requirements into technical and architectural solutions.

Programming: Python, SQL, R.

Big Data: Hadoop, Spark, Kafka, Hive.

Cloud Platforms: Azure (ADF, Databricks, Azure OpenAI), AWS.

Data Warehousing: Redshift, SQL Server.

ETL/ELT Tools: SSIS.

Bachelor’s degree in Computer Science, Information Technology, Information Systems, Engineering, or a related field.

6+ years of experience in data engineering or .NET development.

Contact: Vishal (Victor) Verma | Assistant Manager, NS IT Solutions | vishal@nsitsolutions.com | www.nsitsolutions.com
