TexcelVision Inc.

Data Solutions Architect (Local Candidates Required)

TexcelVision Inc., Phoenix, Arizona, United States, 85003

Overview

The Data Solutions Architect is responsible for contributing to the design, modernization, and optimization of enterprise-scale data systems, as well as the maintenance and operations strategy for CHP. This role involves designing and implementing data systems that organize, store, and manage data within our cloud data platform. The architect will perform continuous maintenance and operations work for CHP in the cloud environment. They will review and analyze CHP’s data infrastructure, plan future database solutions, and implement systems to support data management for CHP users.

Additionally, this role is accountable for ensuring data integrity and for making sure the CHP team adheres to data governance standards that maintain accuracy, consistency, and reliability across all systems. The architect will identify data discrepancies and quality issues and work to resolve them.

This position requires a strong blend of architectural leadership, technical depth, and the ability to collaborate with business stakeholders, data engineers, machine learning practitioners, and domain experts to deliver scalable, secure, and reliable AI-driven solutions. The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.

Key Responsibilities

Design scalable data lake and data architectures using Databricks and cloud-native services.
Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures.
Optimize data workloads and performance.
Define data governance frameworks for CHP.
Design and develop robust data pipelines.
Architect AI systems, including RAG workflows and prompt engineering.
Lead cloud migration initiatives from legacy systems to modern data platforms.
Provide architectural guidance, best practices, and technical leadership across teams.
Build documentation, reusable modules, and standardized patterns.

Required Skills and Experience

Strong expertise in cloud platforms, primarily Azure or AWS.
Hands-on experience with Databricks.
Deep proficiency in Python and SQL.
Expertise in building ETL/ELT pipelines and ADF workflows.
Experience architecting data lakes and implementing data governance frameworks.
Hands-on experience with CI/CD, DevOps, and Git-based development.
Ability to translate business requirements into technical architecture.

Technical Expertise

Programming: Python, SQL, R
Big Data: Hadoop, Spark, Kafka, Hive
Cloud Platforms: Azure (ADF, Databricks, Azure OpenAI), AWS
Data Warehousing: Redshift, SQL Server
ETL/ELT Tools: SSIS

Required Education and Experience

Bachelor’s degree in Computer Science, Information Technology, Information Systems, Engineering, or a related field.
6+ years of experience in data engineering or .NET development.
