Purple Drive LLC

We are seeking an experienced Data Architect with deep expertise in ETL/ELT design, Snowflake, and modern data architecture. The ideal candidate will have a strong background in data modeling, governance, and cloud-based data platforms, with the ability to standardize and streamline data engineering practices across teams. This role requires close collaboration with data engineers, analysts, and business stakeholders to design scalable, secure, and high-performing data solutions.

Key Responsibilities
- Design and implement scalable data architectures supporting analytical and operational use cases across cloud platforms such as Snowflake, BigQuery, or Redshift.
- Develop and optimize ETL/ELT pipelines for ingestion, transformation, and integration of large-scale structured and unstructured datasets (a minimal sketch follows this list).
- Define data modeling standards, governance policies, and architecture frameworks to ensure consistency, quality, and compliance.
- Collaborate with cross-functional teams to align data structures and implement data lineage, metadata management, and security frameworks.
- Establish best practices for data storage, performance optimization, and cost management in cloud environments.
- Partner with analytics and engineering teams to support data warehouse and lakehouse initiatives.
- Document architecture decisions, standards, and integration patterns for reusability and knowledge sharing.
- Participate in data platform modernization initiatives, contributing to the adoption of data mesh, lakehouse, and modern data stack principles.
- Provide technical leadership and mentorship to data engineering teams, driving standardization and automation.
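To make the ETL/ELT and dimensional-modeling vocabulary above concrete, here is a minimal sketch of a star schema loaded ELT-style with set-based SQL. It is an illustration only: every table and column name (stg_orders, dim_customer, dim_date, fact_orders, and so on) is hypothetical and does not describe any Purple Drive system.

```sql
-- Hypothetical star schema: one fact table keyed to two dimensions.
-- All names are invented for illustration.

CREATE TABLE dim_customer (
    customer_key  INTEGER      PRIMARY KEY,  -- surrogate key
    customer_id   VARCHAR(50)  NOT NULL,     -- natural key from the source system
    customer_name VARCHAR(200),
    region        VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,           -- e.g. 20240131
    full_date DATE    NOT NULL,
    year      INTEGER,
    month     INTEGER
);

CREATE TABLE fact_orders (
    order_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    date_key     INTEGER REFERENCES dim_date (date_key),
    order_amount NUMERIC(18, 2),
    quantity     INTEGER
);

-- ELT step: raw rows have already landed in a staging table inside the
-- warehouse; the transformation is a single set-based SQL statement rather
-- than row-by-row processing in an external ETL server.
INSERT INTO fact_orders (order_key, customer_key, date_key, order_amount, quantity)
SELECT s.order_id,
       c.customer_key,
       d.date_key,
       s.amount,
       s.qty
FROM   stg_orders   s
JOIN   dim_customer c ON c.customer_id = s.customer_id
JOIN   dim_date     d ON d.full_date   = s.order_date;
```

In practice, transformation models like these would typically live in a tool such as dbt as versioned, tested SQL, scheduled by an orchestrator such as Airflow.

Required Skills & Experience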
- 7-10 years of experience in data architecture, data engineering, or BI architecture roles.
- Proven hands-on experience with Snowflake (schema design, performance tuning, compute optimization, and cost governance); see the sketch after this list.
- Expertise in ETL/ELT tools and workflow orchestration (e.g., Airflow, dbt, Glue, Informatica, Matillion).
- Strong understanding of data modeling, warehousing concepts, and dimensional design (Star/Snowflake schema).
- Knowledge of data governance, lineage, security, and compliance frameworks.
- Experience with Python, SQL, or Spark for data transformation and pipeline development.
- Familiarity with modern data architectures such as data mesh, lakehouse, or data fabric.
- Excellent documentation, communication, and stakeholder alignment skills.
- Ability to work collaboratively with engineering, analytics, and business teams in an Agile environment.
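The Snowflake item above mentions compute optimization and cost governance; the sketch below shows the standard Snowflake commands involved (resource monitors, auto-suspending warehouses, clustering keys). The warehouse, monitor, and table names and the credit quota are assumed values for illustration, not actual account settings.

```sql
-- Hypothetical Snowflake cost-governance setup; names and quotas are invented.

-- Cap monthly spend: notify at 90% of the credit budget, suspend at 100%.
CREATE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 90  PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Right-size compute: start small, pause after 60 s idle, resume on demand.
CREATE WAREHOUSE analytics_wh
  WITH WAREHOUSE_SIZE      = 'XSMALL'
       AUTO_SUSPEND        = 60
       AUTO_RESUME         = TRUE
       INITIALLY_SUSPENDED = TRUE;

ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_cap;

-- Performance tuning: cluster a large table on its most common filter column
-- so queries prune micro-partitions instead of scanning the whole table.
ALTER TABLE fact_orders CLUSTER BY (date_key);
```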