Dynatron Software
About Dynatron
Dynatron is transforming the automotive service industry with intelligent SaaS solutions that drive measurable results for thousands of dealerships and service departments. Our proprietary analytics and workflow tools empower service leaders to boost profitability, enhance customer satisfaction, and unlock operational excellence. With accelerating growth, strong customer traction, and increasing market demand, we’re scaling, and we’re just getting started.
The Opportunity
We’re looking for an experienced, visionary Data Architect to join our expanding data organization. This is a critical role responsible for designing, governing, and optimizing the enterprise data architecture that powers scalable analytics, real-time data processing, AI/ML workflows, and secure data operations across the business.
You will architect end-to-end data ecosystems, spanning streaming, warehousing, lakehouse, governance, and ML enablement, to ensure high performance, extensibility, and long-term sustainability. The ideal candidate is deeply technical, hands-on with modern cloud data platforms, and highly skilled in data modeling, security, streaming architectures, and enterprise guardrails. If you thrive in complexity, think strategically, and deliver data foundations that scale, this role offers the opportunity to shape Dynatron’s data future.
What You’ll Do
Data Architecture & Modeling
Design scalable conceptual, logical, and physical data models supporting OLTP, OLAP, real-time analytics, and ML workloads.
Architect modular, domain-driven data structures for multi-domain analytics.
Apply modern modeling techniques, including 3NF, Dimensional Modeling, Data Vault, Medallion Architecture, and Data Mesh principles.
Define canonical models, conformed dimensions, and enterprise reference datasets.
Ensure performance, usability, and long-term maintainability of data schemas.
Real-Time & Streaming Architecture
Architect real-time ingestion and event-driven pipelines using Kafka, Kinesis, Pulsar, or Azure Event Hub.
Implement CDC frameworks such as Debezium, Fivetran, or StreamSets.
Design low-latency, high-throughput streaming architectures for operational and analytical use cases.
Build real-time data models supporting live analytics and data-driven decision-making.
ML/AI Data Architecture
Design ML-ready datasets, feature stores, and reproducible data pipelines.
Partner with ML and Data Science teams to enable production-grade model workflows.
Integrate modern AI/ML platform capabilities (e.g., Snowflake Cortex, Databricks Feature Store, AWS Bedrock).
Architect for drift detection, data quality monitoring, lineage visibility, retraining workflows, and model governance.
Cloud Data Platform Architecture
Design scalable architectures using Snowflake, Databricks, or other cloud-native platforms.
Build data pipelines using ADF, Databricks Workflows, AWS Glue, Step Functions, or equivalent technologies.
Optimize compute and storage performance leveraging Delta, Iceberg, Parquet, and lakehouse patterns.
Implement governance controls, including RBAC, masking, tokenization, and secure data sharing.
Data Security, Privacy & PII Protection
Architect secure data environments aligned with GDPR, CCPA, PCI, SOC 2, and other regulatory frameworks.
Implement encryption, masking, hashing, and IAM/RBAC policies.
Design retention, lineage, and access governance for sensitive data.
Collaborate with Compliance to ensure proper handling of PII/PHI and protected datasets.
Enterprise Governance & Guardrails
Define enterprise-wide modeling standards, data contracts, and schema evolution guidelines.
Establish reference architectures and curated “golden” datasets.
Create SLAs/SLOs across data domains to ensure reliability and quality.
Enforce adherence to governance, quality, and architectural frameworks.
Leadership, Mentorship & Collaboration
Mentor data engineers and guide architectural best practices.
Lead design reviews and cross-functional architectural discussions.
Partner closely with product, engineering, ML, and analytics teams to ensure alignment on data strategy.
Communicate risks, trade-offs, and long-term architectural impact with clarity.
Delivery, Scalability & Operational Excellence
Ensure data systems meet SLAs and scale with business demand.
Drive observability, monitoring, and alerting across data platforms.
Reduce technical debt through proactive governance and architectural discipline.
Support the full data lifecycle: design → build → deploy → govern.
What You Bring
Technical Expertise
7-10+ years of experience as a Data Architect or Senior Data Engineer in enterprise-scale environments.
Deep hands-on experience with Snowflake, Databricks, Azure Data Factory, AWS Glue, Bedrock, Redshift, BigQuery, or Teradata.
Strong SQL and Python/Scala skills, with expertise in schema design and metadata management.
Experience building streaming architectures with Kafka, Kinesis, or Event Hub.
Knowledge of ML/AI pipelines, feature stores, vector databases, and modern AI platform tooling.
Security & Privacy
Expertise in encryption, masking, tokenization, IAM, and RBAC.
Understanding of PII/PHI requirements and regulatory standards.
Experience implementing secure patterns across cloud platforms.
Cloud Architecture
Experience designing distributed systems across AWS, Azure, or GCP.
Strong understanding of compute scaling, storage layers, and cloud-native services.
Soft Skills & Leadership
Strong documentation skills, including architectural diagrams, ADRs, and playbooks.
Excellent communication skills, with the ability to influence at all levels.
Proven mentorship, leadership, and cross-functional collaboration.
Strategic thinker with a high degree of ownership and accountability.
Nice To Have
Experience with data mesh and domain-driven design.
Experience with Snowflake Cortex, Databricks AI, or AWS Bedrock.
Expertise in lakehouse architectures (Delta, Iceberg, Hudi).
Background in large-scale modernization or cloud migration initiatives.
Why Dynatron
Opportunity to architect the data foundation of a rapidly growing SaaS organization.
High-impact role with visibility across engineering, product, ML, analytics, and executive teams.
Values-driven culture built on accountability, urgency, positivity, and delivering results.
Remote-first environment offering flexibility and autonomy.
Compensation
Base Salary: $140,000 - $180,000/yr
Equity: Participation in Dynatron’s Equity Incentive Plan
Benefits Summary
Comprehensive health, vision, and dental insurance
Employer-paid short- and long-term disability and life insurance
401(k) with competitive company match
Flexible vacation policy and 11 paid holidays
Remote-first culture
Ready to shape the future of Dynatron’s data architecture and build scalable, intelligent systems that power our next stage of growth? Join us as we build the data foundation for extraordinary outcomes.