AI Specialist I
Strategic Staffing Solutions, Charlotte, North Carolina, United States, 28245
Location: Charlotte, NC
Type: Contract or Full-Time
Compensation:
About the Role
We are seeking a highly skilled Machine Learning Engineer / AI Specialist to join a dynamic and fast-evolving data science team. The ideal candidate will bring strong technical expertise in AWS SageMaker, Python programming, and MLOps practices, along with a deep understanding of scalable model development and deployment in production environments.
This role is ideal for someone who thrives at the intersection of data science, engineering, and automation: building, optimizing, and maintaining robust machine learning systems that drive real business impact.
Key Responsibilities
- Design, develop, and deploy machine learning models using AWS SageMaker (a brief sketch follows this list).
- Build and maintain ML pipelines for model training, validation, and deployment.
- Implement MLOps best practices, including CI/CD workflows for model lifecycle automation.
- Collaborate closely with data scientists to productionize research models.
- Monitor and optimize model performance, cost, and reliability; implement automated retraining processes.
- Develop and maintain model versioning, experiment tracking, and data validation frameworks.
- Debug and maintain Terraform and Concourse pipelines; proactively update them based on organizational changes.
- Migrate repositories to GitHub and update associated pipelines for continuous integration.
- Ensure data quality, governance, and reproducibility of model outputs.
- Participate in code reviews, maintain clean, modular code, and create detailed technical documentation.
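For illustration only (not part of the posting): a minimal sketch of the first responsibility above, training and deploying a model with the SageMaker Python SDK. The entry-point script, S3 path, container version, and instance types are hypothetical placeholders, and an AWS execution role is assumed to be available.

import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = sagemaker.get_execution_role()  # assumes a SageMaker execution role is available

# Configure a managed training job for a scikit-learn training script.
estimator = SKLearn(
    entry_point="train.py",          # hypothetical training script
    framework_version="1.2-1",       # example scikit-learn container version
    instance_type="ml.m5.large",
    instance_count=1,
    role=role,
    sagemaker_session=session,
)

# Launch training against data already staged in S3 (hypothetical path).
estimator.fit({"train": "s3://example-bucket/training-data/"})

# Deploy the trained model behind a real-time inference endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")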
Required Qualifications
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field (or 8+ years of equivalent experience).
- 3+ years of experience in machine learning engineering, AI development, or data science operations.
- Strong Python programming skills; proficiency in NumPy, Pandas, Scikit-learn, and related libraries (see the sketch after this list).
- Hands-on experience with AWS SageMaker for training, tuning, and deploying models.
- Solid background in data science methodologies and statistical analysis.
- Experience with Infrastructure-as-Code tools (Terraform, CloudFormation).
- Deep understanding of MLOps, containerization (Docker, Kubernetes), and CI/CD pipelines.
- Familiarity with GitHub Actions, version control, and collaborative development workflows.
- Working knowledge of AWS services (S3, EC2, Lambda, CloudWatch).
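For illustration only (not part of the posting): a minimal sketch of the Python / Pandas / Scikit-learn proficiency listed above, fitting a small classification model. The CSV file and the target column are hypothetical placeholders.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_churn.csv")        # hypothetical dataset
X = df.drop(columns=["churned"]).to_numpy()   # features as a NumPy array
y = df["churned"].to_numpy()                  # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Evaluate on the held-out split.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")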
Preferred Qualifications
- Master's degree in a relevant technical field.
- AWS Certifications (e.g., Machine Learning Specialty, Solutions Architect).
- Experience with monitoring tools (Prometheus, Grafana, CloudWatch) and big data frameworks (EMR, Spark, Hadoop); a CloudWatch sketch follows this list.
- Strong SQL expertise (CTEs, indexes, stored procedures, and performance optimization).
- Experience with ETL tools (SSIS, Sqoop, Spark).
- Hands-on experience building classification and regression models.
- Familiarity with software engineering best practices and design patterns.
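For illustration only (not part of the posting): a minimal sketch of model monitoring with one of the tools named above, publishing a custom model-quality metric to Amazon CloudWatch via boto3. The namespace and metric value are hypothetical, and AWS credentials and region are assumed to be configured in the environment.

import boto3

cloudwatch = boto3.client("cloudwatch")

# Push a custom model-quality metric that dashboards or alarms can watch.
cloudwatch.put_metric_data(
    Namespace="Example/ModelMonitoring",   # hypothetical namespace
    MetricData=[
        {
            "MetricName": "ValidationAUC",
            "Value": 0.91,                 # hypothetical evaluation result
            "Unit": "None",
        }
    ],
)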