TekWissen®
Base pay range
$71.00/hr - $71.00/hr
Location Seattle, WA 98121
Duration 5 Months
Job Type Temporary Assignment
Work Type Onsite
Overview TekWissen is a global workforce management provider headquartered in Ann Arbor, Michigan, that offers strategic talent solutions to clients worldwide. Our client operates a marketplace for consumers, sellers, and content creators, offering both merchandise and content purchased for resale from vendors and items sold by third-party sellers.
Job Description
Client Devices & Services Sustainability is building the future of sustainable product development at scale. Sustain.AI (Sustainability Technology Amplifying Innovation) is our AI and automation team, which enables sustainability experts to focus on what matters most: reducing Client Devices' carbon footprint.
We build intelligent systems that transform manual sustainability workflows into automated, data-driven processes, enabling our organization to scale our environmental impact without scaling overhead.
Our mission is to eliminate operational bottlenecks through automation and AI, creating the technical foundation that enables breakthrough sustainability innovations essential for net zero.
We're looking for a Data Engineer to join Sustain.AI for a focused engagement to lead the migration of critical data infrastructure that powers sustainability metrics, carbon footprint calculations, and Climate Pledge Friendly badging across Client Devices.
Responsibilities
As a Data Engineer on the Sustain.AI team, you will be working with business-critical sustainability data pipelines.
You will be responsible for migrating production ETL jobs from legacy data platforms to modern AWS infrastructure, ensuring zero disruption to operations that drive corporate carbon footprinting, quarterly business reviews, and customer‑facing sustainability communications.
You will establish the technical foundation for our data infrastructure by setting up Redshift clusters, AWS Glue, and workflow orchestration platforms in new AWS accounts.
You will translate existing data workflows into scalable AWS pipelines, implementing robust validation to ensure data consistency and quality throughout the migration.
You will work across multiple critical data domains including sales data, active device telemetry, transportation carbon emissions, energy consumption metrics, sustainability badging workflows, and executive reporting dashboards.
You will partner with Science team members who own carbon calculation methodologies and program teams who depend on these pipelines for daily operations, understanding their requirements and ensuring migrated solutions support their analytical and operational needs.
You will coordinate with upstream data owners to secure access permissions, collaborate with platform teams to troubleshoot integration issues, and communicate migration progress and risks to stakeholders across the organization.
Clear communication will be essential as you navigate dependencies across multiple teams and ensure alignment on migration timelines and success criteria.
You will be responsible for comprehensive documentation of the new architecture, including data lineage, dependencies, monitoring strategies, and operational runbooks. This documentation will enable the Sustain.AI team to maintain and evolve the infrastructure after your engagement completes.
You will implement data quality checks and alerting to proactively identify issues before they impact downstream consumers.
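Purely as an illustration of the kind of data quality gate described above (the column name, thresholds, and alert format are hypothetical placeholders, not drawn from the client's systems), a minimal pre-publish check in Python might look like this:

```python
# Illustrative only: a minimal data-quality gate run before publishing a migrated batch.
# Column name, thresholds, and the alert format are hypothetical placeholders.
def check_quality(rows, min_rows=1, max_null_rate=0.01, key="carbon_kg"):
    """Return 'OK' or an alert string for a batch of row dicts."""
    if len(rows) < min_rows:
        return f"ALERT: expected at least {min_rows} rows, got {len(rows)}"
    null_rate = sum(1 for r in rows if r.get(key) is None) / len(rows)
    if null_rate > max_null_rate:
        return f"ALERT: {key} null rate {null_rate:.1%} exceeds {max_null_rate:.1%}"
    return "OK"

# Example: one of two rows is missing its emissions value, so the check alerts.
batch = [{"device_id": 1, "carbon_kg": 0.42}, {"device_id": 2, "carbon_kg": None}]
print(check_quality(batch))  # -> ALERT: carbon_kg null rate 50.0% exceeds 1.0%
```

In practice the same kind of check would be wired into the pipeline's alerting so issues surface before downstream consumers are affected.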
Required Skills & Experience
3‑6 years building and maintaining production data pipelines in complex enterprise environments. Strong SQL skills for writing complex transformations, optimizing query performance, and troubleshooting data quality issues across large datasets.
Hands‑on experience with AWS data services including Redshift, S3, Glue, and IAM for data access management.
Deep understanding of ETL best practices including data validation, error handling, idempotency, and lineage tracking (a brief illustrative sketch of an idempotent load follows this list).
Experience with Python or Spark for implementing data transformation logic.
Strong analytical skills and ability to work with large, complex datasets spanning multiple source systems.
Excellent communication and stakeholder management skills, with ability to coordinate across multiple technical teams and translate technical concepts for non‑technical audiences.
Ability to partner directly with technical stakeholders to understand requirements and deliver solutions that meet operational needs.
Proven track record managing data migrations or large‑scale infrastructure transitions with minimal business disruption.
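As a purely illustrative sketch of the idempotency practice noted above (sqlite3 stands in for the actual warehouse; the table and column names are hypothetical), an idempotent load deletes and reinserts a batch in one transaction so that re-running it never duplicates rows:

```python
# Illustrative only: an idempotent load pattern (delete + insert in one
# transaction keyed by batch), so re-running a batch never duplicates rows.
# sqlite3 stands in for the real warehouse; table and columns are hypothetical.
import sqlite3

def load_batch(conn, batch_id, rows):
    with conn:  # one transaction: the delete and insert commit or roll back together
        conn.execute("DELETE FROM emissions WHERE batch_id = ?", (batch_id,))
        conn.executemany(
            "INSERT INTO emissions (batch_id, device_id, carbon_kg) VALUES (?, ?, ?)",
            [(batch_id, r["device_id"], r["carbon_kg"]) for r in rows],
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emissions (batch_id TEXT, device_id INT, carbon_kg REAL)")
rows = [{"device_id": 1, "carbon_kg": 0.42}]
load_batch(conn, "2024-06-01", rows)
load_batch(conn, "2024-06-01", rows)  # re-run: the batch still appears exactly once
print(conn.execute("SELECT COUNT(*) FROM emissions").fetchone()[0])  # -> 1
```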
Preferred Qualifications
Experience with workflow orchestration platforms.
QuickSight experience for dashboard development.
Background in large‑scale data migrations.
Familiarity with sustainability data, carbon accounting, or environmental metrics.
Experience setting up monitoring and alerting for production data pipelines.
Compelling Story & Candidate Value Proposition
Work closely with Data Engineers, Data Scientists, and Staffing Technology.
Able to work with AI, Software Engineering, and Client Science teams.
Candidate Requirements Degree/Experience
Bachelor's Degree in Data Engineering or a related field.
3‑6 years in Data Engineering building and maintaining production data pipelines.
Experience in SQL and ETL.
Former client employees are preferred.
Disqualifier: no prior experience in Data Engineering.
Best vs. Average
Great communication skills and the ability to work laterally across teams while migrating pipelines.
Top 3 must-have hard skills
ETL
SQL
TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Seniority level Associate
Employment type Contract
Job function Other
Industries IT Services and IT Consulting