FastTek Global

Data Engineer #1039117

FastTek Global, Dearborn, Michigan, United States, 48120


Job Details

Location: Dearborn, Michigan

Position: Data Engineer

Job ID: 1039117

Responsibilities

Data Pipeline Architect & Builder: Spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources.

Ensure data is standardized, high-quality, and optimized for analytical use.

Leverage cutting‑edge tools and technologies, including Python, SQL, and DBT/Dataform, to build robust and efficient data pipelines.

End‑to‑end Integration Expert: Utilize full‑stack skills to contribute to seamless end‑to‑end development, ensuring smooth and reliable data flow from source to insight.

GCP Data Solutions Leader: Leverage deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet and exceed business needs.

Data Governance & Security Champion: Implement and manage robust data governance policies, access controls, and security best practices, fully utilizing GCP's native security features to protect sensitive data.

Data Workflow Orchestrator: Employ Astronomer and Terraform for efficient data workflow management and cloud infrastructure provisioning, championing best practices in Infrastructure as Code (IaC).

Performance Optimization Driver: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost‑effectiveness.

Collaborative Innovator: Collaborate effectively with data architects, application architects, service owners, and cross‑functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering.

Automation & Reliability Advocate: Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency.

Effective Communicator: Clearly and transparently communicate complex technical decisions to both technical and non‑technical stakeholders, fostering understanding and alignment.

Continuous Learner: Stay ahead of the curve by continuously learning about industry trends and emerging technologies, proactively identifying opportunities to improve our data platform and enhance our capabilities.

Business Impact Translator: Translate complex business requirements into optimized data asset designs and efficient code, ensuring our data solutions directly contribute to business goals.

Documentation & Knowledge Sharer: Develop comprehensive documentation for data engineering processes, promoting knowledge sharing, facilitating collaboration, and ensuring long‑term system maintainability.

Required Skills

Python

SQL

DBT / Dataform

GCP (BigQuery, Dataflow, Pub/Sub, Cloud Functions, DataProc)

NoSQL (e.g., MongoDB)

Relational databases (PostgreSQL, MySQL)

Kafka

Astronomer

Terraform

Infrastructure as Code (IaC)

CI/CD pipelines and automation frameworks

Experience & Qualifications

Senior Engineer – 5 to 7 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands‑on experience building and deploying cloud‑based data platforms (GCP preferred).

Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud‑based data pipelines using GCP services like BigQuery, Dataflow, and DataProc.

Solid understanding of Service‑Oriented Architecture (SOA) and microservices, and their application within a cloud data platform.

Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery).

Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments.

Familiarity with CI/CD pipelines and Infrastructure as Code tools such as Terraform and Tekton.

Excellent analytical and problem‑solving skills, with the ability to troubleshoot complex data platform and microservices issues.

Experience monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc).

Education

Bachelor's Degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience).

Benefits & Company Culture

FastTek Global is a privately held company providing consultant and client‑focused services. We value responsibility, community impact, and a flexible, creative, honest work environment. We have been in business for 24 years.

Medical and Dental (FastTek pays the majority of the medical program)

Vision

Personal Time Off (PTO) Program

Long‑Term Disability (100% paid)

Life Insurance (100% paid)

401(k) with immediate vesting and 3% dollar‑for‑dollar match

AI & Hiring Disclosure

We use AI tools to support parts of our hiring process, such as reviewing applications and identifying potential matches. These tools are designed to promote efficiency, consistency, and fairness, and they are always used under human oversight. All personal data collected is used solely for recruitment purposes, and you have the right to know, access, or request deletion of your data at any time, subject to legal limits. If AI will be used in a video interview, you will be informed in advance and asked for your consent, with the option to opt out. Our tools are regularly reviewed to detect potential bias and to ensure compliance with all applicable laws and our commitment to inclusive hiring. To learn more or exercise your rights, please contact info@fasttek.com.
