Stefanini North America and APAC
Data Pipeline Architect & Builder
Stefanini North America and APAC, Dearborn, Michigan, United States, 48120
Details:
Job Description:
Stefanini Group is hiring!
Stefanini is looking for a Data Pipeline Architect & Builder in Dearborn, MI (Onsite). For quick apply, please reach out to Vasudha Lakshmi at 248-263-5273 / vasudha.l@stefanini.com.
We are looking for a Data Pipeline Architect & Builder to spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources; ensure data is standardized, high-quality, and optimized for analytical use; and leverage modern tools and technologies, including Python, SQL, and dbt/Dataform, to build robust and efficient data pipelines.

Responsibilities:
- Use your full-stack (end-to-end) integration skills to contribute to seamless end-to-end development, ensuring smooth and reliable data flow from source to insight.
- Act as a GCP data solutions leader, leveraging deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs and expectations.
- Serve as a data governance and security champion by implementing and managing robust data governance policies, access controls, and security best practices, using GCP-native security features to protect sensitive data.
- Orchestrate data workflows using Astronomer and Terraform for efficient workflow management and cloud infrastructure provisioning, championing Infrastructure as Code (IaC) practices.
- Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost-effectiveness.
- Collaborate with data architects, application architects, service owners, and cross-functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering.
- Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency.
- Communicate complex technical decisions clearly and transparently to both technical and non-technical stakeholders, fostering understanding and alignment.
- Maintain a continuous-learner mindset by staying ahead of industry trends and emerging technologies, proactively identifying opportunities to improve the data platform and its capabilities.
- Translate complex business requirements into optimized data asset designs and efficient code, ensuring that data solutions contribute to business goals.
- Develop comprehensive documentation for data engineering processes, promote knowledge sharing, facilitate collaboration, and ensure long-term system maintainability.

Experience Required:
- Expertise in NoSQL, PostgreSQL, Kafka, GCP, and Python.
- 5-7 years of experience in data engineering or software engineering.
- Strong proficiency in SQL, Java, and Python, with experience designing and deploying cloud-based data pipelines using GCP services such as BigQuery, Dataflow, and Dataproc.
- Solid understanding of SOA and microservices in a cloud data platform context.
- Experience with relational databases (PostgreSQL, MySQL), NoSQL databases, and columnar databases (BigQuery).
- Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments.
- Familiarity with CI/CD pipelines, IaC tools such as Terraform and Tekton, and other automation frameworks.
- Excellent analytical and problem-solving skills for troubleshooting data platform and microservices issues.
- Experience monitoring and optimizing cost and compute resources for GCP technologies (BigQuery, Dataflow, Cloud Run, Dataproc).
Experience Preferred:
- At least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred).

Education Required:
- Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or an equivalent combination of education and experience).

Listed salary ranges may vary based on experience, qualifications, and local market. Some positions may include bonuses or other incentives.

Stefanini takes pride in hiring top talent and developing relationships with our future employees. Our talent acquisition teams will never make an offer of employment without having a phone conversation with you. Those conversations will describe the job for which you have applied and discuss the process, including interviews and job offers.

About Stefanini Group:
The Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world. Our presence includes the Americas, Europe, Africa, and Asia, with more than four hundred clients across various markets. Stefanini is a CMM Level 5 IT consulting company with a global presence.

Seniority level: Mid-Senior level
Employment type: Contract
Job function: Engineering and Information Technology
Industries: Business Consulting and Services