ManpowerGroup Global, Inc. – Charlotte, North Carolina, United States, 28245
Job Title: Sr. Data Engineer
Duration: 18-Month Contract
Location: 100 W. Worthington Avenue, Charlotte, NC 28203
Hybrid: Tuesday and Thursday
Rate: $45/hr on W2
Interview Process: 2 rounds via Teams; interviews may be in person
Round 1: Initial technical screening
Round 2: Hands-on coding panel of 3
Disqualifiers: Lack of coding skills; candidates whose experience is limited to ETL development
Job Description
Overall Responsibilities
Translates complex cross‑functional business requirements and functional specifications into logical program designs, code modules, stable application systems, and data solutions; partners with Product Team to understand business needs and functional specifications.
Collaborates with cross‑functional teams to ensure specifications are converted into flexible, scalable, and maintainable solution designs; evaluates project deliverables to ensure they meet specifications and architectural standards.
Contributes to the design and build of complex data solutions and ensures the architecture blueprint, standards, target state architecture, and strategies are aligned with the requirements.
Coordinates, executes, and participates in component integration testing (CIT) scenarios, systems integration testing (SIT), and user acceptance testing (UAT) to identify application errors and ensure quality software deployment.
Participates in all end-to-end software development product lifecycle phases by applying and sharing an in-depth understanding of complex industry methodologies, policies, standards, and controls.
Develops detailed architecture plans for large-scale enterprise architecture projects and drives those plans to fruition.
Solves complex architecture, design, and business problems; ensures solutions are extensible; works to simplify, optimize, and remove bottlenecks.
Data Engineering Responsibilities
Executes the development, maintenance, and enhancement of data ingestion solutions of varying complexity across data sources such as DBMS, file systems (structured and unstructured), APIs, and streaming, on on-prem and cloud infrastructure; demonstrates strong acumen in data ingestion toolsets and nurtures and grows junior members in this capability.
Builds, tests, and enhances data curation pipelines that integrate data from a wide variety of sources such as DBMS, file systems, APIs, and streaming systems for KPI and metrics development with high data quality and integrity.
Supports the development of features/inputs for data models in an Agile manner; hosts models via REST APIs; ensures non-functional requirements such as logging, authentication, error capture, and concurrency management are accounted for when hosting models (see the illustrative sketch after this list).
Works with Data Science team to understand mathematical models and algorithms; recommends improvements to analytic methods, techniques, standards, policies and procedures.
Ensures the manipulation and administration of data and systems are secure and compliant with industry best practices, enterprise standards, corporate policy, and department procedures; handles data extraction, loading, transformation, visualization, and system administration in accordance with enterprise data governance standards.
Maintains the health and monitoring of assigned data engineering capabilities that span analytic functions by triaging maintenance issues; ensures high availability of the platform; works with Infrastructure Engineering teams to maintain the data platform; serves as an SME of one or more applications.
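For illustration only, not part of the client's codebase: a minimal sketch of the model-hosting responsibility above, serving a model behind a REST endpoint with logging, token-based authentication, and error capture. The framework choice (Flask), endpoint path, header, and environment variable are assumptions, not requirements of the role.

# Minimal illustrative sketch only: hosting a model behind a REST endpoint
# with logging, token-based authentication, and error capture. Names below
# (model function, header, env var) are hypothetical placeholders.
import logging
import os

from flask import Flask, jsonify, request

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("model-service")

app = Flask(__name__)
API_TOKEN = os.environ.get("MODEL_API_TOKEN", "change-me")  # assumed config


def score(features):
    """Placeholder for the real model call (e.g., a loaded ML model)."""
    return sum(float(v) for v in features)


@app.route("/predict", methods=["POST"])
def predict():
    # Authentication: reject requests without the expected bearer token.
    auth = request.headers.get("Authorization", "")
    if auth != f"Bearer {API_TOKEN}":
        logger.warning("Unauthorized request from %s", request.remote_addr)
        return jsonify({"error": "unauthorized"}), 401

    # Error capture: validate input and log failures instead of crashing.
    try:
        payload = request.get_json(force=True)
        prediction = score(payload["features"])
        logger.info("Scored request with %d features", len(payload["features"]))
        return jsonify({"prediction": prediction}), 200
    except (KeyError, TypeError, ValueError) as exc:
        logger.exception("Bad request: %s", exc)
        return jsonify({"error": "invalid input"}), 400


if __name__ == "__main__":
    # Concurrency in production would typically come from a WSGI server
    # (e.g., gunicorn with multiple workers) rather than the dev server.
    app.run(host="0.0.0.0", port=8080)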
BI Engineering Responsibilities
Responsible for the development, maintenance, and enhancement of BI solutions of varying complexity across data sources such as DBMS and file systems (structured and unstructured) on on-prem and cloud infrastructure; creates level metrics and other complex metrics; uses custom groups, consolidations, drilling, and complex filters.
Demonstrates database skills (Teradata, Oracle, DB2, Hadoop) by writing views for business requirements; uses freeform SQL and pass-through functions; analyzes and resolves errors in SQL generation; creates RSDs and dashboards.
Responsible for building, testing, and enhancing BI solutions from a wide variety of sources such as Teradata, Hive, HBase, Google BigQuery, and file systems; develops solutions with optimized data performance and data security.
Works with business analysts to understand requirements and create dashboard/dossier wireframes; uses widgets and Vitara charts to make dashboards/dossiers visually appealing.
Coordinates and takes necessary actions from the DART side for application upgrades (e.g., Teradata, Workday), storage migration, and user-management automation; supports areas such as cluster management and project configuration settings.
Day to Day Responsibilities
Attends Scrum calls to provide daily status updates.
Picks up assigned JIRA stories and works with Analysts or Product Owners to understand requirements.
Responsible for the development, maintenance, and enhancement of data engineering solutions of varying complexity across data sources such as DBMS and file systems (structured and unstructured) on on-prem and cloud infrastructure; creates level metrics and other complex metrics.
Demonstrates data engineering skills (Spark DataFrames, BigQuery) by writing pipelines for business requirements.
Responsible for building, testing, and enhancing data pipeline solutions from a wide variety of sources such as Kafka streams, Google BigQuery, and file systems; develops solutions with optimized data performance and data security (see the illustrative sketch below).
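For illustration only: a minimal PySpark sketch of the kind of pipeline the day-to-day items above describe, a streaming read from Kafka written to BigQuery in micro-batches via the spark-bigquery connector. The topic, schema, table, bucket, and checkpoint path are hypothetical placeholders, and the exact connectors and settings would depend on the client's environment.

# Illustrative sketch only, assuming PySpark with the Kafka source and the
# spark-bigquery connector available on the cluster. Topic, schema, table,
# and bucket names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Hypothetical event schema for the Kafka payload.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("updated_at", TimestampType()),
])

# Streaming read from Kafka; the value arrives as bytes and is parsed as JSON.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)


def write_batch(batch_df, batch_id):
    # Micro-batch write to BigQuery via the spark-bigquery connector.
    (batch_df.write.format("bigquery")
        .option("table", "my_project.analytics.orders")
        .option("temporaryGcsBucket", "my-staging-bucket")
        .mode("append")
        .save())


query = (
    events.writeStream.foreachBatch(write_batch)
    .option("checkpointLocation", "gs://my-staging-bucket/checkpoints/orders")
    .start()
)
query.awaitTermination()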
Minimum Qualifications
Bachelor’s degree in Engineering, Computer Science, CIS, or a related field (or equivalent work experience in a related field).
5 years of experience in Data or BI Engineering, Data Warehousing/ETL, or Software Engineering.
4 years of experience working on projects involving the implementation of solutions using software development life cycles (SDLC).
Required Skills
Spark Framework – 4 years minimum.
Kafka – 3 years.
GCP – 3 years.
Airflow – 3 years.
BigQuery.
Preferred Skills
Java – 3 years.
Python – 3 years.
Microservices – 2 years.
SQL – 4 years.
Required Testing: The candidate should take ownership of testing features end to end.
Soft Skills Required: The candidate is required to communicate effectively with business users.
If this is a role that interests you and you’d like to learn more, click apply now and a recruiter will be in touch with you to discuss this great opportunity. We look forward to speaking with you!
About ManpowerGroup, Parent Company of Manpower, Experis, Talent Solutions, and Jefferson Wells.
ManpowerGroup® (NYSE: MAN), the leading global workforce solutions company, helps organizations transform in a fast‑changing world of work by sourcing, assessing, developing, and managing the talent that enables them to win. We develop innovative solutions for hundreds of thousands of organizations every year, providing them with skilled talent while finding meaningful, sustainable employment for millions of people across a wide range of industries and skills. Our expert family of brands – Manpower, Experis, Talent Solutions, and Jefferson Wells – creates substantial value for candidates and clients across more than 75 countries and territories and has done so for over 70 years. We are recognized consistently for our diversity – as a best place to work for Women, Inclusion, Equality and Disability and in 2022 ManpowerGroup was named one of the World’s Most Ethical Companies for the 13th year – all confirming our position as the brand of choice for in-demand talent.