Motion Recruitment
Overview
Outstanding long-term contract opportunity! A well-known financial services company is looking for an Infrastructure Engineer / Cloud and Big Data Tools Engineer in Dallas, TX or Charlotte, NC (hybrid).

Contract Duration: 12 months+ with possible extensions | W2 only - Green Card, USC, or H4EAD only

What You Will Be Doing
- Administer and support tools on the data private cloud, including CDP, HWX, and MapR.
- Install, configure, and maintain data analytics and virtualization tools such as Dremio, JupyterHub, and AtScale across multiple clusters.
- Develop proof-of-concept solutions leveraging CDP and OCP technologies.
- Deploy tools, troubleshoot issues, perform root cause analysis, and remediate vulnerabilities.
- Act as a technical subject matter expert, supporting programming staff during development, testing, and implementation phases.
- Develop automation scripts for the configuration and maintenance of data virtualization tools.
- Lead complex platform design, coding, and testing efforts.
- Drive advanced modeling, simulation, and analysis initiatives.
- Maintain comprehensive documentation of Hadoop cluster configurations, processes, and procedures.
- Generate reports on cluster usage, performance metrics, and capacity utilization.
- Work closely with data engineers, data scientists, and other stakeholders to understand their requirements and provide support.
- Collaborate with IT infrastructure teams to integrate Dremio and Hadoop clusters with existing systems and services.

Required Skills & Experience
- Strong experience with big data platforms: MapR, Hortonworks, and Cloudera Data Platform (CDP).
- Hands-on expertise with data virtualization tools: Dremio, JupyterHub, and AtScale.
- Proficiency in deploying and managing tools in cloud and containerized environments (CDP, OCP).
- Solid understanding of platform engineering, automation scripting, and DevOps practices.
- Proven ability to troubleshoot complex issues and perform root cause analysis.
- Experience leading technical efforts and mentoring team members.
- Experience with Dremio, Hadoop, Splunk, and Grafana.

Preferred Qualifications
- Certifications in Cloudera, OpenShift, or related technologies.
- Experience with enterprise-level data lake architectures and governance.

Location
Dallas, TX or Charlotte, NC (Hybrid)

Seniority level
Mid-Senior level

Employment type
Contract