J.P. Morgan
DESCRIPTION
Duties: Conduct data analysis to support model development, including documenting metadata and data dictionaries.
Participate in strategic projects, applying quantitative analytics to produce actionable business insights and influence strategy.
Drive data-driven decision-making and strategic initiatives through ongoing data analysis.
Collaborate with business partners to identify impactful projects, influence key decisions with data, and ensure client satisfaction.
Utilize AWS Cloud services, such as RDS and Redshift, for data management and analysis.
Perform user acceptance testing and deliver demos to stakeholders using SQL queries or Python scripts.
Design and implement cloud-based solutions by integrating RDS with EKS to support scalable application deployments.
Monitor and optimize the performance of cloud-native applications deployed on EKS.
Develop and maintain applications that track the status of Alteryx pipelines, ensuring seamless ETL operations.
Collaborate with cross-functional teams to automate data workflows with ETL tools such as PySpark, enabling advanced querying and transformation in SQL and Python.
Leverage PySpark to process large datasets and build KPI-based tables that feed comprehensive reports and actionable insights.
Track and analyze real-time customer behavior to inform business strategies and optimize decision‑making.
Perform user acceptance testing to validate the functionality and reliability of deployed systems, and give stakeholders hands-on access for testing.
Ensure seamless integration among AWS services such as Athena, RDS, and EKS, tools such as Alteryx, and reporting tools such as Tableau and QuickSight to enhance system reliability and scalability.
Continuously monitor system performance and recommend improvements for data pipelines and cloud infrastructure.
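As a minimal illustration of the "deliver demos to stakeholders using SQL queries or Python scripts" and user-acceptance-testing duties above, the sketch below runs a KPI-style SQL query against an in-memory SQLite database standing in for RDS/Athena; the schema, data, and acceptance criterion are hypothetical, not taken from the posting.

```python
import sqlite3

# Hypothetical UAT-style demo: load sample rows into an in-memory SQLite
# database (a stand-in for RDS/Athena) and validate a reporting query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)],
)

# KPI query: total spend per customer, highest first.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()

# A simple acceptance criterion a stakeholder walkthrough might check.
assert rows[0] == ("acme", 200.0)
print(rows)
conn.close()
```

In a real demo the same query would run against the production-like data store, with the assertion replaced by the stakeholder's sign-off criteria.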
QUALIFICATIONS
Minimum education and experience required: Master's degree in Information Technology Management, Business Analytics, Data Science, Information Technology, Computer Information Systems, or related field of study plus 3 years of experience in the job offered or as Data Scientist, Data Analyst, Business Analyst, Analytics, or related occupation.
Skills Required: This position requires three (3) years of experience with the following: working in a big data environment using AWS, Python, Spark, and SQL.
Skills Required: This position requires two (2) years of experience with the following: using AWS services including Oracle RDS, Aurora RDS, Lambda, S3, SQS, SNS, KMS, Step Functions, ECS, EKS, Athena, and Lake Formation; using Terraform for infrastructure as code; using CI/CD pipelines and tools including Jenkins and Spinnaker; observability with CloudWatch, Grafana, Splunk, and Datadog; developing file compression algorithms for improved efficiency and security; developing custom Lake Formation security permissions on data elements using AWS Athena; structured and unstructured database design, optimization, and management; designing APIs with X.509, bearer-token, and certificate-based authentication methodologies; performing Gremlin (chaos) and unit testing; designing and developing applications using KMS security and certificate stores; performing and managing operational resiliency, including root cause analysis and developing and implementing solutions; and the financial services industry and its IT systems.
Skills Required: This position requires any amount of experience with the following: using boto3 libraries with Python or R; using data wrangling tools including Alteryx; using data technologies including Oracle and AWS Cloud; Excel pivot tables; data migration and ETL processes on AWS; cloud-native and microservice architectures.
Job Location: 8181 Communications Parkway, Plano, TX 75024