Andiamo
This range is provided by Andiamo. Your actual pay will be based on your skills and experience; talk with your recruiter to learn more.
Base pay range
$75.00/hr - $130.00/hr

Senior Software Engineer

Our client is transforming the government sector by delivering one of the most sophisticated research platforms on the market, with a strong emphasis on automation, analytics, and real-time answers. Their goal is to be an indispensable tool for both private and public sector professionals, supporting day-to-day tasks and providing solutions that help users get timely insights to better serve their organizations.

The Team
The Cloud Solutions team is responsible for designing and building the client's data platform and developing the tools that support it. The team leverages AWS technologies such as Lambda, S3, and Step Functions to create robust ingestion workflows, while also implementing a Databricks Lakehouse architecture. Engineers are encouraged to display creativity and deliver continuous end-user value in an Agile environment. This is a team of self-motivated engineers who research, learn, and apply modern technologies to anticipate and meet customer needs.
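As an illustration of the kind of S3-and-Lambda ingestion workflow described above, here is a minimal sketch of an S3-triggered Lambda handler. This is hypothetical code, not the client's actual implementation; it simply extracts the bucket and object key from each record in a standard S3 event notification so a downstream step, such as a Step Functions task, knows which file to process:

```python
import urllib.parse


def handler(event, context=None):
    """Entry point for a hypothetical S3-triggered ingestion Lambda.

    Pulls the bucket and object key out of each S3 put-event record and
    returns them in a shape a downstream workflow step can consume.
    """
    files = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        files.append({"bucket": bucket, "key": key})
    return {"files": files}
```

In a real pipeline the returned file list would typically feed a Step Functions state machine or a Databricks job rather than being consumed directly.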
Project Description

This role will focus on migrating legacy data ingestion pipelines into a shared data management system (BDMS). Key aspects of the project include:

- Loading datasets into BDMS.
- Creating new tools that allow data analysts to modify datasets.
- Enabling analysts to validate and cleanse data before delivery to AWS S3.
- Enriching data with Databricks, AWS Lambda, Step Functions, and Jupyter Notebooks/Workflows, then loading it into Delta tables.
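To make the validate-and-cleanse step concrete, here is a small illustrative sketch; the helper name and rules are hypothetical, not part of BDMS. It trims string fields and splits records into valid and rejected sets before they would be delivered to S3:

```python
def cleanse_records(records, required_fields):
    """Split raw records into valid and rejected sets.

    A record is valid when every required field is present and non-empty;
    string values are stripped of surrounding whitespace along the way.
    """
    valid, rejected = [], []
    for record in records:
        cleaned = {
            key: value.strip() if isinstance(value, str) else value
            for key, value in record.items()
        }
        if all(cleaned.get(f) not in (None, "") for f in required_fields):
            valid.append(cleaned)
        else:
            rejected.append(cleaned)
    return valid, rejected
```

A real implementation would likely add per-field type checks and route the rejected set back to analysts for correction before delivery.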
Responsibilities

You will be trusted to:

- Work directly with product owners and engineers to design and build solutions.
- Collaborate with data analysts and scientists to identify efficiencies in data collection and opportunities for workflow automation.
- Develop infrastructure as code (IaC) and apply DevOps principles.
- Build and manage microservices using AWS serverless technologies.

Requirements
- 4+ years of programming experience in Python or Java.
- Strong understanding of database systems and SQL (including Oracle and PostgreSQL).
- A degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent experience.
- Hands-on experience with AWS services such as S3, Lambda, and CloudWatch.

Preferred Qualifications
- Experience migrating data from legacy systems to the cloud.
- Knowledge of Spark, Databricks, and infrastructure-as-code tools such as Terraform or CloudFormation.

Seniority level
Mid-Senior level

Employment type
Contract

Job function
Information Technology