RIT Solutions, Inc.
Top 3 Technical Skills:
Data Pipeline Development: Build and maintain data pipelines using Python and Airflow.
Cloud & Workflow Tools: Work with Google Cloud (GCP), including Pub/Sub, Dataflow, and integration services.
Scripting & Automation: Use Bash for data handling and automation tasks.
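To make the pipeline-development skill concrete, here is a minimal sketch of an Airflow DAG that chains a Bash extract step into a Python transform. The DAG name, schedule, and task logic are placeholders for illustration only, not details of the actual role.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def transform(**context):
    # Placeholder transform step; a real pipeline would read from GCS or Pub/Sub.
    print("transforming batch for", context["ds"])


with DAG(
    dag_id="example_daily_pipeline",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pulling raw files'",  # stand-in for a real extract script
    )
    load = PythonOperator(task_id="transform_and_load", python_callable=transform)

    extract >> load
```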
About The Role
We're building a next-generation data analytics platform on Google Cloud Platform to power in-app workflows and analytics for our users. Our stack includes Python microservices, Airflow for pipeline orchestration, and a React/Next.js frontend. You'll join a small, cross-functional team responsible for end-to-end service development, deployment, and operational excellence.
What You'll Do
Design, implement, and maintain backend services and APIs in Python
Build and optimize data pipelines using Apache Airflow
Collaborate with product and frontend teams to define clear service contracts
Develop infrastructure-as-code for GCP resources (Pub/Sub, Cloud Functions, BigQuery, Cloud Storage)
Ensure reliability: write tests, set up monitoring/alerting, troubleshoot production issues
Participate in code reviews, mentor junior engineers, and help evolve our best practices
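As one concrete illustration of the GCP integration work listed above, here is a minimal sketch of publishing an event to Pub/Sub from a Python service using the google-cloud-pubsub client. The project ID, topic name, and attributes are invented for the example.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "my-project" and "analytics-events" are placeholder names.
topic_path = publisher.topic_path("my-project", "analytics-events")

event = {"user_id": "123", "action": "report_generated"}
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    origin="backend-service",  # attributes are arbitrary key/value strings
)
print("published message", future.result())  # result() returns the message ID
```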
What We're Looking For
4+ years of professional Python development experience
Hands-on experience with Apache Airflow (authoring DAGs, operators, scheduling)
Strong working knowledge of Google Cloud Platform services (Compute Engine, Cloud Functions, BigQuery, Pub/Sub, IAM)
Experience containerizing applications (Docker) and deploying with CI/CD (GitHub Actions, Cloud Build, etc.)
Solid understanding of SQL and relational databases; bonus for NoSQL (Firestore/Datastore)
Familiarity with RESTful API design
Commitment to code quality: automated tests, linting, type checking
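To give a flavor of the code-quality bar described above (type annotations plus automated tests), here is a small illustrative example; the function and its test are hypothetical and not taken from any existing codebase.

```python
# dedupe.py
from typing import Iterable


def dedupe_ids(ids: Iterable[str]) -> list[str]:
    """Return the IDs in first-seen order with duplicates removed."""
    seen: set[str] = set()
    ordered: list[str] = []
    for item in ids:
        if item not in seen:
            seen.add(item)
            ordered.append(item)
    return ordered


# test_dedupe.py (run with pytest)
def test_dedupe_ids_preserves_order() -> None:
    assert dedupe_ids(["a", "b", "a", "c"]) == ["a", "b", "c"]
```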
Nice-to-Haves
Experience with Terraform or other IaC tools
Knowledge of Kubernetes and serverless architectures
Background in event-driven or streaming data systems (Dataflow, Kafka)
Exposure to security best practices in cloud environments
Experience performing statistical analysis and data modeling (e.g., using NumPy, pandas, SciPy)
Familiarity with machine learning frameworks and workflows (e.g., scikit-learn, TensorFlow, PyTorch)
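For the analysis-oriented nice-to-haves, here is a sketch of the kind of pandas/scikit-learn workflow that might come up; the CSV file and column names are made up for illustration.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Hypothetical export of pipeline run metrics.
df = pd.read_csv("pipeline_runs.csv")

X = df[["rows_processed", "worker_count"]]
y = df["runtime_seconds"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out runs:", model.score(X_test, y_test))
```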