Twisto
As a Senior Database Engineer & Architect (SQL), you will play a key role in designing, optimizing, and scaling our database infrastructure. You will collaborate closely with engineers, data analysts, and product teams to improve the current database architecture and build new high-performance, scalable, and secure databases tailored to our business needs.
Responsibilities:
Design, develop, and optimize highly scalable database architectures with a focus on PostgreSQL.
Lead the migration, optimization, and restructuring of existing databases to ensure efficiency and scalability.
Implement and enforce database best practices, including indexing, partitioning, query optimization, and normalization.
Define and execute high-availability and disaster recovery strategies.
Develop and maintain ETL pipelines and data processing workflows.
Work closely with engineering teams to ensure efficient database schema design and support the development of new features.
Monitor and troubleshoot database performance, identifying and resolving bottlenecks.
Ensure data security and compliance with industry standards, implementing encryption, access control, and auditing policies.
Optimize stored procedures, triggers, and functions for maximum efficiency.
Technology Stack:
Database: PostgreSQL (primary), AWS RDS, SQL.
Big Data & ETL: Apache Kafka, Airflow, dbt, Snowflake (nice to have).
Performance Optimization: Indexing, Partitioning, Query Optimization, Caching.
Security: Data Encryption, Role-Based Access Control (RBAC), Auditing.
Tools & DevOps: Liquibase, Flyway, Terraform, Kubernetes (for DB management).
Scripting & Programming: SQL, PL/pgSQL, Python, Bash.
Methodologies: Agile, Scrum.
What we can offer:
Meaningful work & impact - Your solutions will shape Param Group’s AI strategy and directly influence our products and internal processes.
Flexibility - Fully remote; an office is available if you’re near one of our locations.
Learning culture - Work alongside fintech experts, experiment with new tech, and keep growing.
Modern tech stack - AWS, PostgreSQL, Kafka, Airflow, dbt, Snowflake, and more.
Requirements:
5+ years of experience as a Database Developer or Database Architect, focusing on PostgreSQL.
Proven expertise in designing, optimizing, and managing large-scale, high-performance databases.
Strong knowledge of AWS database services (RDS, Aurora, EC2-based PostgreSQL instances) and on-premise PostgreSQL deployments.
Experience implementing HA (High Availability) & replication strategies (e.g., Patroni, PgBouncer, Streaming Replication).
Deep understanding of query optimization, partitioning, indexing, caching, and concurrency control.
Proficiency in backup/recovery strategies and disaster recovery planning.
Hands-on experience with CI/CD pipelines and Infrastructure as Code tools like Terraform, Ansible, or AWS CloudFormation.
Experience in database security, ETL pipelines, and data warehousing.
Proficiency in SQL, PL/pgSQL, and scripting languages like Python or Bash for automation.
Familiarity with monitoring and alerting tools (AWS CloudWatch, Prometheus, Grafana, Datadog).
Excellent English communication skills, with the ability to collaborate effectively with both technical and non-technical teams.