KY Staffing

Software Engineer Senior

KY Staffing, Cincinnati, Ohio, United States, 45208

Worldpay, LLC seeks a Software Engineer Senior in Cincinnati, OH to develop data pipelines using SQL and Python and create new APIs to support scalable applications that handle large, complex data volumes. Coordinate with business teams to gather requirements and refine the data models that power customer-facing BI applications, improving data accessibility and enabling customers and internal users to make data-driven decisions. Develop scalable BI applications using tools such as WebFOCUS, Tableau, and Power BI. Design and implement data quality scripts to ensure production data remains accurate and available for the key stakeholders and business processes that depend on it. Create test cases and scenarios for validating data; document the processes and data model designs and make them available on shared company portals. Support production issues in existing products by performing the data analysis needed to troubleshoot both data-related and front-end issues. Use Snowflake to work effectively with structured, semi-structured, and unstructured data. Fine-tune SQL scripts and improve performance using advanced techniques such as data clustering and materialized views.
Requirements: Bachelor's degree or foreign equivalent in Computer and Information Systems, Computer Engineering, or a related field and five (5) years of progressively responsible experience in the job offered or a related occupation: utilizing SQL and Python to build data pipelines and ETL scripts that transform and organize data; analyzing data from various perspectives to help design the architecture needed to support end applications; creating scalable BI applications using tools including WebFOCUS, Tableau, and Power BI; performing bulk loads of structured, semi-structured, and unstructured data using Snowflake; creating, scheduling, and monitoring workflows using Apache Airflow; and testing and validating PySpark code converted to Snowflake scripts. In the alternative, the employer will accept a Master's degree in the above-listed fields and three (3) years of experience in the above-listed skills. Telecommuting and/or working from home may be permissible pursuant to company policies.