iQuery
Overview
Job Summary:
Develops, maintains, and supports policies and procedures for ensuring the functionality and accuracy of the programs used to spider labor data from external sites. Relies on direction and good judgment to plan and accomplish goals. Works under general supervision. Candidates must be willing to work onsite at iQuery's headquarters in Palm Harbor, Florida.

Responsibilities
- Supports and documents new development of spidering robot programs (bots) to quickly and accurately extract data from external websites
- Ensures all bots meet full QA standards
- Ensures assigned bots are fully functional and resolves problems when they are not
- Meets performance metrics as expected, with goals on a weekly and monthly basis
- As experience grows, develops an understanding of the bot library, spidering software, and database processing of extracted data
- Knows which situations require the implementation and use of a proxy
- Able to maintain an average of 70–175 bot repairs per week with minimal QA issues (depends on experience and the mix of large/medium/small bots)
- May be required to perform manual bot QA with reasonable speed and accuracy

Requirements
Work Experience / Knowledge:
- Understanding of computer programming methodologies
- Superior analytical skills
- Able to quickly learn and adapt to new software
- Basic knowledge of HTML, web technologies, and programming languages
- Basic SQL knowledge
- Familiar with Design Studio, Microsoft SQL Server, and Microsoft Team Foundation Server
- Basic understanding of database tables directly related to Kapow software and all tables involving storage of spidered jobs
- Demonstrates good comprehension of the data flow from the spidering applications through the databases to the production environment

Qualifications / Certifications
Two-year degree in computer science or equivalent work experience

Equal Opportunity Employer. M/F/D/V