aKube
Senior Data Analyst – Reporting & Automation – 1523
New York, United States | Posted on 11/03/2025
City: NYC / Santa Monica, CA
Onsite / Hybrid / Remote: Hybrid, 4 days onsite
Duration: 6 months with strong potential for extension/conversion (up to 24 months)
Rate Range: Up to $67.75/hr on W2, depending on experience (no C2C, 1099, or subcontract)
Work Authorization: GC, USC, and all valid EADs except OPT, CPT, and H1B
Must Have:
Advanced SQL for analytical ETL/pipeline development (Snowflake or similar MPP).
Data quality engineering in pipelines (inline checks, alerting, exception handling); a brief sketch follows this list.
Dashboarding/BI (Looker and/or Tableau) with strong data modeling fundamentals.
Version control and code review workflows (Git/GitHub).
Excel workflow automation (macros, formula-driven processes refactored to code).
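As a rough illustration of the inline data-quality checks listed above, the Python sketch below gates a report job on freshness and completeness before anything is published. The table, column names, and thresholds are hypothetical, and cursor stands for any DB-API-style cursor against Snowflake or a similar warehouse.

from datetime import datetime, timedelta, timezone

FRESHNESS_SLA = timedelta(hours=6)   # assumed SLA, for illustration only
MIN_ROW_COUNT = 10_000               # assumed completeness threshold

def gate_report_input(cursor, table="analytics.subscriber_daily"):
    # table and its columns are hypothetical; cursor is any DB-API cursor.
    cursor.execute(f"SELECT MAX(loaded_at), COUNT(*) FROM {table}")
    loaded_at, row_count = cursor.fetchone()
    problems = []
    # Freshness check: assumes loaded_at comes back timezone-aware.
    if loaded_at is None or datetime.now(timezone.utc) - loaded_at > FRESHNESS_SLA:
        problems.append(f"stale load: last loaded_at = {loaded_at}")
    # Completeness check against a minimum expected row count.
    if row_count < MIN_ROW_COUNT:
        problems.append(f"incomplete load: only {row_count} rows")
    if problems:
        # Fail loudly so the scheduler alerts instead of sending bad numbers.
        raise RuntimeError("; ".join(problems))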
Responsibilities:
Build end-to-end reporting automation for business-critical initiatives (templated email sends, automated deck builds, scheduled jobs).
Translate intake requests into clear data requirements and SLAs; design resilient pipelines that minimize manual touchpoints.
Develop performant SQL and transformation logic against vetted analytical datasets; productionize jobs with parameterization and logging (a job skeleton is sketched after this list).
Embed inline data-quality tests and safeguards (freshness, completeness, threshold checks) to ensure executive-grade accuracy.
Create maintainable dashboards/visuals for operational through executive audiences; push beyond third-party tool limits when needed.
Partner cross-functionally with upstream data/software engineering and domain analysts to align schemas, definitions, and cadence.
Document solutions (runbooks, configs, data contracts) and contribute to reusable internal tooling/components for reporting enablement.
Proactively identify opportunities to de-risk manual workflows and drive standardization across domains (e.g., Partnerships, Subscriber Planning).
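A plausible skeleton for the parameterized, logged jobs mentioned above: a small command-line entry point that a scheduler can call with a run date and recipient list. The job body is a placeholder, and names such as reporting_job and build_report are made up for illustration.

import argparse
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(name)s %(message)s")
log = logging.getLogger("reporting_job")   # hypothetical job name

def build_report(run_date: str, recipients: list[str]) -> None:
    # Placeholder for the real work: query vetted datasets, render the deck or
    # email template, and hand the artifact to whatever delivery service is in use.
    log.info("building report for %s", run_date)
    log.info("would deliver to: %s", ", ".join(recipients))

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Parameterized, logged reporting job")
    parser.add_argument("--run-date", required=True, help="e.g. 2025-11-03")
    parser.add_argument("--recipients", nargs="+", default=["ops@example.com"])
    args = parser.parse_args()
    build_report(args.run_date, args.recipients)

Cron, Airflow, or a similar scheduler would invoke this per run with different parameters, and the log lines provide an audit trail for executive-facing numbers.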
Qualifications:
Bachelor’s degree in a STEM/analytical field.
5+ years in analytics or data product/enablement roles with measurable automation impact.
3+ years hands-on SQL in an analytical ETL environment (Snowflake/BigQuery/Redshift).
Proven track record converting manual Excel processes into robust, scheduled pipelines (a small refactor example follows this list).
BI experience (Looker/Tableau); strong ability to convey technical/analytical concepts to diverse audiences.
Familiarity with Git/GitHub and collaborative dev practices.
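As a sketch of the Excel-to-pipeline refactor mentioned in the qualifications, the snippet below reproduces a simple formula-driven calculation in pandas so it can run on a schedule instead of by hand. The workbook, sheet, and column names are hypothetical.

from pathlib import Path
import pandas as pd

# Hypothetical workbook standing in for a manual, formula-driven file.
src = Path("monthly_inputs.xlsx")
df = pd.read_excel(src, sheet_name="raw")

# Re-express the spreadsheet's formula logic in code.
df["net_adds"] = df["gross_adds"] - df["churn"]
summary = df.groupby("region", as_index=False)["net_adds"].sum()

# Write an output a downstream dashboard or email job can pick up.
summary.to_csv("net_adds_by_region.csv", index=False)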
Preferred:
Advanced custom visualization frameworks (D3.js, Streamlit) to extend interactivity beyond standard BI tools.
Experience prototyping “homegrown” reporting tools/products from scratch.
Workflow orchestration familiarity (e.g., Airflow or equivalent).
Applied statistics exposure (hypothesis testing, regressions) for KPI validation and QA.
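For the applied-statistics item directly above, a minimal KPI-validation sketch using SciPy's Welch t-test; the sample values and the 0.05 threshold are assumptions used only to show the shape of the check.

from scipy import stats

# Hypothetical KPI samples, e.g. a daily metric before and after a pipeline change.
before = [102.1, 99.8, 101.5, 100.2, 98.9, 101.0]
after = [104.3, 103.8, 105.1, 102.9, 104.7, 103.5]

t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)  # Welch's t-test
if p_value < 0.05:  # assumed significance threshold
    print(f"KPI shift looks significant (t={t_stat:.2f}, p={p_value:.3f}); review before publishing.")
else:
    print(f"No significant shift detected (p={p_value:.3f}).")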