Intuit

Staff Technical Data Analyst

Category: Data | Location: Remote – Washington – United States | Job ID: 2025-72358

Company Overview

Intuit is the global financial technology platform that powers prosperity for the people and communities we serve. With approximately 100 million customers worldwide using products such as TurboTax, Credit Karma, QuickBooks, and Mailchimp, we believe that everyone should have the opportunity to prosper. We never stop working to find new, innovative ways to make that possible.

Job Overview

As a Staff Technical Data Analyst (T4) you operate at the functional-group level to turn go-to-market strategy into reliable, governed data products and automated pipelines that power sales productivity and decision-making. You lead discovery→design→delivery for enterprise-grade datasets, semantic layers, dashboards, and event-driven integrations across CRM/CPQ/LMS and adjacent 1P/3P systems. You are hands-on in Databricks (SQL/PySpark, Delta Lake, Unity Catalog), fluent in API querying and integration patterns, and use Workato (or equivalent) to automate data flows and user lifecycle management. You partner closely with RevOps, Sales, Data Engineering, and Data Science to deliver measurable outcomes with clear ownership, observability, and compliance (e.g., 7216, RBAC, retention). You connect data, tooling, and behavior change, building telemetry and insights that improve seller effectiveness, tool utilization, program adoption, and forecast quality, and you operationalize those insights back into workflows (e.g., coaching loops, next-best-action, play enforcement).

Responsibilities

Strategic Thinking and Analytics Solution Design

  • Translate Sales Enablement strategy into a data strategy: KPI frameworks (ramp, attainment, pipeline health, conversion, velocity, forecast accuracy) and measurement plans for programs and experiments.
  • Define an enablement metrics taxonomy: time-to-first-meeting/opportunity, onboarding completion and proficiency, content utilization, play adoption, call outcomes, pipeline hygiene, coaching coverage.
  • Lead intake, scoping, and high-level product data design for sales datasets and features; write crisp requirements and decision records; plan phased delivery that de-risks adoption.
  • Synthesize findings into digestible dashboards and actionable recommendations for sales and leadership, translating complex data into clear insights that drive informed decision-making and enablement program adjustments.

Execution: Technical Proficiency to Deliver Business/Product Outcomes

  • Build Databricks ELT/ETL pipelines (SQL/PySpark) using Delta/medallion patterns and Unity Catalog; implement orchestration, cost awareness, and data-quality monitors (freshness, completeness, validity); a minimal pipeline sketch follows this list.
  • Create governed semantic layers and gold datasets for Tableau, Salesforce CRM Analytics, and Amazon QuickSight; establish dashboard standards and self-serve patterns (exec scorecards, manager coaching packs, SDR/AE ramp dashboards).
  • Query and integrate APIs (REST/GraphQL): handle OAuth2/JWT, pagination, rate limits, retries, idempotency, and bulk APIs/webhooks; codify reusable connectors (see the pagination sketch after this list).
  • Design Workato (or equivalent) automations for cross-app workflows and user lifecycle management (provisioning, role/permission changes, deprovisioning, license hygiene) with observability, alerts, and audit trails.
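A minimal sketch of the kind of Databricks pipeline step described above, assuming a hypothetical Unity Catalog namespace (sales_enablement), hypothetical bronze/silver table and column names, and an arbitrary completeness threshold; it illustrates the medallion pattern with a basic data-quality gate, not Intuit's actual pipeline.

```python
# Bronze -> silver Delta step with a simple completeness check.
# All table, column, and threshold values are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

BRONZE = "sales_enablement.bronze.crm_opportunities_raw"  # assumed bronze table
SILVER = "sales_enablement.silver.crm_opportunities"      # assumed silver table

# Conform raw CRM exports: deduplicate on the CRM id and standardize key fields.
silver = (
    spark.read.table(BRONZE)
    .dropDuplicates(["opportunity_id"])
    .withColumn("stage", F.lower(F.trim(F.col("stage"))))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("updated_at").isNotNull())
)

# Basic data-quality monitor before publishing: block the write if nothing
# landed or if too many rows are missing an owner.
total = silver.count()
missing_owner = silver.filter(F.col("owner_id").isNull()).count()
if total == 0 or missing_owner / total > 0.02:
    raise ValueError(f"Quality gate failed: {missing_owner}/{total} rows missing owner_id")

(
    silver.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable(SILVER)
)
```

In practice a step like this would run as a Databricks Job/Workflow task, with thresholds and freshness SLAs defined per dataset.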
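The API-integration bullet names a common extraction pattern: token auth, cursor pagination, and retries with backoff on rate limits. The sketch below illustrates that pattern in Python with the requests library; the endpoint, query parameters, environment variable, and response shape are assumptions for illustration, not any specific vendor's API.

```python
# Paginated REST extraction with bearer-token auth and exponential backoff.
# Endpoint, parameters, and response fields are hypothetical.
import os
import time

import requests

BASE_URL = "https://api.example.com/v1/activities"   # placeholder endpoint
TOKEN = os.environ.get("API_TOKEN", "")               # assumed OAuth2 access token

def fetch_all(page_size: int = 200, max_retries: int = 5) -> list[dict]:
    headers = {"Authorization": f"Bearer {TOKEN}"}
    records, cursor = [], None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["cursor"] = cursor
        # Retry transient failures and rate limits with exponential backoff.
        for attempt in range(max_retries):
            resp = requests.get(BASE_URL, headers=headers, params=params, timeout=30)
            if resp.status_code in (429, 500, 502, 503):
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()
            break
        else:
            raise RuntimeError("Exhausted retries while paging the API")
        payload = resp.json()
        records.extend(payload.get("data", []))
        cursor = payload.get("next_cursor")
        if not cursor:  # no cursor means last page
            return records
```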

Data Asset Management

  • Drive usability and adoption of standardized data products through documentation, data dictionaries, enablement of field leaders, and office hours; champion paved paths and inner-source.
  • Enforce governance (RBAC, PII handling, 7216, retention) and end-to-end lineage; define SLAs/SLOs and create runbooks for incident response and recovery.
  • Prepare and maintain data for AI/ML pipelines (feature stores, batch/stream ingestion, backfills); enable model monitoring hooks and feedback capture.
  • Maintain seller 360 datasets (territory, capacity, activity, pipeline, attainment) for downstream analytics.

Example Initiatives You Might Lead

  • Build a Sales Data Platform: standardized lead→opportunity→order gold tables with lineage, SLAs, telemetry, and adoption instrumentation; power Tableau/CRM Analytics/QuickSight.
  • Deliver seller-effectiveness features (activity, content engagement, coverage/capacity) in Databricks and surface next-best-action in CRM with measurable lift; publish manager coaching packs.
  • Stand up RPA/Workato automations across 1P and 3P platforms.
  • Create a Pipeline Hygiene Score and Enablement Impact Model that attribute uplift to content, plays, and training; drive quarterly roadmap and investment decisions (an illustrative scoring sketch follows this list).
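As one possible shape for the Pipeline Hygiene Score initiative above, the sketch below scores open opportunities on a few common hygiene signals (recent activity, a future close date, a recorded next step) and rolls the score up per owner for coaching. The signals, weights, table names, and columns are illustrative assumptions, not a prescribed model.

```python
# Illustrative hygiene scoring over a hypothetical silver opportunities table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
opps = spark.read.table("sales_enablement.silver.crm_opportunities")  # assumed table

scored = opps.filter(F.col("is_closed") == False).withColumn(  # only score open pipeline
    "hygiene_score",
    F.when(F.datediff(F.current_date(), F.col("last_activity_date")) <= 14, 40).otherwise(0)
    + F.when(F.col("close_date") >= F.current_date(), 30).otherwise(0)
    + F.when(F.col("next_step").isNotNull(), 30).otherwise(0),
)

# Per-owner rollup so managers can spot coaching targets in a dashboard.
(
    scored.groupBy("owner_id")
    .agg(F.avg("hygiene_score").alias("avg_hygiene_score"))
    .write.format("delta")
    .mode("overwrite")
    .saveAsTable("sales_enablement.gold.pipeline_hygiene_by_owner")  # assumed gold table
)
```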

Qualifications

  • 6+ years in analytics/data roles (TDA, BDA, Analytics Engineering, or equivalent) with program-level delivery in GTM domains.
  • Advanced SQL and practical PySpark; hands-on Databricks (Delta Lake, Unity Catalog, Jobs/Workflows) and data-quality frameworks.
  • Proven API integration experience (REST/GraphQL, OAuth2/JWT, retries/backoff, idempotency, schema evolution) and Workato (recipes, callable patterns, promotion, RBAC, secrets).
  • Dashboarding at scale in Tableau, Salesforce CRM Analytics, and Amazon QuickSight; semantic-layer design and governance including adoption instrumentation.
  • Excellent stakeholder facilitation and storytelling; able to align RevOps/Sales/Finance/Legal/IT/DE/DS and drive tool adoption.

Preferred Qualifications

  • Databricks Delta Live Tables or streaming; MLflow or feature store experience; cost optimization for pipelines.
  • Quote-to-cash and entitlement/role modeling knowledge; territory/account hierarchy design.
  • Experience instrumenting enablement platforms and sales engagement tools for utilization and proficiency scoring.
  • Prior ownership of cross-BU data standards and paved-path contributions.

We use technology for good to help small businesses and consumers.

Ercan Kaynakca, Staff Data Crypto Analyst