test: backfill coverage for genie connector, service context, stream registry#327
Merged
Conversation
…registry

Pure test additions for existing code — no production behavior changes.

Coverage deltas:
- genie connector: 61% → ~97%
- service-context: 7% → 100%
- stream-registry: 32% → 100%

Carved out of #256 to land independently. Tests exercise existing code paths that were under-covered, surfaced while reworking the analytics format model.

Co-authored-by: Isaac
This was referenced Apr 29, 2026
ServiceContext.initialize's catch branch uses `e instanceof ConfigError`, imported from @databricks/sdk-experimental. The vi.mock only exported WorkspaceClient, so the import resolved to undefined and vitest's strict mock validation threw `No "ConfigError" export is defined` whenever the catch path ran — failing 6 tests in service-context.test.ts after merging main.

Add a minimal ConfigError class with the `baseMessage` field that ConfigurationError.databricksAuthenticationSetupFailed reads, so the instanceof check evaluates cleanly without changing production behavior.

Signed-off-by: James Broadhead <jamesbroadhead@gmail.com>
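The underlying hazard can be sketched in plain TypeScript. `MockConfigError` is a stand-in for the real SDK class; only its `baseMessage` field is assumed here, and the exact failure in vitest is its mock proxy rejecting the missing export rather than the raw `instanceof` TypeError shown below.

```typescript
// A minimal ConfigError stand-in with the one field the error handler reads.
class MockConfigError extends Error {
  constructor(public baseMessage: string) {
    super(baseMessage);
    this.name = "ConfigError";
  }
}

// Before the fix: with only WorkspaceClient mocked, the ConfigError import
// resolves to undefined — `instanceof undefined` throws at runtime.
const missingExport: unknown = undefined;
let instanceofBlewUp = false;
try {
  void (new Error("boom") instanceof (missingExport as new () => unknown));
} catch {
  instanceofBlewUp = true; // TypeError: right-hand side is not an object
}

// After the fix: the mock exports the class, so the check evaluates cleanly.
const err: unknown = new MockConfigError("cannot configure default auth");
const isConfigError = err instanceof MockConfigError && err.baseMessage.length > 0;

console.log(instanceofBlewUp, isConfigError); // true true
```

In the real test file the class is exported from the `vi.mock("@databricks/sdk-experimental", …)` factory alongside WorkspaceClient, which is what satisfies vitest's strict mock validation.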
vi.mock is hoisted to the top of the file, so MockConfigError defined as a top-level class declaration was not yet initialised when the mock factory ran (ReferenceError: Cannot access 'MockConfigError' before initialization). Move the class into the existing vi.hoisted block so it is constructed before vi.mock executes.

Signed-off-by: James Broadhead <jamesbroadhead@gmail.com>
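The hoisting hazard can be reproduced without vitest: a function that runs before a top-level `class` statement executes hits the temporal dead zone, which is exactly what a hoisted vi.mock factory does. `MockThing` and its `label` field are illustrative names.

```typescript
// Mirrors a vi.mock factory constructing the mock class during hoisting.
function mockFactory(): { label: string } {
  return new MockThing();
}

let tdzError = "";
try {
  mockFactory(); // MockThing's declaration below has not run yet
} catch (e) {
  tdzError = (e as Error).name; // ReferenceError (temporal dead zone)
}

class MockThing {
  label = "mock";
}

// Once the class statement has executed, the same call works — which is why
// moving the class into vi.hoisted fixes the real test file, roughly:
//   const { MockConfigError } = vi.hoisted(() => ({
//     MockConfigError: class extends Error { baseMessage = ""; },
//   }));
//   vi.mock("@databricks/sdk-experimental", () => ({
//     WorkspaceClient: vi.fn(),
//     ConfigError: MockConfigError,
//   }));
const afterInit = mockFactory().label;
console.log(tdzError, afterInit); // "ReferenceError" "mock"
```

vi.hoisted callbacks are lifted above vi.mock calls, so values created there are initialised by the time any mock factory runs.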
…paceId

ServiceContext.initialize() eagerly resolves workspaceId and warehouseId inside Promise.all, so any failure surfaces as a rejection of initialize() itself — the state object is never produced. Five "should throw …" tests awaited initialize() and then asserted on state.workspaceId/state.warehouseId, which meant the rejection escaped to the outer test as an unhandled error before the rejects.toThrow assertion ran.

Move the assertion onto the initialize() call. No production change.

Signed-off-by: James Broadhead <jamesbroadhead@gmail.com>
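The shape of the fix can be sketched with a stand-in `initialize()` (the real one resolves workspace and warehouse IDs; the error message here is invented):

```typescript
// Stand-in for ServiceContext.initialize(): eager Promise.all resolution
// means any lookup failure rejects initialize() itself.
async function initialize(): Promise<{ workspaceId: string }> {
  await Promise.all([Promise.reject(new Error("workspace lookup failed"))]);
  return { workspaceId: "w123" }; // never reached on failure
}

// Wrong shape (what the five tests did): awaiting first means the rejection
// escapes before any assertion on the state object can run.
async function wrongShape(): Promise<string> {
  try {
    const state = await initialize();
    return state.workspaceId; // the assertion target is never produced
  } catch (e) {
    return `rejection escaped: ${(e as Error).message}`;
  }
}

// Right shape: assert on the initialize() call itself, as
// `await expect(initialize()).rejects.toThrow(...)` does in vitest.
async function rightShape(): Promise<boolean> {
  return initialize().then(
    () => false,
    (e: unknown) => e instanceof Error && e.message === "workspace lookup failed",
  );
}
```

Attaching the assertion to the promise means the rejection is consumed by the matcher instead of surfacing as an unhandled error in the test body.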
calvarjorge
approved these changes
May 11, 2026
jamesbroadhead
added a commit
that referenced
this pull request
May 11, 2026
Renames the client-side analytics format model from "JSON"/"ARROW" to
"JSON_ARRAY"/"ARROW_STREAM" to match the Statement Execution API enum
verbatim — no more local-name to API-name translation.
Pure mechanical rename. No behavior change. Internal type values only;
the lowercase user-facing values passed to useChartData ("json", "arrow",
"auto") are unchanged.
Carved out of #256 (#327 is layer 1, this is layer 2). The actual
inline-Arrow-IPC + warehouse-fallback fix sits on top of this in layer 3.
Note: this is a breaking change for any direct consumer of
useAnalyticsQuery passing explicit format: "JSON" or "ARROW" — they will
need to update to "JSON_ARRAY" / "ARROW_STREAM". Consumers using
useChartData (lowercase "json"/"arrow"/"auto") are unaffected.
Co-authored-by: Isaac
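The rename described above amounts to changing the string-literal union to the API enum names, roughly as follows (type and constant names here are illustrative, not the actual source):

```typescript
// Before the rename (sketch): type AnalyticsFormat = "JSON" | "ARROW";
// After — matching the Statement Execution API enum verbatim:
type AnalyticsFormat = "JSON_ARRAY" | "ARROW_STREAM";

// User-facing values accepted by useChartData stay lowercase and unchanged:
type ChartDataFormat = "json" | "arrow" | "auto";

// Hypothetical lookup from user-facing value to internal type value;
// with API-enum names internally, no further local-to-API translation is needed.
const toApiFormat: Record<Exclude<ChartDataFormat, "auto">, AnalyticsFormat> = {
  json: "JSON_ARRAY",
  arrow: "ARROW_STREAM",
};

console.log(toApiFormat.json, toApiFormat.arrow); // "JSON_ARRAY" "ARROW_STREAM"
```

Callers of useAnalyticsQuery passing an explicit `format: "JSON"` or `"ARROW"` now fail type-checking until updated to the new names, which is the breaking change noted above.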
jamesbroadhead
added a commit
that referenced
this pull request
May 11, 2026
Serverless warehouses return ARROW_STREAM + INLINE results as base64 Arrow IPC in result.attachment rather than result.data_array. The previous code path discarded inline data for any ARROW_STREAM response (designed for EXTERNAL_LINKS), so these warehouses silently returned empty results.

This commit makes the analytics plugin work across classic and serverless warehouses by handling both dispositions for ARROW_STREAM, decoding inline Arrow IPC attachments server-side, and falling back to JSON_ARRAY when a warehouse rejects ARROW_STREAM + INLINE.

Changes:
- Inline Arrow IPC decoding (new arrow-schema.ts) via apache-arrow's tableFromIPC, producing the same row-object shape as JSON_ARRAY regardless of warehouse backend. apache-arrow@21.1.0 added as a server dep.
- Format fallback: ARROW_STREAM + INLINE requests automatically fall back to JSON_ARRAY if a classic warehouse rejects them. Explicit format requests are respected without fallback.
- Zod-validated SSE wire protocol for /api/analytics/query (shared schema between server and client; malformed payloads surface a clear error instead of silent undefined).
- Default remains JSON_ARRAY for compatibility.

Stack: layer 3 of 3 carved from #256.
- #327 — coverage backfill (layer 1)
- #328 — AnalyticsFormat rename to API enum names (layer 2)
- (this PR) — the actual fix

Fixes #242

Co-authored-by: Isaac
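The fallback rule — auto-selected ARROW_STREAM retries as JSON_ARRAY, an explicit request propagates the error — can be sketched with a fake executor (function names and the error text are illustrative; the real code calls the Statement Execution API):

```typescript
type Format = "ARROW_STREAM" | "JSON_ARRAY";

// Hypothetical executor: simulates a classic warehouse rejecting
// ARROW_STREAM with INLINE disposition.
async function execute(format: Format, classicWarehouse: boolean): Promise<string> {
  if (format === "ARROW_STREAM" && classicWarehouse) {
    throw new Error("ARROW_STREAM with INLINE disposition not supported");
  }
  return `rows via ${format}`;
}

// Fallback sketch: when no format was requested explicitly, a rejected
// ARROW_STREAM attempt is retried as JSON_ARRAY; an explicit format is
// respected and the warehouse error surfaces to the caller.
async function query(opts: { format?: Format; classicWarehouse: boolean }): Promise<string> {
  const format = opts.format ?? "ARROW_STREAM";
  try {
    return await execute(format, opts.classicWarehouse);
  } catch (e) {
    if (opts.format === undefined && format === "ARROW_STREAM") {
      return await execute("JSON_ARRAY", opts.classicWarehouse); // silent fallback
    }
    throw e; // explicit request: no fallback
  }
}
```

Under this shape, serverless warehouses get inline Arrow IPC while classic warehouses transparently degrade to JSON_ARRAY, and only explicit format requests can observe the rejection.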
Summary
Pure test additions for existing code — no production behavior changes.
This is layer 1 of 3 carved out of #256 to land independently. The full feature stack:
- #327 (this PR) — coverage backfill (layer 1)
- #328 — AnalyticsFormat rename to "JSON_ARRAY"/"ARROW_STREAM" to match the Statement Execution API enum (layer 2)
- inline-Arrow-IPC + warehouse-fallback fix (layer 3)

Coverage deltas:

- connectors/genie — 61% → ~97%
- context/service-context — 7% → 100%
- stream/stream-registry — 32% → 100%

These coverage gaps were surfaced while reworking the analytics format model in #256. They aren't tied to that fix and shouldn't gate it.
Test plan
This pull request was AI-assisted by Isaac.