Introducing Patterns

February 15, 2024

A digital clone of your data team, Patterns amplifies their impact.

Patterns is a data analytics AI encoded with your organization's collective data knowledge that interfaces with you like a coworker. Most companies lack broad data literacy, so only a few all-star analysts can check the work of others, and those analysts are always underwater. Patterns simulates your best data analysts so they can be everywhere at once, virtually unblocking everyday users of data on their quest to turn data into knowledge.

Patterns' interface looks more like a coworker than a BI tool: make a request, get a response. We think the coworker interface, like copying someone on an email or assigning them a ticket in Jira, is the right way to interact with AI and will become commonplace. Instead of searching a trove of dashboards for the right one, simply make a request wherever it's most convenient and either get a link to an existing dashboard or have a new one generated just for you. This is the future of data access for enterprises.

While AI can do a lot, it has well-known weaknesses such as hallucination and poor explainability. That's why we designed Patterns not as a simple text-to-SQL tool, but as a virtual representation of your best data analysts that amplifies their intuition and credibility. We do this by training the bot on real, verified queries, citing its sources, and keeping a human in the loop whenever the bot isn't confident.
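
As a rough illustration of the human-in-the-loop gate, here is a minimal sketch; the confidence threshold, function names, and review-queue mechanics are assumptions for illustration, not Patterns' actual implementation.

    # Answer directly when confident, otherwise route to a human analyst.
    # The 0.8 threshold and the queue are illustrative choices.
    CONFIDENCE_THRESHOLD = 0.8

    def respond(question: str, draft_answer: str, confidence: float,
                review_queue: list) -> str:
        if confidence >= CONFIDENCE_THRESHOLD:
            return draft_answer
        # Not confident enough: hold the draft and ask an expert to verify it.
        review_queue.append({"question": question, "draft": draft_answer})
        return "I'm not confident yet; I've asked a data analyst to verify this."

    queue = []
    print(respond("What was Q4 churn?", "Q4 churn was 3.2%.", 0.55, queue))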

Patterns’ Capabilities

Patterns is our first release and is focused on analytics question answering; future versions will be more capable, with features such as data modeling and proactive insights. We think about Patterns' capabilities like a job description for a data analyst:

  • Knows your business and data and can answer questions about them.

  • Has expert SQL knowledge and can query and manipulate data.

  • Generates visualizations of data in Vega-Lite (see the example spec after this list).

  • Generates reports, and interprets and synthesizes findings.

  • Learns continuously from past analytical exercises.

  • Asks for expert help when uncertain.

  • Creates business metrics and helps explore business questions.

  • Provides auditable and re-runnable analyses backed by code.
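
For instance, a request for a chart can come back as a Vega-Lite spec like the one below, shown here as a Python dict; the dataset and field names are made up for illustration.

    # A minimal Vega-Lite bar-chart spec of the kind the bot might emit for
    # "monthly revenue as a bar chart". Values are placeholders.
    spec = {
        "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
        "data": {"values": [
            {"month": "2023-11", "revenue": 128000},
            {"month": "2023-12", "revenue": 142500},
            {"month": "2024-01", "revenue": 151200},
        ]},
        "mark": "bar",
        "encoding": {
            "x": {"field": "month", "type": "ordinal"},
            "y": {"field": "revenue", "type": "quantitative"},
        },
    }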

Customizing Patterns

Patterns is a base model that is intended to be customized. Once you connect your database and select a set of tables, Patterns automatically queries your database (tables, query history, etc.) to build its own context and understanding of your business. You can also optionally add your own context to enhance its performance.
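
A rough sketch of that introspection step, assuming a Postgres warehouse and its standard information_schema views; this illustrates the idea rather than Patterns' actual code.

    import psycopg2  # assuming a Postgres warehouse for this sketch

    # Placeholder credentials; in practice this is the connection you configure.
    conn = psycopg2.connect("postgresql://user:pass@warehouse:5432/analytics")
    cur = conn.cursor()

    # Pull column metadata for the activated tables.
    cur.execute(
        """
        SELECT table_name, column_name, data_type
        FROM information_schema.columns
        WHERE table_schema = 'public'
        ORDER BY table_name, ordinal_position
        """
    )
    schema_context = {}
    for table, column, dtype in cur.fetchall():
        schema_context.setdefault(table, []).append(f"{column} {dtype}")

    # A few sample rows per table give the model a feel for real values.
    samples = {}
    for table in schema_context:
        cur.execute(f"SELECT * FROM {table} LIMIT 5")
        samples[table] = cur.fetchall()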

Customizing Patterns supports many use cases. Large enterprises, for example, need sandboxed bots for particular teams, such as a marketing or sales bot that only has access to the data that team requires and is a specialist in that team's metrics.
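
A hypothetical per-team sandbox configuration could look like the following; the structure, table names, and document paths are invented to show the idea, not Patterns' actual configuration format.

    # Each team's bot only sees its own tables and metric documentation.
    sandboxes = {
        "marketing": {
            "tables": ["campaigns", "ad_spend", "web_sessions"],
            "documents": ["docs/marketing_metrics.md"],
        },
        "sales": {
            "tables": ["opportunities", "accounts", "pipeline_snapshots"],
            "documents": ["docs/sales_definitions.md"],
        },
    }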

Example Domain Experts

We use Patterns' customization features to demonstrate how you can build domain-expert bots, tuned for performance in domains such as product analytics, venture capital data, and customer support analysis.

  • Product analytics with BotHog (link coming)

  • Venture data with CrunchBot (link coming)

  • Customer support data with CXBot (link coming)

Architecture

LLM (large language model)

We use OpenAI's GPT-4 as the language model for our platform. We can swap GPT-4 for any other language model, and in the future we will experiment with open-source models to power this part of the stack.
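
A minimal sketch of what that swappability can look like, assuming a thin one-method interface over the model provider; the class and function names are illustrative, not our internal API.

    from typing import Protocol

    from openai import OpenAI

    class LanguageModel(Protocol):
        def complete(self, prompt: str) -> str: ...

    class GPT4Model:
        """Wraps the OpenAI chat API behind a one-method interface."""

        def __init__(self, model: str = "gpt-4"):
            self.client = OpenAI()  # reads OPENAI_API_KEY from the environment
            self.model = model

        def complete(self, prompt: str) -> str:
            resp = self.client.chat.completions.create(
                model=self.model,
                messages=[{"role": "user", "content": prompt}],
            )
            return resp.choices[0].message.content

    def answer(prompt: str, llm: LanguageModel) -> str:
        # Any model exposing complete() can be dropped in here, including an
        # open-source model served behind the same interface.
        return llm.complete(prompt)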

Context Engine

Context is everything. The core intelligence layer of Patterns is implemented via RAG (retrieval-augmented generation). When a user asks a question, we retrieve the relevant information from across your data stack to generate an answer. Our context engine combines information from five sources (a sketch of how they come together follows this list):

  • System Prompt

  • Global Context

  • Documents

  • Tables

  • Analyses
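
A minimal sketch of how those five sources might be combined into a single prompt; the retrieval step is assumed to have already selected the relevant snippets, and the wording and example facts are illustrative.

    # The document, table, and analysis snippets below would come from a
    # similarity search over the knowledge base; they are passed in here
    # already selected.
    SYSTEM_PROMPT = "You are a careful data analyst. Cite the tables you use."
    GLOBAL_CONTEXT = "Fiscal year starts in February. Revenue figures are in USD."  # example facts

    def build_prompt(question: str, documents: list, tables: list,
                     analyses: list) -> str:
        """Combine the five context sources into one prompt for the LLM."""
        return "\n\n".join([
            SYSTEM_PROMPT,
            GLOBAL_CONTEXT,
            "Relevant documents:\n" + "\n".join(documents),
            "Relevant tables:\n" + "\n".join(tables),
            "Past analyses:\n" + "\n".join(analyses),
            "Question: " + question,
        ])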

Database

The customer's data warehouse; we currently support BigQuery, MySQL, Postgres, and Snowflake.
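
For illustration, connections to these warehouses are commonly expressed as SQLAlchemy-style URLs like the ones below; hosts and credentials are placeholders, the BigQuery, MySQL, and Snowflake URLs require their respective dialect packages, and Patterns' own connection setup may differ.

    from sqlalchemy import create_engine

    # Placeholder connection URLs for the four supported warehouses.
    engines = {
        "postgres": create_engine("postgresql://user:pass@host:5432/analytics"),
        "mysql": create_engine("mysql+pymysql://user:pass@host:3306/analytics"),
        "bigquery": create_engine("bigquery://my-project/my_dataset"),
        "snowflake": create_engine("snowflake://user:pass@account/analytics"),
    }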

How it works

Onboarding

  • Connect your database and activate tables; Patterns pulls your schemas and samples of your data, saving these results to its memory.

  • Optionally, upload your semantic layer, data catalog, or metrics library to add further information to the knowledge base (see the sketch after this list).
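
A minimal sketch of that upload step, assuming metric definitions are embedded into a vector knowledge base; the embedding model and in-memory store are illustrative choices, not Patterns' actual pipeline.

    from openai import OpenAI

    client = OpenAI()

    def ingest(entries: list, knowledge_base: list) -> None:
        """Embed each metric definition so it can be retrieved later."""
        for entry in entries:
            text = f"{entry['name']}: {entry['definition']}"
            emb = client.embeddings.create(
                model="text-embedding-3-small", input=text
            ).data[0].embedding
            knowledge_base.append({"text": text, "embedding": emb})

    metrics = [
        {"name": "ARR", "definition": "Sum of active subscription MRR times 12."},
        {"name": "Churn rate", "definition": "Customers lost in a month divided by customers at month start."},
    ]
    kb = []
    ingest(metrics, kb)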

Responding to requests

  • When asked a question, Patterns searches the knowledge base for related information and then generates a response.

  • Depending on the request, the LLM will generate SQL, charting code, or plain text to answer the user's query.

  • After writing SQL or chart code, the bot attempts to execute the query against the database and renders the results when the query completes (a sketch of this loop follows).
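
A minimal sketch of that generate-execute-render loop, including one retry when the query errors; sqlite3 stands in for the warehouse here, and the retry policy and function names are assumptions.

    import sqlite3  # stands in for the customer's warehouse in this sketch

    def answer_with_sql(question: str, context: str, llm, conn, max_attempts: int = 2):
        """Draft SQL from the retrieved context, run it, and retry once on failure."""
        prompt = f"{context}\n\nWrite a SQL query to answer: {question}"
        for _ in range(max_attempts):
            sql = llm.complete(prompt)  # e.g. any model behind the interface sketched above
            try:
                return sql, conn.execute(sql).fetchall()
            except sqlite3.Error as err:
                # Feed the error back so the model can correct itself next time.
                prompt += f"\n\nThe previous query failed:\n{sql}\nError: {err}"
        raise RuntimeError("Query still failing; escalate to a human analyst.")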

Creating analyses

  • After you save an analysis, the bot generates a summary of the interaction, including context from the chat, as if writing a report.
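
A minimal sketch of that summarization step; the prompt wording is illustrative, not Patterns' actual prompt.

    def summarize_analysis(chat_transcript: str, final_sql: str, llm) -> str:
        """Turn a saved chat into a short, report-style summary."""
        prompt = (
            "Summarize this analysis as a brief report: state the question, "
            "the approach, and the finding, and include the final SQL.\n\n"
            f"Transcript:\n{chat_transcript}\n\nFinal SQL:\n{final_sql}"
        )
        return llm.complete(prompt)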

Sharing analyses

  • Share via link, or export to PDF.

Next steps in the evolution

  • More interfaces - email, text, ticket, etc.

  • Deep-dive analysis - asynchronous analyses with Patterns autonomously working on a problem, self-correcting errors and improving analyses.

  • Data modeling - let Patterns make suggestions to improve your underlying data model.

  • Proactive engagement - give Patterns high-level analytical objectives and let it push relevant information to end-users.

  • Auto-context - let Patterns create its own learning and context from interactions with users.

  • Validation flows - an operator interface for the data team to manage and oversee knowledge suggestions.

  • Native integrations with data catalogs, semantic layers, and BI tools.

Analytics in natural language