GETSPARKS.IO
How might we provide knowledge workers the tools for a centralized learning experience
A case study detailing the end-to-end process of designing Conversational AI-enabled 'Questions' functionality for getSparks.
The conversational AI-assisted 'Questions' functionality, a solution I helped design to serve a specific persona / ideal customer profile.
GetSparks started as a platform for content curation and comprehension, serving a diverse user base of knowledge workers, researchers, and students.
I helped pivot the product into a comprehensive learning tool by replacing a multitude of content management point solutions with one end-to-end, AI-powered platform.
MY ROLE
Led the strategy and design for Questions, the AI-assisted learning experience for GetSparks. For the duration of this project, I worked with the founder, a product manager, and a full-stack developer.
OUTCOME
Successfully delivered and launched the ship milestone for the GetSparks Questions functionality, resulting in a 3x increase in user engagement and a 20% increase in conversion rate.
Background
Initially, as a platform for content curation and comprehension, getSparks aspired to deliver value to
multiple user personas, offering these two key solutions:
Key Value 1: Bookmarking
Gather and access content from your trusted sources—all in one place. Bookmarks was a key functionality that allowed users to save web links, PDF files, and typed-in text into organized, user-created, topic-led groups/folders.
Key Value 2: Summaries
Learn from your personally curated knowledge sources through detailed, paragraph-level summaries, enabling quick insights from trusted sources that users curated themselves and saving them time, effort, and brain-space.
Addressing User Churn
Initially, the key value offering—"organize, understand and gain insights from personally curated content"—worked well for the intended user bases. Users were saving links from the web and organizing them into topics, while gaining high-level insights at a glance from AI-generated summaries. This saved them significant research time and brain-space, and early user activity was promising. But after an initial surge, users started dropping off at a concerning rate.
User churn is expected in early-stage products, especially pre-PMF, when teams must fail fast and early. Detecting a gap between user expectations and the actual user experience, the team began gathering user feedback and investigating the problem.
This is when I joined the team in early 2022 to investigate the user drop off points, ideate and implement
solutions to curb user churn.
I led the concept, strategy, and design of the AI-assisted learning experience called 'Questions' for GetSparks.
I worked alongside the founder, the product manager and a full stack developer for the duration of this project.
Additionally, I worked on Content Discovery for GetSparks and did a complete UX overhaul of the responsive web platform. That work will be covered in a later case study.
User Personas
We developed three initial personas to align user needs with the product's value offering.
We interviewed users based on these three personas, which we created using existing user data and assumptions. The personas followed our initial segmentation of user bases, described as follows:
Persona 1: The Student
Uses GetSparks to organize study material from web sources and assigned class syllabi. Takes notes and generates summaries for assignments and research.
Persona 2: The Professional Researcher
Relies on GetSparks to manage extensive resources, summarize materials, and support in-depth analyses.
Persona 3: The Knowledge Worker
Uses GetSparks for informal, ad-hoc, everyday problem-solving and personal research. A habitual learner, he collects information on the certification he is pursuing and uses GetSparks to understand it quickly.
As the interviews progressed, we realized one user group was closest to the product's value proposition, and we focused on that specific persona as our Ideal Customer Profile. From then on, we designed specifically for the needs of this persona.
Knowledge workers are defined as people who have to keep track of and process a substantial amount of information on a daily basis.
Product managers, software developers, marketers, lawyers, educationists are some examples of knowledge workers. Here's a knowledge worker persona created to understand the user better:
Detailed persona of a knowledge worker. This persona helped the team visualize the user and their wants/needs.
User Journeys
Synthesizing data, mapping ideal-state user journeys, and working backwards to align design decisions with user
needs and business goals.
I created an Ideal State User Journey Map for the Knowledge Worker persona. The journey map served as a tool to understand the thought process and motivations of the user, and as a conversation tool for team alignment, ensuring that the user was always front and center of product and design decisions.
This is the map we started with. As we talked to more users and our assumptions were challenged, we kept updating it. We used this map to define user insights, user pain points, and gaps in our knowledge.
Key User Insight
From user interviews and close monitoring of usage metrics, we discovered that users liked the summary feature but left GetSparks for Google when they needed deeper insights, often not returning to the app, which caused drop-offs. We were failing to deliver recurring value to the user.
"Summaries gives me a surface-level overview, but when I need more information, I end up on google and lose
track of where I started"
- Interview Participant
Jobs To Be Done
The shift in focus from targeting three user groups to one led us to redefine the job-to-be-done statements. The redefinition targeted user pain points discovered in user interviews and feedback synthesis. The new JTBD statement ensured our solutions aligned more closely with knowledge workers' unique needs and challenges.
Job to be Done (EXISTING)
"How might we help users curate content in one place and understand the curated content"
Job to be Done (REDEFINED)
"How might we provide knowledge workers a centralized learning experience in-platform"
Focusing on one user group allowed us to widen our product scope, resulting in a deeper understanding of user motivations.
Design Explorations
I explored various design directions to ideate ways for the user to accomplish the JTBD.
These were the winning ideas that would help users learn from and add to their knowledge base—closing the learning loop and completing the job they started without needing to leave the platform. These are the design ideas we explored:
Exploration 1: Questions and Answers
A way to answer user questions directly in the platform, eliminating the need for Google.
Exploration 2: Content Discovery Feed
A discovery feed or 'suggested content' merged seamlessly at key interaction points for the user. This content would be sourced from users' trusted media outlets, where 'trusted' is defined as online sources they have saved links from before.
Exploration 3: Personalized Insights Feed
An "Insights" section with quick, digestible, bite-sized information, learned from users' previous interactions, saved links, and other content.
After researching analogous products, analyzing our findings, and reviewing business goals, one big idea stood out, which later took the form of an AI-assisted conversational component called Questions in the GetSparks responsive web app.
Ideas to Action
The winning design direction was 'Questions', a conversational AI assistant for GetSparks designed to leverage generative AI to deliver contextual, intelligent answers.
The answers are based on the user's bookmarked web links, documents uploaded to the platform, and typed-in notes.
With the ability to query their data, users could now get deeper insights from the trusted sources they had curated themselves within GetSparks. This gave users all of the functionality needed across the learning lifecycle in one place. It proved to be a breakthrough functionality at the time and showed promising adoption numbers, increasing product usage by 30%.
A user querying an article they added to GetSparks, both using suggested questions and typing in a specific query.
Questions Flow
Initially, the solution consisted of a simple ask-a-question flow. Multiple iterations and testing cycles later, we added an AI-generated 'Suggested Questions' functionality, contributing to a seamless user experience by helping users get started and eliminating blank-page syndrome. Shown below are both functionalities in one continuous flow:
1. Ask a Question (Suggested)
2. Ask a Question Flow (Self-Typed)
Outcome
Shipped a successful redesign of the search experience. The re-imagined experience of asking questions and
receiving context-driven value-packed answers resulted in an overall 3x increase in user engagement and
a 20%
increase in conversion rate.