---
slug: contact-center
date: 2024-05-26 16:36:00 -0500
title: Contact Center
summary: "Explore methods of service delivery that improve customer experience in government contact centers."

# See all topics at https://digital.gov/topics
topics:
  - contact-centers
  - customer-experience

event_cop:
  - contact-center

community_list:
  - platform: listserv
    type: government
    subscribe_email: "[email protected]"
    subscribe_email_subject: "Join the Contact Center community"
    terms: "Government employees and contractors with an official .gov or .mil email are eligible to join."
    members: 1,000
    join_cop_button: "Contact Center community members"

# Controls how this page appears across the site
# 0 -- hidden
# 1 -- visible
weight: 1

# Spotlight Digital.gov Communities of Practice (COP) at top of /communities
dg_highlight: true
dg_shortname: Contact Center
dg_acronym: CC
dg_logo: communities-contact-center.svg

kicker: "Join the Contact Center community"

primary_image: "white-bg-digital-gov-card-community"
---

Government contact centers are often the primary means of communication between an organization and its customers. In government, contact centers often set expectations for satisfaction and trust across the customer experience. Simply put, the contact center is where the voices of the organization and its customers come together.

The Contact Center Community was founded so that government contact center professionals can collaborate to improve the experience of citizens and customers when they contact federal, tribal, state, and local agencies.

## What We Do

Our mission is to share information and news, identify best-in-class contact center practices, and evaluate evolving contact center technologies.

Educational forums and discussions help members find innovative methods for improving the customer experience with their agency’s contact centers. Members share ideas, ask questions, and request assistance on their agency’s projects through our community Listserv. All federal, tribal, state, and local government employees and contractors working toward best-in-class contact centers are invited to join the community.

## Who We Are

We are a collaborative interagency group of contact center professionals working together to improve the performance, experience, and efficiency of government contact centers across federal, tribal, state, and local agencies.

## Related Resources

- [Topic: Contact centers](https://digital.gov/topics/contact-centers/)
- [Contact center guidelines](https://digital.gov/resources/contact-center-guidelines/)
- [Contact center technologies](https://digital.gov/resources/contact-center-guidelines/contact-center-technologies/)
- [Customer experience](https://digital.gov/topics/customer-experience/)

---
# View this page at https://digital.gov/2024/06/25/unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement
# Learn how to edit our pages at https://workflow.digital.gov

slug: unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement
date: 2024-06-25
title: "Unsolicited data: A valuable resource for digital customer experience enhancement"
summary: "Optimizing federal service touchpoints involves analyzing both actively sought and spontaneous feedback, introducing new metrics and data points."

# See all authors at https://digital.gov/authors
authors:
  - isabel-izzy-metzger

# See all topics at https://digital.gov/topics
topics:
  - analytics
  - user-experience

# Controls how this page appears across the site
# 0 -- hidden
# 1 -- visible
weight: 1

primary_image: ""
---

When it comes to digital federal services, the stakes are high. For millions of Americans, these platforms are not just convenient online options; they are lifelines for essential needs. As highlighted in [Transforming the American digital experience: A report about what’s next for the U.S. Web Design System](https://designsystem.digital.gov/next/introduction/):

{{< card-quote text="For millions, access to digital services isn’t a luxury — it’s critical. And their experiences using government websites to find unemployment support, file taxes, apply for student loans, or get assistance with housing, childcare, or food can dramatically affect how they feel about the government." cite="" >}}

Understanding and optimizing digital touchpoints requires listening to users in the various solicited and unsolicited ways they communicate with the government. The [21st Century Integrated Digital Experience Act](https://digital.gov/resources/delivering-digital-first-public-experience/) (21st Century IDEA) emphasizes that federal agencies must design their digital services, including websites and applications, to meet user goals, needs, and behaviors, based on analysis of various types of data.

Agencies collect data on their services through a variety of avenues, but teams often face barriers to leveraging that data, including unfamiliarity with the data’s format, with methods for analyzing it, and with the types of insights it can uncover.

This blog post provides a checklist for getting started with analyzing unsolicited customer feedback.

You can learn more about survey design and analyzing solicited customer feedback in these blog posts: [Decoding public sentiment](https://digital.gov/2023/10/30/decoding-public-sentiment-harnessing-open-data-to-gain-insights-into-service-delivery/) and [Amplifying customer voices](https://digital.gov/2023/12/19/amplifying-customer-voices/).

## Unsolicited, unstructured customer feedback: A treasure trove of data

Structured data, such as numerical survey ratings and website traffic, represents only a fraction of the available data on public interactions with digital services. Unstructured data – such as open-ended survey responses, emails, chatbot conversations, and search queries – can also be mined for insights. Both structured and unstructured data can be further categorized as “solicited” or “unsolicited,” depending on whether the provider or the user initiated the exchange.

The table below shows the types of customer feedback, along with examples:

<table class="usa-table usa-table--striped">
<caption></caption>
<thead>
<tr>
<th scope="col">Types of feedback</th>
<th scope="col">Example sources</th>
</tr>
</thead>
<tbody>
<tr>
<th scope="row"><strong>Solicited and structured</strong>: Requested by a provider and limited to predefined response options</th>
<td>Numerical ratings in survey responses and comment cards</td>
</tr>
<tr>
<th scope="row"><strong>Solicited and unstructured</strong>: Requested by a provider but free-form</th>
<td>Open-ended survey responses, customer advisory board interviews</td>
</tr>
<tr>
<th scope="row"><strong>Unsolicited and structured</strong>: Initiated by a customer but limited to predefined response options</th>
<td>Product and service ratings on third-party review sites</td>
</tr>
<tr>
<th scope="row"><strong>Unsolicited and unstructured</strong>: Initiated by a customer and free-form</th>
<td>Social media posts, contact center calls, emails, and chats</td>
</tr>
</tbody>
</table>

## A key distinction between solicited and unsolicited feedback

Solicited feedback reflects responses to specific inquiries posed by the service provider, and it may be skewed toward more-engaged users who are willing to participate in surveys. While extremely valuable, solicited feedback presents a view constrained by the questions asked. Unsolicited feedback, by contrast, arises from the user’s initiative. It offers another view into the customer experience and captures a broad spectrum of user experiences, including needs that might not surface in a structured survey.

## Examples of unsolicited data, and questions to ask

Digital service providers can leverage unsolicited customer experience (CX) data to understand different user segments and their needs, preferences, and behaviors (e.g., users’ search habits). This can lead to specific changes in how your digital service is provided—for example, by proactively generating or modifying content, or by creating a more targeted experience for specific customer types.

Emails are a great place to start for customer feedback analysis, especially since they are a data stream you are already collecting. Email data is particularly valuable for digital service providers not yet collecting user survey data. Even if you are collecting survey data, emails are important to analyze because usage may differ across feedback channels (e.g., some users may be more likely to email their feedback than to participate in surveys). Additionally, unsolicited feedback in emails can uncover areas not addressed in surveys. For example, emails may capture new user groups outside of those predefined in questionnaires, or provide deeper insight into how well a provider is delivering customer service.

You can analyze the content of customer-initiated emails to address questions such as:

* What types of information and services do users need?
* Is there recurring negative and positive feedback in the emails?
* How have the topics of emails changed over time?

You can also segment the emails into customer groups and delve into group-specific trends and patterns. Such customer experience insights can reveal non-obvious findings about service delivery, such as discovering that customer interactions via email might be a more powerful CX-improvement lever for the digital service provider than website content.
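
As a concrete starting point for the email questions above, the sketch below tags customer-initiated emails with keyword-defined topics and then counts those topics by month and by customer segment. It is a minimal sketch: the column names, keyword lists, and sample data are invented for illustration and do not come from any agency system.

```python
# Minimal sketch: tag emails with keyword-defined topics, then count topics
# by month and by customer segment. All names and data here are illustrative.
import pandas as pd

# Hypothetical export of customer-initiated emails
emails = pd.DataFrame({
    "received": pd.to_datetime(["2024-01-03", "2024-01-17", "2024-02-02", "2024-02-20"]),
    "segment": ["job_seeker", "employer", "job_seeker", "job_seeker"],
    "body": [
        "I am having trouble resetting my password, please assist.",
        "Where can I post a job listing on the site?",
        "The training finder page will not load on my phone.",
        "Please assist: I cannot find local resume workshops.",
    ],
})

# Simple keyword taxonomy; a real analysis might use a topic model instead
TOPICS = {
    "account_access": ["password", "log in", "login"],
    "site_issue": ["will not load", "error", "broken"],
    "help_request": ["please assist", "having trouble"],
}

def tag_topics(text):
    """Return every topic whose keywords appear in the email body."""
    lowered = text.lower()
    return [t for t, kws in TOPICS.items() if any(k in lowered for k in kws)]

emails["topics"] = emails["body"].apply(tag_topics)
emails["month"] = emails["received"].dt.to_period("M")

# One row per (email, topic); emails matching no topic drop out
tagged = emails.explode("topics").dropna(subset=["topics"])

print(tagged.groupby(["month", "topics"]).size())    # topic trends over time
print(tagged.groupby(["segment", "topics"]).size())  # group-specific patterns
```

From counts like these, you can watch a topic’s share rise or fall month over month, or spot a topic concentrated in a single customer group.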

### Website search data

Analyze the search terms visitors use on the site to see their frequency and patterns over time (a minimal sketch follows the list below).

* What are the top search terms used in the digital platform’s search tools, and what does this indicate about your users’ interests?
* Are there new and emerging search terms that can inform your content strategy?
* Use the website’s URL structure to correlate search terms with user groups; this can help inform how the website caters resources to its diverse user groups.
* Analyze user behavior before and after major site updates to measure the impact of those changes. For example, did a new feature lead to an increase in searches for specific content or services?
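
Here is a minimal sketch of the first two questions above: counting top search terms per month and flagging terms that newly appear. The (month, query) log format and the sample queries are assumptions for illustration.

```python
# Minimal sketch: top and emerging on-site search terms per month.
# The (month, query) log format and the sample queries are illustrative.
from collections import Counter

searches = [
    ("2024-04", "resume help"), ("2024-04", "unemployment benefits"),
    ("2024-04", "resume help"), ("2024-05", "apprenticeships"),
    ("2024-05", "resume help"), ("2024-05", "apprenticeships"),
]

by_month: dict[str, Counter] = {}
for month, query in searches:
    by_month.setdefault(month, Counter())[query.lower()] += 1

# Frequency and patterns over time
for month in sorted(by_month):
    print(month, by_month[month].most_common(3))

# Emerging terms: searched this month but absent the month before
emerging = [q for q in by_month["2024-05"] if q not in by_month["2024-04"]]
print("Emerging terms:", emerging)
```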

### Emails

Analyze the content of the emails to understand what customers who initiate them are requesting.

* What types of information and services do they need?
* Is there recurring negative and positive feedback in the emails that echoes what we found in the solicited data?
* How have the topics of emails changed over time?
* Conduct a deeper sentiment analysis (the process of analyzing digital text to determine whether the emotional tone of the message is positive, negative, or neutral) to understand the emotional signals in customer emails; see the sketch after this list.
* Segment emails into customer groups to identify group-specific trends and patterns.
* What percentage of customer emails received a response? What was the sentiment of the overall email interaction?
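
For the sentiment analysis item above, the sketch below scores sample emails with NLTK’s off-the-shelf VADER analyzer. The sample emails are invented, any other sentiment model could be substituted, and the ±0.05 cutoffs are VADER’s conventional thresholds.

```python
# Minimal sketch: score the sentiment of customer emails with NLTK's VADER.
# Sample emails are illustrative; any sentiment model could be swapped in.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

emails = [
    "Thank you, the new application form was quick and easy!",
    "I have been waiting three weeks for a reply and I am very frustrated.",
]

for body in emails:
    compound = sia.polarity_scores(body)["compound"]  # -1 (negative) to +1 (positive)
    label = "positive" if compound >= 0.05 else "negative" if compound <= -0.05 else "neutral"
    print(f"{label:>8} ({compound:+.2f}): {body}")
```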

### Chatbot conversations and queries

Analyze the types of queries and issues raised by users to understand needs and service effectiveness.

* Can we identify chatbot user archetypes and understand their respective interaction patterns?
* What are the key factors of website chatbot user experience? What impact do response relevance and dialogue helpfulness have?
* Look at the number of interactions between the chatbot and the user, as well as the emotions detected throughout the chat conversation. Is there a peak negative interaction? At what point?
* Would sentiment pattern mining of chatbot dialogues show common user journeys — for example, from negative to positive, indicating a possible resolution, or from neutral to negative, indicating dissatisfaction? (A sketch follows this list.)
* Look at the volume of chatbot interactions over time to identify spikes that may correlate with events such as COVID-19 or major site updates.
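
To make the journey-mining idea concrete, here is a minimal sketch that classifies a conversation’s sentiment trajectory and locates its peak negative turn. The per-turn scores are assumed to come from a sentiment model (such as the VADER sketch above), and the sample values are invented.

```python
# Minimal sketch: classify a chatbot conversation's sentiment journey and find
# its most negative turn. Per-turn scores are assumed to come from any
# sentiment model; the sample values below are illustrative.
def analyze_journey(turn_scores: list[float]) -> dict:
    """turn_scores: one compound sentiment score per user turn, in order."""
    worst_turn = min(range(len(turn_scores)), key=turn_scores.__getitem__)
    start, end = turn_scores[0], turn_scores[-1]
    if start < 0 <= end:
        journey = "negative-to-positive (possible resolution)"
    elif start >= 0 > end:
        journey = "drifting negative (possible dissatisfaction)"
    else:
        journey = "stable"
    return {"journey": journey, "peak_negative_turn": worst_turn,
            "peak_negative_score": turn_scores[worst_turn]}

# Example: user starts frustrated; the chatbot resolves the issue by the last turn
print(analyze_journey([-0.62, -0.30, 0.10, 0.55]))
```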

## You can’t change what you don’t measure: New performance metrics for digital services to consider

Shortcomings of CX metrics based solely on structured data — such as website hits and solicited survey satisfaction ratings like [CSAT and NPS](https://digital.gov/2016/08/05/csat-nps-ces-3-easy-ways-to-measure-customer-experience-cx/) — are well-documented.

As highlighted in an article about using artificial intelligence to track how customers feel <sup><a aria-describedby="footnote-label" href="#fn1" id="footnotes-ref1">[1]</a></sup>:

{{< card-quote text="Companies spend huge amounts of time and money in efforts to get to know their customers better. But despite this hefty investment, most firms are not very good at listening to customers. It’s not for lack of trying, though — the tools they’re using and what they’re trying to measure may just not be up to the task. Our research shows that the two most widely used measures, customer satisfaction (CSAT) and Net Promoter Scores (NPS), fail to tell companies what customers really think and feel, and can even mask serious problems." cite="" >}}

To address this within the General Services Administration (GSA), the Office of the Chief Financial Officer’s Analytics and Decision Support Division created two open-source tools, the [DigitalCXAnalyzer](https://github.com/GSA/DigitalCXAnalyzer.git) and the [GovCXAnalyzer](https://github.com/GSA/GovCXAnalyzer/). These tools make it easier to systematically process and analyze vast and diverse structured and unstructured customer feedback data, supporting trend analysis, sentiment analysis, topic modeling, and more. The team at GSA also developed and published code that generates new performance metrics from structured and unstructured data, which could give more granular depictions of customer experience.
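
To give a flavor of what topic modeling over feedback text involves, the sketch below fits a small latent Dirichlet allocation (LDA) model with scikit-learn. This is a generic illustration, not code from the DigitalCXAnalyzer or GovCXAnalyzer repositories, and the tiny corpus is invented.

```python
# Minimal sketch: discover recurring themes in feedback text with LDA.
# Generic scikit-learn usage on an invented corpus; not the GSA toolkit's code.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

feedback = [
    "password reset link never arrives, cannot log in to my account",
    "login page errors out after password change",
    "found great local training programs, site was easy to navigate",
    "training finder helped me locate classes near me",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(feedback)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# Print the top words that characterize each discovered topic
terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(lda.components_):
    top = [terms[j] for j in weights.argsort()[::-1][:5]]
    print(f"Topic {i}: {', '.join(top)}")
```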

The code supports calculating metrics that offer a more comprehensive understanding of how different users engage with a platform’s services; an example metric is shown below. The metrics range in difficulty to implement, but they offer new ways of understanding digital service performance and customer experience. While the toolkit was piloted with the Department of Labor’s Employment and Training Administration’s CareerOneStop platform, the code repository can be adapted and applied across various digital services.

{{< note >}}**Example of one of the metrics a user can generate from the code repository**

**Help Request Rate Measure**: You can monitor the rate at which different user groups make help requests by applying keyword matching to emails (e.g., “please assist,” “having trouble”), which helps identify and measure gaps across your user groups. For example, if certain user groups exhibit higher help request rates, this could indicate usability issues that disproportionately affect those users. Differences in rates across user groups could also point to issues with content clarity: content that resonates with one user group might be confusing or less intuitive for another, prompting more help requests.{{< /note >}}
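
Here is a minimal sketch of how such a measure could be computed. The help phrases, group labels, and sample emails are invented for illustration; this is a sketch of the idea, not the GovCXAnalyzer implementation.

```python
# Minimal sketch of a Help Request Rate measure: the share of each user group's
# emails containing help-seeking phrases. Keywords, group labels, and sample
# data are illustrative assumptions.
HELP_PHRASES = ("please assist", "having trouble", "need help", "can't figure out")

emails = [
    {"group": "job_seeker", "body": "I'm having trouble uploading my resume."},
    {"group": "job_seeker", "body": "Great site, found a workshop nearby!"},
    {"group": "employer",   "body": "Please assist with posting a job opening."},
    {"group": "employer",   "body": "Thanks for the quick turnaround."},
]

rates: dict[str, list[int]] = {}
for email in emails:
    is_help = any(p in email["body"].lower() for p in HELP_PHRASES)
    rates.setdefault(email["group"], []).append(int(is_help))

for group, flags in rates.items():
    print(f"{group}: help request rate = {sum(flags) / len(flags):.0%}")
```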

## Call to action: Embrace unsolicited customer feedback

In a time when access to digital government services is critical, understanding and enhancing user interactions with those services is essential. Traditional analytics and metrics fall short of capturing the full spectrum of user experiences. GSA encourages readers to adopt a more comprehensive approach to digital user experience data analysis by leveraging the unsolicited feedback that users are already providing.

There are several areas where your agency can begin to tap into unsolicited feedback:

{{< box >}}We recommend starting with the data you already have and then incorporating more data and advanced analytics over time.

**Inventory existing CX data**: Identify the data you currently have, its sources, and its formats (structured vs. unstructured). This includes both solicited data (such as surveys and feedback forms) and unsolicited data (such as emails).

**Create a CX pilot**: Start small, with simpler implementations and straightforward analysis questions, to test your approach, tools, and team’s analytical capabilities. Starting with a pilot can also help avoid substantial information technology (IT) changes. For example, if big data cloud processing is not an option for you, consider analyzing a subset of CX data on a government-furnished computer or virtual desktop. Apply the open-source Python [CX analysis toolkits](https://github.com/GSA/GovCXAnalyzer/tree/main/notebooks/digitalcx) to the available data to get preliminary insights on specific customer experience issues or opportunities.

**Start using new CX metrics**: Leverage the CX analysis toolkit [to implement new CX metrics](https://github.com/GSA/GovCXAnalyzer/blob/main/notebooks/digitalcx/digital_metrics.py). In the toolkit, measures such as Help Request Rate, User Sentiment Score, Resource Utilization Rate, and Service-Specific Metrics enable you to make informed decisions by pinpointing specific issues and successes within different user groups or service areas. For instance, a high Help Request Rate can directly signal areas where users need more support, allowing you to focus improvements where they are most needed. Similarly, the User Sentiment Score gives a detailed view of user emotions, which can guide enhancements in user interaction and service delivery. We encourage you to adopt these measures and metrics to understand your service’s impact, actively refine your strategies, and drive meaningful improvements in your digital offerings.{{< /box >}}

We hope this work holds value for you and your agency. If you have questions or would like to learn more about this work, please reach out to the Analytics and Decision Support Division at [[email protected]](mailto:[email protected]).

<footer>
<h3 id="footnote-label">Footnotes</h3>
<ol>
<li id="fn1">Zaki, Mohamed, Janet McColl-Kennedy, and Andy Neely. 2021. “Using AI to Track How Customers Feel — in Real Time.” <em>Harvard Business Review</em>, May 4, 2021. <a href="https://www.hbr.org/2021/05/using-ai-to-track-how-customers-feel-in-real-time">www.hbr.org/2021/05/using-ai-to-track-how-customers-feel-in-real-time</a> <a href="#footnotes-ref1" aria-label="Back to content">↩</a></li>
</ol>
</footer>