Measuring to Improve: Our Approach to Customer Satisfaction

How we aim to turn interactions with our technical support into a valuable experience 

In recent years at WEGG, we’ve taken the time to reflect on how we deliver technical support. Together with the Marketing Team, we worked with a clear goal: aligning our service more closely with the Focus on Value principle, one of the pillars of ITIL 4.

For a long time, Technical Support was seen as a reactive service: a problem arises, a ticket is opened, we intervene, we close it. A structured flow, measured through metrics like response times and resolved tickets. Useful indicators, but incomplete ones—they only describe the most visible part of the service.

 

Behind every data point lies an experience. There are people, expectations, frustrations, relief. A simple example: a customer loses phone connectivity. The ticket is closed within the expected time, but in the meantime the customer had to repeat the issue multiple times to different operators and received no clear answers.

 

The result? A negative feeling, one that weighs more than any SLA met.

 

Step 1 – Redefining the Concept of Value 

So the first question we asked ourselves was: Are we really measuring the customer experience? Numbers help us understand operational efficiency, but they don’t describe how the customer feels or how supported they feel. This is why, alongside the technical metrics we were already collecting, we added qualitative measurements focused on experience.

We revisited the very concept of value. For us, it’s not the speed with which a ticket is closed; it’s about what the customer perceives throughout the entire journey—welcome, clarity, attention, and the reassurance of not being left alone.

This change in perspective led us to imagine support not as a switch to be flipped when needed, but as continuous guidance: we listen, contextualize, and build the best solution together.

Improving the experience doesn't mean acting only on the resolution phase; it also means working on the earlier steps. For example, feedback showed that opening tickets felt complicated, so we introduced a more immediate channel, sending requests via email to [email protected], to simplify the process and reduce customer stress.

 

Step 2 – Building a Measurement Model 

We divided the experience into three key moments—ticket opening, management, and closure—and for each one we identified the aspects that most influence service perception, allowing us to explore them in the feedback questionnaire.

 

1. Opening: the first impression

The entry point often sets the tone for everything that follows. If the process is simple, the customer already feels the situation is in good hands. This is why we measure ease and speed with a direct question: “Was it easy to submit your support request?”

 

2. Management: the heart of the relationship

It’s during management that the true quality of the service is perceived. It’s not only about technically correct answers—it’s about how well we read the customer’s situation. So we ask: “Was the attention given to you adequate for your problem?”

Once the context is clear, we recalibrate the priority together with the customer and decide what actions to take. This is where our new approach comes in: we don’t treat tickets as automatic operations—we immerse ourselves in the customer’s situation to provide truly informed and proportionate answers.


In other words, management is not only about solving a problem: it’s about accompanying the customer while they face it.

 

3. Closure: the final assessment 

Once the problem is resolved, the customer evaluates the experience mainly on two aspects:

  • the quality of the solution, 
  • the adherence to expected timing. 

 

An effective response delivered within the right timeframe can turn even a critical incident into a positive experience. 

We also include one final question: “How would you rate your overall experience with our technical support service?”

To better capture nuances, we chose a 1–6 scale, which helps distinguish a fully positive experience from one that, while not negative, could be improved. Intermediate scores like 3 or 4 are often the most insightful, indicating that something didn’t meet expectations.

For this reason, when the rating falls below a certain threshold, the questionnaire automatically opens a short follow-up question with options to clarify what didn’t work. This way, feedback doesn’t remain an isolated number—it becomes a small story, a concrete context from which we can learn.

Thanks to this structure—stages, questions, scores, and follow-ups—we can better understand the customer experience from both a technical and relational perspective.
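
To make that structure concrete, here is a minimal sketch of how the stages, the 1–6 scale, and the conditional follow-up could fit together. The field names, the threshold value, and the follow-up options are our own simplification for illustration, not the actual configuration of our platform.

```python
from dataclasses import dataclass

# Illustrative only: names, threshold, and options are assumptions,
# not the real platform schema.

FOLLOW_UP_THRESHOLD = 4          # assumption: scores of 1-3 trigger the follow-up
SCALE = range(1, 7)              # the 1-6 rating scale described above

@dataclass
class Question:
    stage: str                   # "opening", "management" or "closure"
    text: str

QUESTIONS = [
    Question("opening",    "Was it easy to submit your support request?"),
    Question("management", "Was the attention given to you adequate for your problem?"),
    Question("closure",    "How would you rate your overall experience with our technical support service?"),
]

# Hypothetical options shown when the overall score is low.
FOLLOW_UP_OPTIONS = [
    "The solution did not fully solve my problem",
    "Resolution took longer than expected",
    "I had to repeat the issue to different operators",
    "Other (free text)",
]

def needs_follow_up(overall_score: int) -> bool:
    """Open the short follow-up question when the rating falls below the threshold."""
    if overall_score not in SCALE:
        raise ValueError("score must be on the 1-6 scale")
    return overall_score < FOLLOW_UP_THRESHOLD

if __name__ == "__main__":
    print(needs_follow_up(3))    # True: an intermediate score asks what didn't work
    print(needs_follow_up(6))    # False: a fully positive experience
```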

 

Step 3 – Personalizing the Measurement 

Not all aspects of the service have the same weight in customer perception. To understand this, we didn’t decide ourselves—we asked directly. 

The preliminary survey revealed this order of importance: 

  1. Level of attention to the problem 
  2. Speed of resolution 
  3. Adequacy of the proposed solution 
  4. Ease of opening the ticket 

 

We also collected preferences on how often customers want to receive questionnaires: some want them after every ticket, others monthly or quarterly, and some prefer not to receive them at all.

These preferences—modifiable at any time by contacting the Service Manager—are configured in our platform: each customer receives surveys only when and how they choose. If no preference is indicated, the default frequency is quarterly.
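
For illustration, a minimal sketch of how such a per-customer sending policy could be encoded. The names, intervals, and helper function below are assumptions on our part, not the platform's real configuration.

```python
from datetime import date, timedelta
from enum import Enum

class SurveyFrequency(Enum):
    PER_TICKET = "after every ticket"
    MONTHLY = "monthly"
    QUARTERLY = "quarterly"      # default when no preference is indicated
    NEVER = "do not send"

DEFAULT_FREQUENCY = SurveyFrequency.QUARTERLY

def should_send_survey(preference, last_sent, today, ticket_just_closed=False):
    """Decide whether to send the questionnaire, honouring the customer's preference."""
    pref = preference or DEFAULT_FREQUENCY
    if pref is SurveyFrequency.NEVER:
        return False
    if pref is SurveyFrequency.PER_TICKET:
        return ticket_just_closed
    interval = timedelta(days=30 if pref is SurveyFrequency.MONTHLY else 90)
    return last_sent is None or today - last_sent >= interval

# Example: a customer with no stated preference, last surveyed four months ago.
print(should_send_survey(None, date(2025, 1, 10), date(2025, 5, 12)))  # True (quarterly default)
```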

 

Step 4 – Humanizing Automated Feedback Requests 

To make the feedback process more pleasant and less formal, we introduced a virtual butler: Alfredo.

Polite, light, and tastefully humorous, Alfredo guides customers through the questionnaires with a tone that’s anything but bureaucratic. This approach has increased not only the quantity, but also the quality of responses: human language generates more authentic feedback.

Alfredo also steps in proactively: if a customer hasn’t opened a ticket for a long time, he sends a message to check whether everything is working properly or to detect potential issues that haven’t been reported.
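
A hedged sketch of this proactive check, assuming a 90-day quiet window; the window, the message text, and the function name are purely illustrative.

```python
from datetime import date, timedelta

# Assumption: we consider a customer "quiet" after 90 days without tickets.
QUIET_PERIOD = timedelta(days=90)

def alfredo_check_in(last_ticket_date: date, today: date) -> str | None:
    """Return a light check-in message if the customer has been silent for too long."""
    if last_ticket_date is None or today - last_ticket_date >= QUIET_PERIOD:
        return ("Hello! Alfredo here. We haven't heard from you in a while: "
                "is everything running smoothly, or is there something we should look into?")
    return None

print(alfredo_check_in(date(2025, 1, 5), date(2025, 6, 1)))  # prints the check-in message
```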

 

Step 5 – Turning Data into Concrete Actions 

Collecting feedback only makes sense if the data leads to real actions.

We configured an interactive Power BI dashboard that allows us to view aggregated results in real time. The questionnaires are anonymous, but the ability to analyze data by area helps us understand precisely, during periodic reviews with the whole team, where to intervene and which aspects to focus on with our operators.
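
The dashboard itself lives in Power BI, but as a rough illustration of the kind of aggregation behind it, here is a small sketch. The field names, sample scores, and data layout are assumptions for the example only.

```python
from collections import defaultdict
from statistics import mean

# Anonymous responses: no customer identifier, only the area and the 1-6 score.
responses = [
    {"area": "opening",    "score": 5},
    {"area": "opening",    "score": 4},
    {"area": "management", "score": 6},
    {"area": "closure",    "score": 3},
    {"area": "closure",    "score": 5},
]

scores_by_area = defaultdict(list)
for r in responses:
    scores_by_area[r["area"]].append(r["score"])

# Aggregated view reviewed periodically with the whole team.
for area, scores in scores_by_area.items():
    print(f"{area}: average {mean(scores):.1f} over {len(scores)} responses")
```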

 

Concrete examples: 

  • If ticket opening seems difficult… we simplify forms, improve instructions, open new channels.
  • If perceived attention is low… we analyze communication style and work on operator interaction.
  • If solutions aren’t satisfying… we identify training gaps or technological limits and intervene with coaching or partner escalation.
  • If timings don’t match expectations… we identify bottlenecks and optimize intake, diagnosis, and escalation flows.

 

This continuous cycle of collecting and interpreting feedback allows us to evolve the service in real time, adapting to customers' needs and anticipating critical issues before they become structural problems.

In this way, Customer Care is no longer just a response mechanism but a genuine early-warning system and an engine of continuous improvement for technical support, capable of moving beyond a reactive logic to embrace a proactive, relationship-oriented approach.

Finally, we decided to make the aggregated experience indicators visible directly on the Home page of our Technical Support Portal. This ensures maximum transparency and, at the same time, keeps our team focused so that evaluations remain consistently positive.

 

Conclusion: from ticket to relationship 

Our goal is not only to solve technical problems: we want to build a relationship with each customer based on trust, transparency, and attentive listening. Every piece of feedback is an opportunity to improve, anticipate needs, and make our support increasingly human and helpful. 

This is why we ask all our customers to lend us a hand: their feedback is essential in enabling us to support them better. We are aware that we are not perfect, and precisely for this reason we commit ourselves every day to improving.

By Alberto Andriolo, Service Manager


Want to know more about how we manage Technical Support at WEGG?

Contact us to find out more!