Built a dashboard with all the key metrics but noticed most people still ping me for basic numbers.
Starting to think the interface might be too complicated or maybe they just don’t trust the data they’re seeing.
Same thing happened when I rolled out analytics for different app teams. They kept asking me for conversion rates even though everything was right there.
Turns out they wanted to see how the numbers were calculated. I started adding small tooltips showing the math behind each metric - like “DAU = unique users who opened app today” or “ROAS = revenue ÷ ad spend”.
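A cheap way to keep those tooltip formulas honest is to compute the metrics from the exact same definitions. A quick Python sketch — the function names and the event shape are made up, only the formulas come from the tooltips above:

```python
def roas(revenue: float, ad_spend: float) -> float:
    """ROAS = revenue ÷ ad spend (same formula as the tooltip)."""
    if ad_spend == 0:
        raise ValueError("ad spend must be non-zero")
    return revenue / ad_spend

def dau(open_events: list[tuple[str, str]], day: str) -> int:
    """DAU = unique users who opened the app on the given day.

    `open_events` is a hypothetical list of (user_id, date) pairs.
    """
    return len({user for user, event_day in open_events if event_day == day})
```

If the tooltip and the pipeline share one definition like this, they can't drift apart.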
Also made sure the data refreshed at predictable times. Nothing kills trust faster than seeing yesterday’s numbers at 2pm when they expect real-time updates.
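One way to make the refresh schedule predictable is to compute and display the next refresh time right on the dashboard ("data as of X, next refresh at Y"). A minimal sketch, assuming fixed daily refresh hours (the 6/12/18 schedule is just an example):

```python
from datetime import datetime, timedelta

def next_refresh(now: datetime, hours: tuple[int, ...] = (6, 12, 18)) -> datetime:
    """Return the next scheduled refresh time strictly after `now`,
    given fixed daily refresh hours."""
    for h in hours:
        candidate = now.replace(hour=h, minute=0, second=0, microsecond=0)
        if candidate > now:
            return candidate
    # All of today's refreshes have passed: roll over to tomorrow's first slot.
    tomorrow = now + timedelta(days=1)
    return tomorrow.replace(hour=hours[0], minute=0, second=0, microsecond=0)
```

Showing that timestamp kills the "is this stale?" question before it gets asked.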
Once people understood what they were looking at and when it updated, the random Slack messages dropped by maybe 70%.
Training helps too. Most people avoid tools they don’t know how to use.
People skip dashboards because they’re drowning in options or don’t know which numbers matter for their role.
Create team-specific views with just 3-4 relevant metrics. Add context like ‘good when above X’ so they’re not guessing what the data means.
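The team-specific views with "good when above X" context could be driven by a small config so the thresholds live in one place. A sketch — the team names, metric names, and threshold values are all illustrative:

```python
# Hypothetical per-team metric config; each team sees only its own 3-4 metrics.
TEAM_VIEWS = {
    "growth": {"cac_usd": {"good_below": 50}},
    "product": {"d7_retention": {"good_above": 0.25}},
    "revenue": {"ltv_usd": {"good_above": 120}},
}

def status(team: str, metric: str, value: float) -> str:
    """Turn a raw number into the 'good when above/below X' context."""
    rule = TEAM_VIEWS[team][metric]
    if "good_above" in rule:
        return "good" if value > rule["good_above"] else "needs attention"
    return "good" if value < rule["good_below"] else "needs attention"
```

That way nobody has to guess whether 0.22 retention is fine or a fire.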
One metric per page. That’s it. Dashboards with 15 charts? Nobody uses those. People want to click once and get their answer. Build separate pages for each team’s main KPI - growth team gets acquisition cost, product team gets retention, revenue team gets LTV. Big number at the top, trend below, one sentence saying if it’s good or bad. Done.
Put raw numbers right in Slack. Skip the dashboard entirely.
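Pushing numbers into Slack can be as simple as an incoming webhook. A sketch using only the standard library — the metric, its values, and the webhook URL are yours to fill in:

```python
import json
from urllib import request

def format_metric(name: str, value: float, delta_pct: float) -> str:
    """Render one metric as a Slack-markdown line with a week-over-week delta."""
    arrow = "▲" if delta_pct >= 0 else "▼"
    return f"*{name}*: {value:,.0f} ({arrow} {abs(delta_pct):.1f}% vs last week)"

def post_to_slack(webhook_url: str, text: str) -> None:
    """POST `text` to a Slack incoming webhook (configure the URL yourself)."""
    req = request.Request(
        webhook_url,
        data=json.dumps({"text": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)
```

Schedule it daily and the people who never open the dashboard still see the number.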
Test different dashboard versions with a few teammates first. Have them hunt for specific metrics while you watch.
I do this with app campaign tracking and always catch weird assumptions I made about what people expect. Sometimes the data’s perfect but the labels don’t match how they think about their work.
Also check if they need historical comparisons. People usually want to see if this month beat last month, not just current numbers.
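The month-over-month comparison is one formula, so it's worth computing it consistently everywhere it shows up. A minimal sketch:

```python
def mom_change(current: float, previous: float) -> float:
    """Month-over-month change as a percentage of the previous month."""
    if previous == 0:
        raise ValueError("previous month value must be non-zero")
    return (current - previous) / previous * 100
```

Pair the big current number with this delta and most "did we beat last month?" pings answer themselves.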