
We generally combine what we learn from interviews/usability testing with what we can learn from analytics. Analytics often surfaces use patterns of the 'we can clearly see that users are doing X, but we don't understand why' variety. Then we can craft testing/interviews that help us understand the why. So that's analytics helping us target our interviews/user testing.

It also works the other way. Say user testing indicates users more often get to where they need to be with design A than with design B. But user testing is always contrived: users are in an "I'm being tested" mode, not an "I'm actually using the internet for my own purposes" mode, so it's hard to be sure they'll act the same way in vivo. With analytics you can look for users making the specific move your testing indicated they would. If they do, great. But if not, you know your user testing missed something or was otherwise off base.
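To make that last step concrete, here's a rough sketch of what "look for users making the specific move your testing indicated" can boil down to in practice. All of it is hypothetical: the event data, the column names, and the "home" to "pricing" move are made up for illustration, and a real pipeline would pull from your actual analytics export.

    # Minimal sketch: did users in production actually make the move
    # that user testing predicted? (All data and names are hypothetical.)
    import pandas as pd

    # Hypothetical analytics export: one row per pageview event.
    events = pd.DataFrame({
        "user_id": [1, 1, 2, 2, 3, 3, 4],
        "variant": ["b", "b", "b", "b", "a", "a", "b"],
        "page":    ["home", "pricing", "home", "home", "home", "pricing", "home"],
    })

    # For each variant: of the users who hit the start page,
    # what share ever reached the target page?
    def reach_rate(df, start="home", target="pricing"):
        started = set(df[df["page"] == start]["user_id"])
        reached = set(df[df["page"] == target]["user_id"])
        return len(started & reached) / max(len(started), 1)

    for variant, group in events.groupby("variant"):
        print(variant, round(reach_rate(group), 2))

If the variant your testing favored doesn't show the expected move at a higher rate in real traffic, that's the signal that the testing setup missed something.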


