References for Brightwork UX Research Articles
Last Updated on October 22, 2021 by Shaun Snapp
These are the references used for the UX research articles at Brightwork Research & Analysis, along with interesting quotes from these references. Learn why so few entities in many areas include references in their work.
You can select an article title to be taken to the article.
Reference #1: Article Titled:
Fresh Consulting: A Review of The Websites of UX Companies
Reference #2: Article Titled:
Answer Lab: A Review of The Websites of UX Companies
Reference #3: Article Titled:
The Brightwork Remote UX Research Approach
Doing remote UX research with the tools of the past.
Often, participants didn’t have the right technology to participate easily in remote sessions. Internet connections, especially in homes, were not nearly as fast as they are today, which could make screen sharing and the use of Webcams difficult. Plus, most participants didn’t have Webcams or microphones on their computers, so they had to call into online meetings by phone.
Benefits of the new tools.
Tools such as Zoom and WebEx can also transcribe the audio and present the transcription alongside your video recordings. While the software doesn’t always capture people’s exact words perfectly, the transcriptions are good enough to make it much easier to find and watch specific parts of videos, either to supplement your notes or to create video clips. When you download the videos, WebEx adds the transcriptions as captions at the bottom of the video frames.
Before beginning your sessions, determine the best monitors on which to position all of these items. Here’s how I set up my monitors. On my central monitor, I usually set up the Excel spreadsheet that contains both my discussion guide and my notes so it takes up the left half of the screen. This monitor has my Webcam on top of it, which lets me look at my questions and type notes while I’m looking in the general direction of the Webcam, making it appear that I’m making eye contact with the participant. The right half of this central monitor displays the software I’m using to communicate with the clients and team members observing the sessions so I can easily notice and answer their questions.
User in a Car?
When you recruit and schedule participants, tell them that they cannot join their session from a car or other vehicle. Some people want to participate in the session during their workday, so they go out to their car to have some privacy and perhaps avoid getting in trouble at work. The problem is that their company’s Wi-Fi connection usually is not strong enough out in the parking lot for them to participate effectively in an online session.
Talking Out Loud
Some participants seem to have difficulty thinking aloud during in-person sessions, so you might constantly have to prod and remind them to think aloud, although it comes naturally to others. However, I've found that, for some reason, thinking aloud seems to feel more natural to participants in remote sessions.
Easier to Record the Session Remotely
In-person sessions that you conduct in a research facility often require that you set up recording software, cameras, and microphones to capture both the participant’s computer or mobile-device screen and video and audio of the participant and UX researcher.
You’ll usually pay much less attention to participants’ Webcam video during remote sessions and focus much more on the screen they’re sharing.
During remote research, I find that I focus more on what the participant is doing and saying. So I feel that I’m better able to pick up on hesitations and the tone of the participant’s voice than I would during in-person research.
Security and IP Protection
When participants share their screen during the online meeting, ask them to share their entire screen, not just their Web browser. This makes it easier to see whether they’re trying to take screenshots of the prototype or record their screen. If you notice that they are taking screenshots or using recording software, politely ask them to stop.
Overcoming Collaboration Limitations
You can try to replicate the observation-room dynamic by using collaborative, whiteboarding tools such as Miro or Mural. This helps keep observers more actively engaged and paying attention to the sessions. Schedule debrief time between the sessions for observers to discuss your findings and collaborate on analysis. Tools such as Miro and Mural also let observers draw, create diagrams, and type up their insights on virtual Post-it notes that they can move around the screen and group into categories.
On Balance: Remote UX Testing
Although remote UX research does have some disadvantages, the advantages of remote research greatly outweigh them.
Remote, unmoderated sessions: Tools such as Lookback, dscout, and Userbrain help capture qualitative insights from video recordings and think-aloud narration from users. Tools such as Koncept App and Maze capture quantitative metrics such as time spent and success rate. Many platforms have both qualitative and quantitative capabilities, such as UserZoom and UserTesting. (Be sure to check whether these tools work well with mobile applications, as needed.)
Remote, moderated sessions: Any video conferencing platform that has screen sharing, call recording, and the ability to schedule meetings in advance is likely to meet the needs of most teams. Zoom, GoToMeeting, and Google Hangouts Meet are frequently used. (Remember to consider platforms that do not require participants to download anything to join the meeting.)
Additionally, our high estimate for an unmoderated platform subscription is $100 per month. Organizations can spend much more than that depending on the tool. Some large platforms with lots of features and tools (like UserZoom) can cost thousands of dollars per month.
Reference #4: Article Titled:
Moderated Versus Unmoderated Remote UX Research Testing
It’s also worth noting that the metrics-only tools included in this diagram, Maze and KonceptApp, are both designed to be used for testing prototypes and are not suitable for testing live websites or applications. Although they can simulate interactions, such as letting test participants click a link and move to another screen, this behavior requires you to actually build or import an interactive prototype.
For example, moderated usability testing (whether in-person or remote) is more appropriate for evaluating an early-stage prototype or for identifying usability issues in an interface or in tasks that are so complex that it's necessary to provide personalized directions and ask follow-up questions to fully understand users' behavior.
Who gets paid what?
Monetary incentives were paid to only 10% of participants in internal studies, such as tests of intranets or MIS systems. This finding corresponds well with our recommendation not to pay a company’s own employees extra money simply to participate in usability testing, because they are already being paid for their time
(see Tip 26). About a third (35%) of companies did provide a non-monetary incentive to internal test participants, most commonly a small gift, such as a coupon for a free book or a lunch in the company cafeteria. In contrast, participants recruited from the outside most often received cash as their incentive for coming to the test. 63% of external users received monetary compensation, 41% received non-monetary incentives, and 9% didn’t get anything.
(The numbers total more than 100% because a lucky 13% of external users were given both monetary and non-monetary incentives.) The average incentive paid to external users was $64 per hour of test time. Again, the US West Coast was the most expensive with an average incentive of $81 per hour.
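The quoted incentive percentages reconcile exactly once the overlap is accounted for, which a simple inclusion-exclusion check makes explicit. A minimal sketch in Python, using only the figures quoted above:

```python
# Sanity-check the external-participant incentive figures quoted above.
monetary = 63      # % of external users receiving cash
non_monetary = 41  # % receiving non-monetary incentives
both = 13          # % receiving both (counted in each group above)
nothing = 9        # % receiving no incentive at all

# Inclusion-exclusion: share of users receiving at least one incentive type.
at_least_one = monetary + non_monetary - both
print(at_least_one)            # 91
print(at_least_one + nothing)  # 100 -- the figures are internally consistent
```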
If you are a small company, or if you haven’t done much user testing in the past, we recommend that you consider hiring a professional recruiting agency.
Reference #5: Article Titled:
The Importance of Lower Common Denominator UX Research Testing
Reference #6: Article Titled:
The Number of Test Subjects Versus the Number of UX Test Observers
Reference #7: Article Titled:
Prompting the UX Test Subject to Find Information and Pathways
Reference #8: Article Titled:
The Problem With the Term UX or User Experience
Reference #9: Article Titled:
The Contradiction Between Agile and UX Research
We ran an unmoderated remote usability study with 5 participants: 3 prospective students and 2 parents of prospective students. (Even with just a few users, it’s possible to identify most of the main usability issues in a design.) We asked participants to explore the site to see if the university is a good match for their needs. Then we asked participants to find answers to questions that are typical for users of such sites (e.g., the cost of tuition, the application deadline).
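The parenthetical claim that a handful of users surfaces most usability issues is commonly justified with the Nielsen-Landauer problem-discovery model, found = 1 - (1 - L)^n, where each user uncovers a fraction L of all problems. A sketch under the assumption L = 0.31, the average Nielsen and Landauer reported, which is not a figure from the study quoted above:

```python
# Nielsen-Landauer problem-discovery model: the share of usability
# problems found by n test users, each finding a fraction L of them.
# L = 0.31 is the published average, assumed here for illustration.
def problems_found(n: int, L: float = 0.31) -> float:
    return 1 - (1 - L) ** n

print(round(problems_found(5), 2))  # ~0.84 -- 5 users find roughly 84% of issues
```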
Reference #10: Article Titled:
What is the General ROI or Return from UX Research?
User testing is different from focus groups, which are a poor way of evaluating design usability. Focus groups have a place in market research, but to evaluate interaction designs you must closely observe individual users as they perform tasks with the user interface. Listening to what people say is misleading: you have to watch what they actually do.
Before starting the new design, test the old design to identify the good parts that you should keep or emphasize, and the bad parts that give users trouble.
Unless you’re working on an intranet, test your competitors’ designs to get cheap data on a range of alternative interfaces that have similar features to your own. (If you work on an intranet, read the intranet design annual to learn from other designs.)
Reference #11: Article Titled:
Evaluation of the NIH Website Usability