When evaluating or subscribing to a software platform, we all want to make an informed decision. User reviews are one way for individuals or very small teams to reach a decision quickly, in days or sometimes even hours.
Unfortunately, a quick decision isn’t always an informed decision. User reviews aren’t always what they appear to be. Relying on them can give someone a warm, fuzzy feeling about a purchase that turns out to be a bad experience in the end. In this article we outline twelve reasons why we’ve decided to skip incorporating user reviews in our software evaluation processes here at Doakio.
TL;DR – “Basically, you can rely on software user reviews if the stakes are very low and it doesn’t matter if you make a bad choice.”
- Legitimate Software Reviews vs Corporate Astroturfing
- Inability for Businesses to Respond to Negative Reviews
- Insufficient Number of User Reviews for Small Review Platforms and Software Solutions
- Influenced Software Reviews & Incentivized Software Reviews
- AI Generated Software Reviews
- Software Reviews – Focus on Feelings & Emotional Response
- Review Bombing of Software Solutions
- Aggregation of Software Review Scores
- Software Reviewers’ Use Case vs Your Use Case
- Reviews Often Lack Rigour and Detail
- Software Reviews Age Poorly
- Understanding Unstructured Text Data from Software Reviews is Complex at Scale
- Closing Remarks
Legitimate Software Reviews vs Corporate Astroturfing
Real software reviews used to be an effective, important part of purchasing software. People valued input from neutral parties who had actually used the tool they were considering. But finding an accurate review of software has become increasingly difficult. In an effort to increase sales or burnish their reputation, some corporations write fake software reviews. This is one form of astroturfing: disguising a sponsored message so that it appears to come spontaneously from ordinary, grassroots participants. Today, astroturfing is a serious internet threat. Fortunately, astroturfing detection is becoming an active area of work, since so many sites are flooded with fake messages: news sites, political blogs and review sites among them.
Inability for Businesses to Respond to Negative Reviews
Some platforms do allow businesses to respond to negative software reviews. This is a good thing. But it is challenging to do right. Do you let the customer then respond again to the company’s response? Soon the review platform is playing the role of mediator between two parties in dispute.
But if there is no possibility for the business to leave feedback in response to specific user reviews, then you don’t get to see the strength of that company’s customer support in action. Moreover, an inability for software companies to respond could encourage competitors to disparage other software companies in their space with no repercussions.
Not to mention, publishing and hosting reviews from third parties puts the review platform in a potentially precarious legal position. It can be difficult to be the arbiter of truth because of the sheer amount of work involved and litigation exposure.
Insufficient Number of User Reviews for Small Review Platforms and Software Solutions
To gather meaningful data about software from user reviews, you need a lot of reviews. This is a very real problem both for smaller user-review platforms (like AppsandReports and Reviano) and for smaller, niche software products. For many (or even most) solutions in minor categories, you’ll end up with a sample too small to be statistically meaningful, making the whole evaluation exercise nearly pointless.
The first several reviewers who do opine end up with an inordinate amount of influence on the perception of the software in question, whether their reviews are positive, negative or neutral. One solution to this problem is to withhold all reviews until a minimum review count is reached. But that creates a poor experience for early participants, who take the time to leave a review and then don’t see it appear publicly for weeks or months.
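To see how little a handful of ratings actually pins down, here is a rough sketch (with made-up ratings, not real review data) of the normal-approximation margin of error around a mean star rating. The approximation is itself crude for tiny samples, which is exactly the regime niche products live in.

```python
import math
import statistics

def rating_margin_of_error(ratings, z=1.96):
    """Rough 95% half-width around a mean star rating.
    Normal approximation; crude for tiny samples, which is the point."""
    se = statistics.stdev(ratings) / math.sqrt(len(ratings))
    return z * se

few = [5, 1, 4, 5, 2]    # a niche product with only five reviews
many = few * 40          # the same spread of opinions, 200 reviews

print(round(rating_margin_of_error(few), 2))   # roughly +/- 1.6 stars
print(round(rating_margin_of_error(many), 2))  # a far tighter interval
```

With five reviews, the plausible range around a 3.4-star average spans from "mediocre" to "excellent"; only at hundreds of reviews does the average start to mean much.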
Influenced Software Reviews & Incentivized Software Reviews
The majority of shoppers today look up online reviews before making a purchase. This is one reason companies offer money or other incentives in exchange for reviews. Incentivized and influenced software reviews carry risks: the practice is controversial for businesses, and the resulting reviews usually end up biased. By contrast, an unpaid individual who is naturally inclined to leave a review tends to convey a more accurate sentiment.
Yelp even goes so far as to tell businesses that they shouldn’t ask users for reviews at all. Merely telling someone they could leave a review is construed by Yelp as undue influence, since businesses are likely to mention the review process only to happy customers.
AI Generated Software Reviews
Thanks to modern generative language models like GPT-2, GPT-3 and their many successors, creating human-like customer reviews at very large scale is trivial. It is essentially impossible for a human reader to tell whether a given review was generated by a machine or written by a genuine person.
Take these two customer reviews, for example. Can you guess which one was generated automatically by a machine and which one is from a real human?
Trick question. They were both completely machine generated.
There is, to our knowledge, no effective way for a review platform to defend itself against this kind of fake-review abuse. This is one reason why we’ve decided to forgo publishing and evaluating software reviews in our evaluation processes at Doakio.
Software Reviews – Focus on Feelings & Emotional Response
This is the greatest strength, and the greatest weakness, of user-generated software reviews. Reviews can easily summarize the frustrations, successes, failures, and victories of any software platform with a few sentences or paragraphs laden with emotions and sentiment.
But how reliable is that feeling-based data? All it takes is one bad day from unrelated issues, combined with a bad software experience, to amplify one’s hatred of a company or product. We’ve all been there. We’ve likely all left the occasional scathing review to process the experience and vent. But how reliable is that information, really, for other users who will use the software for different purposes at some unspecified point in the future?
Especially when it comes to software that will be used in a business setting, we humbly suggest that a more analytical approach could yield better business results once the chosen software has been deployed.
Review Bombing of Software Solutions
Review bombing happens constantly in the gaming space, and less often in the business software solution space. But it does happen.
Review bombing is when a large number of people from a given community (or communities) decide that they should collectively punish a company by leaving bad public reviews. The reason behind the flood of bad reviews is often related to the behavior of the software vendor, and not the product itself. These negative reviews may be from legitimate long-time users, or they may even be from people who have become “legitimate” users of the software temporarily for the sole purpose of leaving a negative review.
If you agree with the cause behind the review bombing, then this can be seen as a good thing. But it can be bad if the cause behind the review bombing is unimportant to you (or if you directly disagree with the cause). Then you’ll end up with overall review scores that don’t accurately reflect your evaluation of the software solutions under consideration.
Aggregation of Software Review Scores
When you combine and aggregate reviews, you end up with a user perspective that doesn’t actually exist in reality. You lose the deviation of each individual response from the mean. This approach is appropriate for some applications, but when it comes to evaluating software we simply aren’t all the same. We have different needs and expectations based on our own experiences and challenges.
By reviewing each software solution yourself based on its feature set, you can avoid these kinds of problems.
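As a tiny illustration (the ratings are invented for this example), two products can share an identical average score while describing completely different user experiences; the spread that aggregation throws away is often the interesting part:

```python
import statistics

# Two hypothetical products with identical average ratings
polarizing = [5, 5, 5, 1, 1, 1]   # users either love it or hate it
consistent = [3, 3, 3, 3, 3, 3]   # uniformly lukewarm

for name, scores in (("polarizing", polarizing), ("consistent", consistent)):
    print(name, statistics.mean(scores), statistics.pstdev(scores))
# Both means are 3.0, but the spread tells two opposite stories.
```

A polarizing product may be a perfect fit for one user segment and a disaster for another; an aggregate score of 3.0 hides which segment you belong to.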
Software Reviewers’ Use Case vs Your Use Case
Even without the problems of aggregation or genuine review verification, the same piece of software can be used in different applications. There may be dozens of use cases that are submitted by reviewers, and their responses won’t always align well with your own use case.
You know your own use case(s) better than anyone else. You know the business context and what other tools you already use that the new software must integrate with. Your workflow, habits, and patterns are unique to you and your organization. Much of this nuance is lost or scrambled in a large body of user reviews.
Reviews Often Lack Rigour and Detail
Most software reviews lack the substance needed for intelligent purchasing decisions. In fact, some businesses are now turning to artificial-intelligence algorithms in search of better decision-making.
Today, what every software buyer absolutely needs is transparent, comprehensive software comparisons. Consumers need in-depth analysis that compares details like the number of features, business scores, pricing model, company age and more. It’s what we do at Doakio, because no one else will.
Software Reviews Age Poorly
It’s fairly rare for a software platform to be large enough to generate a never-ending stream of recent feedback covering the features and use cases that align with your own. Most of the time, finding feedback that is specific enough means going years into the past. Yet with today’s agile development cycles, new features are added and old capabilities tweaked weekly in some cases.
User reviews from months or years ago just might not be accurate any more. Especially for the very large software platforms, where existing features are just as likely to be removed as new features are to be added.
How do you know if that otherwise excellent, detailed user commentary from 2-3 years ago is still relevant?
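One way a platform could account for aging commentary, sketched here with an assumed 180-day half-life (a hypothetical tuning knob, not anything this article or any platform prescribes), is to down-weight each review by its age using exponential decay:

```python
from datetime import date

def recency_weight(review_date: date, today: date,
                   half_life_days: float = 180.0) -> float:
    """Exponential decay: a review half_life_days old counts half as much.
    The 180-day half-life is an assumed parameter, not a standard."""
    age_days = (today - review_date).days
    return 0.5 ** (age_days / half_life_days)

today = date(2024, 1, 1)
print(round(recency_weight(date(2023, 12, 1), today), 2))  # last month: near full weight
print(round(recency_weight(date(2021, 1, 1), today), 2))   # three years old: near zero
```

Under this scheme, that detailed commentary from 2-3 years ago contributes almost nothing, which is honest about its reliability but also discards whatever insight it still holds. There is no free lunch here.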
Understanding Unstructured Text Data from Software Reviews is Complex at Scale
Parsing a large amount of unstructured text data (e.g. software review responses) is a very difficult problem to solve. It falls under the umbrella of natural language processing (NLP) and is an entire industry in and of itself.
It is unlikely that a software review company has the expertise or funding to build and maintain the multiple, ever-changing linguistic models necessary to understand and prioritize unstructured user feedback. Many software evaluation platforms even advertise that they use customer feedback in their prioritization and analysis algorithms. But it is virtually impossible for a user of these platforms to validate the accuracy of those algorithms.
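To see why naive approaches fall short, consider a toy lexicon-based sentiment scorer (a deliberately simplistic sketch of our own, not any platform’s actual algorithm). It counts positive and negative keywords, and is trivially fooled by negation, which real review text is full of:

```python
# Toy word lists; real sentiment lexicons run to thousands of entries.
POSITIVE = {"great", "fast", "reliable", "intuitive"}
NEGATIVE = {"slow", "buggy", "confusing", "crashes"}

def naive_sentiment(review: str) -> int:
    """Positive words minus negative words. No grammar, no context."""
    words = review.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

print(naive_sentiment("great ui but slow and buggy sync"))  # -1, plausible
print(naive_sentiment("not great and not reliable"))        # +2, fooled by negation
```

The second review is clearly negative to a human reader, yet scores as the most positive of the two. Handling negation, sarcasm and domain jargon correctly is exactly the expensive NLP work most review platforms cannot afford.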
As a result, most companies simply rely on a primitive 5-point scale system which is hopelessly inadequate to reproduce the strengths and weaknesses of any modern SaaS platform.
As Eugen Esanu from UX Planet explains when going beyond a five-point scale, “Some companies offer a review system with many levels of feedback (for example, Adidas). Where you can choose the level of comfort of a shoe, material quality, design, and describing each part of the product.” We agree with this approach to accurate software product evaluation.
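A multi-level scheme like the one Esanu describes might be modelled as per-aspect scores. The aspect names below are our own hypothetical choices for a software product, not taken from Adidas or any particular review platform:

```python
from dataclasses import dataclass, asdict

# Hypothetical per-aspect review record; aspect names are illustrative.
@dataclass
class SoftwareReview:
    ease_of_use: int       # 1-5
    reliability: int       # 1-5
    support_quality: int   # 1-5
    value_for_money: int   # 1-5

def aspect_averages(reviews):
    """Average each aspect separately instead of one overall star score."""
    fields = asdict(reviews[0]).keys()
    return {f: sum(asdict(r)[f] for r in reviews) / len(reviews)
            for f in fields}

reviews = [SoftwareReview(5, 2, 4, 3), SoftwareReview(4, 1, 5, 3)]
print(aspect_averages(reviews))
# reliability averages 1.5 while ease_of_use averages 4.5, a contrast
# that a single overall rating would flatten away
```

A buyer who cares most about reliability and one who cares most about ease of use would reach opposite conclusions from the same data, which a single five-point score can never show.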
Closing Remarks
Can you still get value out of user reviews? Yes, absolutely, as long as you don’t over-rely on them. With the drawbacks outlined above in mind, you’ll at least be equipped to watch for the common weaknesses of user commentary during a software evaluation process.
“Basically, you can rely on software user reviews if the stakes are very low and it doesn’t matter if you make a bad choice. Unfortunately, this is not usually the case in software buying decisions, especially within businesses.”
We conducted extensive research on all our own competitors in the Software Evaluation space. There are a lot of companies out there who have built some impressive platforms around the digestion of user reviews. You can use our research along with our survey and custom report to find a software user review site that aligns well with your own research philosophies. You can also filter on the scatter plot to find software advice sites that, like us, don’t utilize third party reviews at all.
Not many companies out there will comprehensively list all of their competitors, and score them against themselves. But we do. We are confident that what we have built here at Doakio is truly unique in the market, and delivers value more quickly and more accurately than any other software review site on the internet.
What has your experience been as you look for new and improved software solutions? Let us know in the comments below!