If you’re a composer, producer or anyone who makes music, sooner or later you will find yourself reading a review of something you want to buy – a synth, a compressor, or even a whole new DAW. Welcome to the wild west: the product review minefield.

Reviews of music software are everywhere now – websites, YouTube, forums, newsletter blurbs – and it’s easy to get lost in them. The trouble is, not all reviews are created equal. Some are useful. Some are marketing dressed up as opinion. And some are just a hot take with no testing behind it.

So how do you figure out which is which? Here’s my attempt to lay out a sensible approach that actually helps you decide, without turning you into a spreadsheet of specs.

Summary: This article takes a practical look at how to decipher music software reviews. It explores what makes a review credible, how to spot marketing spin, and why transparency around sponsored content and NFRs matters for VSTs, sample libraries, and other music software.

The Ultimate Guide to Music Software Reviews

What a good music software review should do

A solid software review for music gear (VSTs, sample libraries, and other music software) tells you more than whether the product “sounds pretty good.” It should answer these practical questions:

  • What problem does this solve, and for whom?
  • How does it integrate into a real session?
  • What are the limitations: CPU, compatibility, edge cases?
  • What did the reviewer actually test, and how?
  • Are there audio examples, and are those examples meaningful?

If a review gives you all that, it’s worth your time. If it is only screenshots and sweeping praise, be cautious. Real testing needs time and context. Testing a reverb on a single vocal snippet tells you a little, but testing it across different sources, with different pre-delay and decay settings, tells you a lot more.

Something else we try here at Music Nation is descriptive overviews. Check out our Best DAW Tier List: The Top 15 Digital Audio Workstations Ranked. It’s not a review but rather a listing of all the major DAWs on the market, with a dedicated overview page for each one. The idea is to give you all the information you need to make your own informed decision.

How We Approach Testing at Music Nation

When I review something, I’m mostly thinking like a working composer, not a lab technician. I want to know how a plugin or library behaves in a real DAW session, surrounded by other instruments and tracks that are already fighting for space. My tests are usually part of ongoing projects – orchestral, hybrid, or something in between – so I can see how the software blends, inspires, or occasionally gets in the way. When we use guest reviewers, I try to impress these points upon them.


Workflow is usually the deciding factor. If a tool helps me write faster or sparks an idea, that’s worth more than any spec sheet. If it feels clunky, or I spend half my time hunting through menus, I’ll probably move on. I also keep an eye on CPU use and stability, because crashes or slow loading times kill creative momentum faster than anything.

Cost is always part of the conversation too. Most independent composers work to tight budgets, so I try to judge value as much as sound quality. And just to be clear, I don’t use affiliate links or earn from product sales — if I recommend something, it’s purely because it proved itself useful in real work. I’m not testing to find perfection; I’m testing to see if a tool genuinely earns its place, or if it’s just another shiny thing destined to gather dust.

Not every reviewer does all this. Some do none of it. Which is why you should look for the details, not just the 10/10 score in the conclusion.

How to tell genuine reviews from marketing

Some red flags are obvious; others are sneakier.

Clear red flags:

  • No disclosure of sponsorship or affiliate links. If the reviewer is earning from sales and they do not mention it, treat their praise sceptically.
  • Overly short testing window. If the “review” drops the same day as release and is full of gushing adjectives, it’s likely not deeply tested (unless they have beta access).
  • No audio examples or only the manufacturer’s presets. If you can only hear the official demos, you are listening to curated marketing.
  • Excessive focus on UI polish over sound and behaviour. Yes, UI matters. No, a shiny GUI does not fix a plugin that causes phase issues.

Some reviewers just have a naturally positive attitude, or they’re genuinely excited about a product – nothing wrong with that. But watch out for the classic bait-and-switch style of writing, where someone briefly criticizes a feature, then immediately buries it under a pile of praise. That usually means they’re padding out the positives or not fully standing by their own critique.

Being critical isn’t easy. It takes time, testing, and a bit of backbone. Pointing out flaws shows you’ve actually dug into the product and care enough to be honest about what doesn’t work. It also means you’re willing to take heat from both readers and developers. The best reviews strike that balance – fair praise, real critique, and enough honesty to make it worth reading in the first place.

Ethics: Sponsored content vs NFRs

With music software, there’s a big difference between a sponsored review and one written with an NFR copy, though to people outside the industry it can look like the same thing. The distinction matters, mostly because it’s about transparency and trust.

A sponsored review is when money changes hands. The reviewer is being paid, either directly or indirectly, to feature or discuss the product. Sometimes it’s a straight-up payment, sometimes it’s bundled into ad placements, sometimes it’s just a “review opportunity” that comes with a fee. Whatever form it takes, it should always be clearly labelled as sponsored content. Readers deserve to know when money might influence the tone or visibility of what they’re seeing. That doesn’t mean sponsored reviews are automatically dishonest, but the label gives everyone context. It’s about honesty, not guilt.

NFR stands for Not For Resale. These are fully functional copies of the software provided to reviewers, magazines, or influencers so they can do their job: test, write, demonstrate. The key point is that NFRs are not payment. They’re tools. You can’t legally sell them, transfer them, or profit from them in any way. They’re issued under the assumption that the reviewer will use the software solely for evaluation. In most cases, the developer’s EULA makes it clear that ownership remains with them. The copy is just temporarily granted for review purposes.


Now, in practice, developers don’t send out time-limited keys that self-destruct after the article is done. It’s a trust system. Once the review’s complete, technically you should uninstall it. Some reviewers do, some don’t, and most developers turn a blind eye because the hassle of license management outweighs the risk. But ethically, that’s how it’s supposed to work.

The important thing is that NFRs don’t carry obligation. Developers don’t get to demand a positive spin, they don’t get editing privileges, and they don’t influence the verdict. If they did, that would make it sponsored. Most professional developers understand this and never push opinions or goals alongside the NFR copy. They know that credibility is the only real currency reviewers have, and the second they start manipulating that relationship, everyone loses.

So when you see a reviewer mention they received an NFR copy, that’s not a confession – it’s simply standard disclosure. It’s assumed in this industry that reviews are written with NFR versions because, frankly, no one could afford to buy every plugin they need to test. What matters is the difference in intent: sponsorship influences the content, NFRs enable it.

Practical tips for buyers

I think a good review should whet your appetite for the product you’re interested in. It’s not there to sell you anything – by the time you’re reading it, you’re already forming your own opinion. A review is more about helping you feel confident in your decision.

That said, no matter how convincing the reviewer is, I still strongly recommend exercising a bit of care and restraint. Try to give the software a test run yourself if possible, and see how it stacks up against the pros and cons highlighted in the review.

  • Try before you buy. Many developers offer demos or trial versions. Use them. Don’t just watch demo videos – drop the trial into a project of your own and try to reproduce the sounds you want.
  • Check forums and user communities for common issues or hidden quirks.
  • Watch independent YouTube demos to see how it behaves in practical setups.
  • Compare it to similar VSTs or sample libraries you already own to see if it truly adds value.
  • Look at update history and support responsiveness to gauge long-term reliability.

Final thoughts – a slightly opinionated wrap

Reviews should be tools, not gospel. Use them to gain perspective, not to outsource judgment. If a plugin review teaches you how a tool behaves in session and gives you concrete examples of its strengths and faults, it has done its job. If it merely amplifies marketing claims, don’t let the hype nudge you into an impulse buy.

And remember: your ears and your workflow know more than a headline. Trust the testing that fits your musical context when evaluating VSTs, DAWs, or sample libraries. Try to be kind to reviewers. Honest, thorough reviewing takes time and gear, often with no financial reward. But hold them to a standard: transparency, reproducible tests, and a bit of humility.

Until next week – keep making more music!

Like the article? Shout us a cup of coffee!