I am sure Reaper and other DAW users, after watching this video, will respect Samplitude.
Best sonic transparency.
I love Samplitude.
Please watch the video below.
No, sorry - nothing there I could pin down at all. The null test seems to leave the reverb un-nulled, doesn't it, so I don't know the purpose of that test. What I do know is that the results are subjective, not objective. The levels are also quite high, so we could also be hearing what happens when levels approach maximum. My own experiments with editors suggest that metering is never exactly the same between different software. The meter ballistics certainly seem to behave differently. However, I've never discovered any differences in quality. I'm pretty sure that in a mix situation it's not possible to duplicate fader settings - setting a fader to -X dB in one program is not matched by setting it to -X dB in another. I also suspect that fader behaviour isn't log or linear, but something approximating log, squashed at the top end of the scale. Cubase, my favourite, always seems to run out of fader at the top, while Audition and Audacity seem to do more from 70 to 100% of their travel.
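For what it's worth, the maths behind the fader markings is simple enough, and even a tiny mismatch in the gain actually applied shows up in a null test while being far below anything you'd call a quality difference. A rough sketch of what I mean (plain Python/NumPy - the numbers are only illustrative):

```python
import numpy as np

def db_to_gain(db: float) -> float:
    """Convert a dB value to a linear amplitude multiplier."""
    return 10.0 ** (db / 20.0)

# A one-second 1 kHz test tone at 48 kHz.
sr = 48_000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 1000 * t)

# Two programs both show the fader at -6 dB, but one actually applies
# a slightly different gain (say, a different fader law).
a = tone * db_to_gain(-6.0)
b = tone * db_to_gain(-6.1)   # a 0.1 dB mismatch

residue = a - b               # what a polarity-invert null test leaves behind
print(f"residue peak: {20 * np.log10(np.abs(residue).max()):.1f} dBFS")
# Roughly -45 dBFS: clearly not a null, yet nothing you'd call a
# difference in sound quality.
```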
Listening to the video above WITHOUT watching the display revealed no difference at all between the three to my ears, while listening while watching made me wonder if there was something there. Doing it blind again, I couldn't hear it. That's my opinion on the video - not sure it helps really. However - I did rather like the music!
Thanks for your input Paulears.
This video has a part 2, in which I refined the tests and was able to get a perfect null test between the 3 DAWs when applying no mixing or effects.
I also did some mono mixes to avoid differences due to panning behaviours. The levels weren't a factor because all results were normalised to -8 dB, if I remember correctly.
Even then, I tried playing with the levels manually to hear whether the nulling would be more accurate or not. The results in the videos are, with all humility, the most accurate I could do.
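For anyone wanting to reproduce the levels step, peak-normalising a render to -8 dB is just a gain scale. A minimal sketch of the idea (plain Python; the soundfile reader and file names are only assumptions, not what I actually used):

```python
import numpy as np
import soundfile as sf  # assumed WAV reader/writer returning float samples

def normalise_peak(path_in: str, path_out: str, target_dbfs: float = -8.0) -> None:
    """Scale a file so its highest sample peak sits at target_dbfs."""
    audio, sr = sf.read(path_in)
    peak = np.abs(audio).max()
    if peak == 0:
        raise ValueError("silent file, nothing to normalise")
    gain = 10.0 ** (target_dbfs / 20.0) / peak
    sf.write(path_out, audio * gain, sr)

# e.g. bring each DAW's render to the same peak before comparing:
# normalise_peak("samplitude_render.wav", "samplitude_render_-8dB.wav")
```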
Here is part 2:
Sorry - but I think you have a flaw in the testing system: your null test. You proved that when you invert a signal, if the two files are the same but one is inverted, it fully cancels. That's normal, and correct. However, how these three applications process their internal pathways depends on so many things - and the critical one is conversions. If you convert 96K to 48K, it takes time. The output file will be shifted in time. Shifts in time wreck nulling. You say they sound different. I don't think they sound different. Your test shows they are different because the test you used is time sensitive.

If you use the popular applications and set up multiple pathways, each one with a plug-in of some kind, then each one inflicts timing delays as it processes the audio. If you create a burst in a waveform editor that lasts a very brief time with a very sharp rise time, stick this on the left channel, then send it through any of your plug-ins, put that on the right channel and record it - I bet each one is nowhere near sample accurate in time. Each of these DAWs will process the audio differently. Some could send the data through numerous pathways without altering them if they are patched in but switched off - but maybe one realises they are switched off and shortens the path? Who knows. This isn't new - when we first started using digital processors, the out-and-back latency was a surprise to many.
I suspect that if you measure the in-to-out latency of any of these packages, it will change depending on how you have your system configured. This impacts the null test, making it appear that it isn't as good, when it's probably behaving perfectly, and the test is the flaw in the system.
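Here's a rough sketch of just how time-sensitive the null test is - shift an identical render by a single sample and the "null" leaves a loud residue (plain Python/NumPy, with a 1 kHz tone standing in for the real renders):

```python
import numpy as np

sr = 48_000
t = np.arange(sr) / sr
render_a = np.sin(2 * np.pi * 1000 * t)   # stand-in for one DAW's render
render_b = np.roll(render_a, 1)           # identical audio, one sample late

perfect = render_a - render_a             # same file, inverted and summed: silence
shifted = render_a - render_b             # one-sample shift: an obvious residue

print(np.abs(perfect).max())                  # 0.0
print(20 * np.log10(np.abs(shifted).max()))   # about -18 dBFS for this tone
# Higher-frequency content cancels even less, so a whole mix "fails"
# the null badly from timing alone.
```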
I've done a fair bit of testing over the years, and sometimes a perfectly decent test ends up being flawed by events you didn't plan for. In most cases you get a test result that doesn't align with what you expected, and you immediately conclude there is a problem, but later you find your test process was the problem. Audio testing always has subjectivity. The only way to do it is double-blind testing - where you also don't know what is being tested until you analyse the results. In the second video, the artefacts close to 20 kHz suggest a problem - but you have no idea if they really exist, or if there is a problem in the way the software displays them - maybe filtering, maybe just how the amplitude measurement algorithm responds to certain types of signal? Is what you see able to be heard? Your eyes detect a problem, but do your ears?
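Even a crude home-made blind test helps here: have a script shuffle the renders behind anonymous names and only reveal the key after you've logged your guesses. A minimal sketch (plain Python; the file names are made up, and strictly it's only single-blind unless someone else runs it for you):

```python
import random
import shutil

# Hypothetical renders of the same session (the names are made up).
renders = ["cakewalk_mix.wav", "reaper_mix.wav", "samplitude_mix.wav"]
random.shuffle(renders)

key = {}
for trial, path in enumerate(renders, start=1):
    blind_name = f"trial_{trial}.wav"
    shutil.copyfile(path, blind_name)   # listen to trial_1.wav, trial_2.wav, ...
    key[blind_name] = path

input("Log your guesses on paper, then press Enter to reveal the key... ")
for blind_name, path in key.items():
    print(f"{blind_name}  was  {path}")
```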
My conclusion is that your tests show conclusively that the three DAWs under test process and display audio differently. This appears to have no impact on the sound quality. Of course, there could be real issues with the sound quality, and the algorithm YouTube uses to compress the video on their servers could react differently and have removed these negative artefacts.
I still support the theory from back when digital audio first burst onto the scene. It is, at best, a close representation of the original waveform. It can never, at any sampling rate and bit depth, be the same. I'd love to see somebody put together a bunch of clips starting with low-bit-rate MP3, moving up to something stupid at the other end, and then see what order people would put them in, low to high. Of course, people would then query whether the conversion to the highest-quality WAV altered the test - which of course it would, but I doubt many people would be able to tell. We'd also argue about the content too - if the recording was acoustic guitar, would the perceived order be different to a recording featuring a piano, or worse, a distorted electric guitar?
I suspect the conclusion would be that we could detect the worst versions, but as the differences shifted upwards in spectral content, we'd all split apart. Recording studios record to be neutral; the punters listening add a smiley EQ curve because that sounds better. The studios never mixed with that curve in mind. If they did, would punters dial in a neutral EQ? None of this is 'quality'. We all base our opinions on a different set of rules.
My own conclusion on your tests is that they sound the same. Others will hear very clear differences. Nobody is absolutely correct.
paulears, post: 456409, member: 47782 wrote: If you convert 96K to 48K, it takes time.
The last tests were done at the same sample rate as the original files, no conversion. Those gave the best null results. I realised it while doing part 2.
paulears, post: 456409, member: 47782 wrote: In the second video, the artefacts close to 20KHz suggest a problem - but you have no idea if they really exist, or is there a problem in the way the software displays it
To me those artifacts are not problems. They are harmonics added to the signal by the wave compressors (which are simulations of analog gear).
What is interesting is that only Samplitude let them through.
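One way to check whether those components are really in the rendered audio, rather than just in the analyser display, is to look at the file's spectrum directly. A rough sketch (plain Python/NumPy; the soundfile reader and file name are assumptions):

```python
import numpy as np
import soundfile as sf  # assumed WAV reader

audio, sr = sf.read("samplitude_mix.wav")   # file name is just an example
if audio.ndim > 1:
    audio = audio.mean(axis=1)              # fold to mono for a single spectrum

spectrum = np.abs(np.fft.rfft(audio * np.hanning(len(audio))))
freqs = np.fft.rfftfreq(len(audio), d=1 / sr)

band = (freqs > 18_000) & (freqs < 20_000)
level = 20 * np.log10(spectrum[band].max() / spectrum.max())
print(f"strongest 18-20 kHz component: {level:.1f} dB below the overall peak")
```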
paulears, post: 456409, member: 47782 wrote: Each of these DAWs will process the audio differently.
That's my point exactly. I didn't show it in the video, but I did sample-align the tracks to compensate for the time differential.
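For anyone who wants to repeat the alignment step, finding the offset between two renders with a cross-correlation and trimming it out is straightforward. A minimal sketch (plain Python/NumPy/SciPy; this isn't the exact method I used, just one way to do it):

```python
import numpy as np
from scipy.signal import correlate

def find_offset(reference: np.ndarray, other: np.ndarray) -> int:
    """How many samples 'other' lags behind 'reference' (negative = it leads)."""
    corr = correlate(other, reference, mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

# With two mono renders already loaded as float arrays a and b:
# offset = find_offset(a, b)
# b_aligned = b[offset:] if offset > 0 else np.pad(b, (-offset, 0))
# residue = a[:len(b_aligned)] - b_aligned[:len(a)]   # then null-test as usual
```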
But of course some might have missed my goal here. I never wanted to say one is better than the others or which is the best.
My only intent was to show that, without any changes to the default settings, they sound different. I leave it to everyone to evaluate which is for them. ;)
paulears, post: 456409, member: 47782 wrote: My own conclusion on your tests is that they sound the same. Others will hear very clear differences. Nobody is absolutely correct.
Thanks for sharing your thoughts!
Surely, this also assumes that you are not interested in how well they work for you? I hear no sound quality differences so, given the choice of those three, I'd choose the one that fits my way of working best.
If you aligned the samples - then surely that destroyed one measurable difference in them?
paulears, post: 456413, member: 47782 wrote: Surely, this also assumes that you are not interested in how well they work for you? I hear no sound quality differences so, given the choice of those three, I'd choose the one that fits my way of working best.
If you aligned the samples - then surely that destroyed one measurable difference in them?
Come on Paulears, I do value workflow and ease of use, etc.
All I wanted was for people to stop saying DAWs all sound the same...
Thing is, I do hear the difference, and I do perceive a more dynamic, defined and clear sound with Samplitude, which, if my priority were accuracy, would make me choose it over the others.
Quality is not defined the same way by everyone, so I can't debate it...
Have you tried a double-blind test? Can you pick out the Samplitude sample from the others when you don't know which it is? I played the samples without looking at the screen, and I couldn't hear a difference. I still feel that they do sound the same. I'm happy you have a different conclusion from your tests, but it didn't work for me. It was fun doing it though, and it certainly wasn't a waste of time, but from when I used to manage research in education, I know it's so easy to have an opinion and then arrange your research to validate it. It's so easy to do, and it removes the objectivity.
paulears, post: 456422, member: 47782 wrote: I'm happy you have a different conclusion from your tests, but it didn't work for me.
That's respect right there. Thanks, man.
With the songs mixed, even on earbuds, I do hear Samplitude being more open, defined and clear, with better instrument separation.
Placebo effect? I doubt it, because even though others on the forum praised it, Samplitude's look and learning curve weren't appealing to me before my tests.
But the sound difference alone may be the decisive argument for me to switch. It'll be hard after being a long-time Cakewalk user.
Hey thanks for taking my video as a reference.
I'm glad I could leave a trace to help people make educated choices ;)