People want to believe their way is the only way. That is why the Duggars bother people.
I've only seen a few episodes, but I've never seen the Duggars say their way is the only way to raise a family. I've never heard them say a negative word about any other religion or way of life. That's why it's hard for me to fathom the hatred for their family. Their children seem happy and are certainly not abused, so why all the malice?