Apologies in advance if this is just a rehash of an older thread. Question: Do you think animals should have rights? Why or why not? My two cents: Groups like PETA frustrate me to no end. I'm an avid hunter and fisherman, and I'm infuriated by their attempts to ban my outdoor hobbies from American society. I believe that animals were put on Earth to serve humans, not the other way around, and we should have no qualms about killing animals for food or when medical research requires it.