jimthompson
New Member
I have not been to the US in many years, but I do sometimes see TV from there, as well as read American and British books. I also try to keep up on the news. Here is what I do not understand: people seem to have strange ideas, at least to me, of what is important. When some movie star breaks up with her boyfriend or gets arrested for DWI, this is considered big news, more important than the deaths of hundreds of people. People in the West seem to be obsessed with what various celebrities, many of whom I have never heard of, are doing. I also recently saw some very disgusting and depraved movies about people being tortured; Saw, I think, was the name of this trash. Is this kind of thing popular? It is nothing but people in pain. This is entertainment?
I have personal experience with torture and do not wish to go into it, but the fact is I see nothing entertaining in it, even in fiction. If torture is part of a story, then I understand, no problem, but to watch it just for its own sake is sick to me.
I could write a whole volume, but my question really is: is this the way society really is now in the West, or am I only seeing a distorted view of life from TV and other media?