I'm sorry if this doesn't belong here, but the topic just caught me.

There was a time when the word "Disney", whether written, read, or spoken, had a special meaning. Disney was understood to be all about honesty, morality, and integrity. It was everything good and decent about this country. Now, I know that things change, but I hope that what "Disney" stands for today doesn't reflect this country the way it did in days past!