A little while ago I read something, in some YT replies I think, and it kinda made sense
Something like:
"Our parents grew up with the idea that the US was the land of chances and opportunities but our generation almost sees it more like a developing country"
Not to talk down the US, but I've experienced this too. I always wanted to live in the US really badly, but nowadays I'd rather stay away..
That's worrying, I think