I found a promising documentary series. It was about expeditions and stuff. But I was so fucking turned off by it when they censored women's nipples as the expeditioner visited some tribe. Male nipples were, of course, not censored.
American documentaries are the worst. They are so fucking bad. Apparently nudity was common in European entertainment and commercials too some decades ago, but American culture ruined that as well? I don't know, I had a niche religious upbringing and didn't watch much TV as a kid, and still don't, because it is shit anyway. But that's what I have been told.
The European world should have forced Americans to watch nipples, not the other way around. But it is not too late! We should actually make a huge boob balloon and let it soar over the USA, forcing them to face reality, the same reality they have grown from, the reality that has fed us and made all of this.
Even German documentaries are better. But British documentaries? Oh, THAT'S the good stuff!