Article by TAMBAY over on Shadow and Act | Indiewire
The Black Folk Don't... web series we featured here on S&A got me thinking; one of the episodes was titled Black Folk Don't Travel. Watch it below first, then continue reading underneath.
So it got me thinking about the myriad films we've covered here on S&A over the last few years that feature, most often, a Caucasian man or woman (usually American or from continental Europe) in an African country. We've seen quite a number of films about white Americans or white Europeans either already living in Africa, or visiting some African country, in search of something or someone - whether it's salvation, redemption, inspiration, vacation, themselves, their spouses, children, friends, their dogs, cats, apes, whatever; and it's rare that they're cast as villains or in positions of inferiority. Also, the films that are historically based usually involve white *settlers* (or remnants of colonialism) who come to see themselves as native to the land their ancestors once occupied.
And in thinking further about this, I realized that I couldn't come up with many titles of fictional narrative feature films centered on stories about African Americans in Africa. It's not like black Americans don't travel, right? Or, more specifically, it's not like black Americans don't visit, live, or work in African countries, right? I know more than a few who do.
Their reality just isn't reflected on our screens, big or small; as is the case for much of the so-called black experience, that's nothing terribly shocking. I'm just making an observation, in the spirit of what we call Pan-Africanism.
If Hollywood movies are any indication, one would think that white people were the only "race" of people who traveled internationally.
One recent example that immediately comes to mind is Sex And The City 2, in which the characters spend much of the movie in Abu Dhabi and, by most accounts (I haven't seen the film), do some purportedly ignorant, cringe-worthy things that insult the emirate and its people.
So there I was wondering... it'd be refreshing to see more films about African Americans outside the USA, specifically in Africa (although, let's face it, it'd be just as refreshing to see a wider variety of films about African Americans in America). I couldn't think of many films with that as a basis for the story.
Does Shaft In Africa (photo above) count? Haile Gerima's Sankofa is another. And, to be clear, I don't mean films that star African Americans playing Africans (there are certainly numerous examples of those); nor am I including documentaries. I'm thinking of narrative fiction feature films with stories centered on black Americans either visiting a country (or several countries) in Africa for whatever reason, or who are already living in an African country.
Can you name any? Maybe I'm suffering from some form of temporary amnesia and just can't remember any films that fit the criteria. Discussions abound about unifying the Diaspora; it's not quite happening in real life from where I'm standing. But at least in the fantasy, make-believe world of the cinema, we can pretend, or show what could (or could not) be.
Black folks travel, don't they?