Food is a key part of any culture. Take the USA: Could there be a more potent symbol of all things Americana than BBQ?
For many, going against this national pastime amounts to a form of treason. Which is why it may come as a surprise that a new culture has begun to take root among African Americans: veganism.
In years past, this dietary choice was largely associated with being, like, super white. In part, that's because avoiding all animal products is seen as a bourgeois indulgence, enjoyed by the sorts of people who like to proclaim that "All Lives Matter." But that perception is starting to shift.