The 1970s and 1980s heralded the rise of neoliberalism in the United States, fundamentally reshaping American life and work. Corporate culture increasingly …