In the decades after World War II, unions were an integral part of American prosperity. Since the 1970s, however, union membership and influence have steadily declined. Why have workers, particularly in the private sector, drifted away from union membership? Were the unions inept, or have labor rights eroded to the point where unions can no longer play their traditional role?