Does America ever find itself in a situation where the president leads the individual states? Or does the leader of the United States only have a say on matters that affect the country as a whole, and not on issues within each state? What is the role of the states in America?