I was just turning the pages of a handsome coffee table book on Jackson, Mississippi.
While I was doing so, I got to thinking that it would be interesting to find a book that deals with how the people of the “southern states” learned to be “southerners” from the time Lee and Grant quit fighting until we decided to kill Europeans in WWI. Can anyone recommend a social history of the South during that period?
Then I got to thinking that I’d like to find a similar history of how the French and the Americans perceived their dealings with each other, from the signing of the Declaration of Independence to the Louisiana Purchase. It might be called “What Have They Ever Done for Us?” or “Thank Them for What?”
Just Checking In
While I’ve not been posting here on my blog, I am on Instagram and Facebook.